A robust crater matching algorithm for autonomous vision-based spacecraft navigation

Del Prete R.; Renga A.
2021

Abstract

Advancements in Computer Vision (CV) and Machine Learning (ML) over the past decades have contributed to the realization of autonomous systems such as self-driving cars. This manuscript explores the possibility of transferring this technology to the next planetary exploration missions. Similar to a star tracker, a pattern of observed craters can be matched against a reference, i.e. a crater catalogue, to estimate the spacecraft state with no external support (e.g. GNSS or DSN). This kind of technology, originally developed for missile applications before the advent of GPS, is known as Terrain Relative Navigation (TRN). However, unlike stars, craters vary widely in appearance depending on image quality, lighting geometry, and noise. While these problems can nowadays be overcome with modern deep learning approaches, the inherent limitation of crater detectors, i.e. false detections, still poses a problem for the matching phase. In response, this paper proposes a novel solution that exploits attitude and sensor pointing knowledge to discriminate false matches. A complete TRN system, called FederNet, was developed by implementing the matching algorithm within a processing chain that includes a Convolutional Neural Network and an extended Kalman filter (EKF). FederNet has been validated with a numerical analysis on real lunar elevation images, though the adopted methodology extends to other airless bodies. Despite the use of a medium-resolution (118 m/px) Digital Elevation Model (DEM), results showed that the navigation accuracy lies below 400 meters in the best-case scenario, enabling real-time autonomous on-board operations with no need for ground support. The capabilities of such a TRN system can be further improved with higher-resolution data and data fusion with other sensor measurements.
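Although the record contains no implementation details beyond the abstract, the matching idea it describes — rejecting false crater detections by means of attitude and sensor pointing knowledge before the state estimation step — can be illustrated with a short sketch. The Python example below is a minimal, hypothetical illustration assuming a pinhole camera model; the function names, pixel gate threshold, and data layout are assumptions for illustration only, not the paper's actual FederNet implementation.

```python
import numpy as np

def project_to_image(r_crater_body, r_sc_body, R_body_to_cam, K):
    """Project a catalogued crater centre (body-fixed coordinates) into the
    image plane using the predicted spacecraft position and attitude.
    Simple pinhole model; an illustrative assumption, not the paper's model."""
    p_cam = R_body_to_cam @ (np.asarray(r_crater_body) - np.asarray(r_sc_body))
    if p_cam[2] <= 0.0:              # crater behind the camera: not observable
        return None
    uvw = K @ (p_cam / p_cam[2])     # normalised camera coordinates -> pixels
    return uvw[:2]

def gate_matches(detections_px, candidate_craters_xyz, r_sc_body,
                 R_body_to_cam, K, gate_px=15.0):
    """Keep only detection/catalogue pairs whose predicted projection lies
    within a pixel gate; the remaining detections are treated as false matches.
    The 15-pixel gate is an arbitrary illustrative value."""
    accepted = []
    for det, r_crater in zip(detections_px, candidate_craters_xyz):
        pred = project_to_image(r_crater, r_sc_body, R_body_to_cam, K)
        if pred is not None and np.linalg.norm(np.asarray(det) - pred) < gate_px:
            accepted.append((det, pred))
    return accepted
```

In a pipeline of the kind the abstract describes, the accepted pairs would feed the EKF measurement update, while gated-out detections are discarded before they can corrupt the state estimate.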
2021
978-1-7281-7556-0
A robust crater matching algorithm for autonomous vision-based spacecraft navigation / Del Prete, R.; Renga, A. - (2021), pp. 322-327. (Paper presented at the 8th IEEE International Workshop on Metrology for AeroSpace, MetroAeroSpace 2021, 2021) [10.1109/MetroAeroSpace51421.2021.9511670].

Use this identifier to cite or link to this document: https://hdl.handle.net/11588/865903
Citations
  • Scopus: 6
  • Web of Science: 4