
Pose determination of passively cooperative spacecraft in close proximity using a monocular camera and AruCo markers / Vela, C.; Fasano, G.; Opromolla, R.. - In: ACTA ASTRONAUTICA. - ISSN 0094-5765. - 201:(2022), pp. 22-38. [10.1016/j.actaastro.2022.08.024]

Pose determination of passively cooperative spacecraft in close proximity using a monocular camera and AruCo markers

Vela C.; Fasano G.; Opromolla R.
2022

Abstract

Missions requiring autonomous, close-proximity operations of spacecraft, such as On-Orbit Servicing, On-Orbit Assembly and Active Debris Removal, have become a thriving topic in the aerospace research community over the last decades, not only from an economic, operational, and scientific perspective, but also as a means of ensuring the sustainability of the space environment. These operations involve a variety of technological challenges, most of which are related to the need for autonomous and safe Guidance, Navigation and Control systems. Since the future of these mission scenarios is strictly tied to spacecraft standardisation and modularity, relative navigation employing monocular cameras on servicing platforms to approach targets equipped with artificial markers for pose estimation purposes has drawn great attention. Following this trend, this paper presents an original vision-based pose estimation architecture for relative navigation with respect to passively cooperative targets equipped with ArUco markers. The proposed architecture foresees two operative modes, namely Acquisition and Tracking. The first features detection of the ArUco markers through a hue-saturation-value image representation, their identification by reading their built-in codes, and computation of the pose without a priori knowledge. The second takes advantage of prior pose estimates to speed up the entire processing pipeline. Performance is assessed through an extensive numerical simulation campaign, considering as a test scenario the final approach phase of a rendezvous manoeuvre to reach a satellite belonging to a large constellation in Low Earth Orbit, and using the Planet and Asteroid Natural scene Generation Utility tool for realistic synthetic image generation. Dedicated tests on the Acquisition mode show that successful marker detection and pose initialization are achieved in up to 99.76% of the possible relative position and attitude states of the chaser with respect to the target at the beginning of the final approach trajectory. As the chaser gets closer to the target, the results highlight significant robustness of both operative modes against illumination conditions and uncertainties in the knowledge of the camera intrinsic parameters. Overall, the architecture achieves pose estimation accuracies down to millimetre and sub-degree levels.
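
For readers unfamiliar with the processing steps summarised in the abstract, the sketch below approximates the Acquisition mode using OpenCV's aruco module (pre-4.7 contrib API): the image is converted to a hue-saturation-value representation, markers are detected and identified via their built-in codes, and the pose is computed from the marker corners without a priori knowledge. The camera matrix, distortion model, marker size and dictionary choice are illustrative assumptions, not values from the paper, and the code is not the authors' implementation.

```python
import cv2
import numpy as np

# --- Illustrative values only: the paper does not publish its camera model
# --- or marker layout in this record, so these numbers are placeholders.
camera_matrix = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)          # assumed negligible lens distortion
marker_size = 0.10                 # marker side length in metres (assumed)

# The specific ArUco dictionary used in the paper is not stated here;
# DICT_4X4_50 is just an example choice.
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector_params = cv2.aruco.DetectorParameters_create()

def acquisition_step(bgr_image):
    """Detect ArUco markers and compute a pose without a priori knowledge."""
    # The paper works on a hue-saturation-value representation; as a
    # simplification, the value (brightness) channel is used for detection.
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    value_channel = hsv[:, :, 2]

    # Marker detection and identification via the built-in binary codes.
    corners, ids, _ = cv2.aruco.detectMarkers(
        value_channel, aruco_dict, parameters=detector_params)
    if ids is None:
        return None  # no marker found in this frame

    # One pose per detected marker: rotation and translation of the marker
    # frame with respect to the camera frame (Perspective-n-Point solution).
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, marker_size, camera_matrix, dist_coeffs)
    return ids.flatten(), rvecs, tvecs
```

In a Tracking mode such as the one described, the pose returned by a call like this could be propagated to the next frame to restrict the image region searched for markers, which is what speeds up the processing pipeline; the sketch above covers only the initialization step.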
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11588/893002
Citations
  • PMC: ND
  • Scopus: 5
  • Web of Science (ISI): 2