Monocular-Based Pose Estimation Based on Fiducial Markers for Space Robotic Capture Operations in GEO / Opromolla, Roberto; Vela, Claudio; Nocerino, Alessia; Lombardi, Carlo. - In: REMOTE SENSING. - ISSN 2072-4292. - 14:18(2022), pp. 1-39. [10.3390/rs14184483]

Monocular-Based Pose Estimation Based on Fiducial Markers for Space Robotic Capture Operations in GEO

Roberto Opromolla; Claudio Vela; Alessia Nocerino; Carlo Lombardi
2022

Abstract

This paper tackles the problem of spacecraft relative navigation to support the reach and capture of a passively cooperative space target by a chaser platform equipped with a robotic arm, in the context of future operations such as On-Orbit Servicing and Active Debris Removal. Specifically, it presents a pose determination architecture based on monocular cameras for a space target in GEO equipped with retro-reflective and black-and-white fiducial markers. The proposed architecture covers the entire processing pipeline, from marker detection and identification up to pose estimation by solving the Perspective-n-Point problem with a customized implementation of the Levenberg–Marquardt algorithm. It is designed to provide relative position and attitude measurements of the target’s main body with respect to the chaser, as well as of the robotic arm’s end effector with respect to the selected grasping point. The design of the configuration of fiducial markers to be installed on the target’s approach face to support the pose determination task is also described. A performance assessment is carried out by means of numerical simulations using the Planet and Asteroid Natural Scene Generation Utility tool to produce realistic synthetic images of the target. The robustness of the proposed approach is evaluated against variable illumination scenarios and under different levels of uncertainty in the knowledge of initial conditions and camera intrinsic parameters.
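The pose estimation step described in the abstract — solving the Perspective-n-Point (PnP) problem over the detected marker positions with a Levenberg–Marquardt refinement — can be illustrated with a minimal sketch. This is not the authors' implementation: the camera intrinsics, marker layout, and initial guess below are hypothetical, and SciPy's generic Levenberg–Marquardt solver stands in for the paper's customized one. The idea is the same: parameterize the 6-DOF pose and iteratively minimize the pixel reprojection error of the known marker points.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points_3d, rvec, tvec, K):
    """Project 3D marker points into the image with a pinhole camera model."""
    R = Rotation.from_rotvec(rvec).as_matrix()
    cam = points_3d @ R.T + tvec      # target frame -> camera frame
    uv = cam @ K.T                    # apply intrinsic matrix
    return uv[:, :2] / uv[:, 2:3]     # perspective division

def reprojection_residuals(pose, points_3d, points_2d, K):
    """Stacked pixel residuals (observed minus predicted); LM drives these to zero."""
    return (project(points_3d, pose[:3], pose[3:], K) - points_2d).ravel()

def solve_pnp_lm(points_3d, points_2d, K, pose0):
    """Refine a 6-DOF pose (rotation vector + translation) via Levenberg-Marquardt."""
    sol = least_squares(reprojection_residuals, pose0, method="lm",
                        args=(points_3d, points_2d, K))
    return sol.x

# Hypothetical setup: intrinsics and marker geometry are illustrative only.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
markers = np.array([[-0.5, -0.5, 0.0], [0.5, -0.5, 0.0],
                    [ 0.5,  0.5, 0.0], [-0.5, 0.5, 0.0],
                    [ 0.0,  0.0, 0.2]])                  # one off-plane point
true_pose = np.array([0.1, -0.05, 0.02, 0.2, -0.1, 5.0])  # [rvec | tvec]
observed = project(markers, true_pose[:3], true_pose[3:], K)

pose0 = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 4.0])  # coarse initial guess
est = solve_pnp_lm(markers, observed, K, pose0)
```

On these synthetic, noise-free observations the solver recovers the generating pose; in the paper's scenario the 2D points would come from detected markers in real imagery, so the residual at convergence reflects detection and calibration errors instead.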
Files for this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11588/895682
Citations
  • PMC: not available
  • Scopus: 6
  • Web of Science: 5