To support the rapid development of the Urban Air Mobility framework, safe navigation must be ensured for Vertical Take-Off and Landing (VTOL) aircraft, especially in the approach and landing phases. Visual sensors have the potential to provide accurate measurements with reduced budgets, although integrity issues, as well as performance degradation in low visibility and highly dynamic environments, may pose challenges. In this context, this paper focuses on autonomous navigation during vertical approach and landing procedures and provides three main contributions. First, visual sensing requirements relevant to Urban Air Mobility scenarios are defined considering realistic landing trajectories, landing pad dimensions, and wind effects. Second, a multi-sensor navigation architecture based on an Extended Kalman Filter is presented, which integrates visual estimates with inertial and GNSS measurements and includes different operating modes and ad hoc integrity checks. The proposed processing pipeline is designed to provide the required navigation performance under different conditions, including day/night flight, atmospheric disturbances, and low visibility, and can support the autonomous initialization of a missed approach procedure. Third, the performance of the proposed architecture is assessed within a highly realistic simulation environment which reproduces real-world scenarios and includes variable weather and illumination conditions. Results show that the proposed architecture is robust to dynamic and environmental challenges, providing cm-level positioning uncertainty in the final landing phase. Furthermore, autonomous initialization of a missed approach procedure is demonstrated in case of loss of visual contact with the landing pad and the consequent increase of the self-estimated navigation uncertainty.
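The fusion-and-integrity concept summarized in the abstract can be illustrated with a minimal sketch. This is not the paper's actual design: it assumes a 1-D constant-velocity descent model in place of the full inertial/GNSS/vision pipeline, invented noise values (5 m GNSS, 5 cm vision), and a hypothetical `MAP_SIGMA` uncertainty threshold for triggering the missed approach; the class and function names are likewise illustrative.

```python
import numpy as np

class DescentEKF:
    """Toy 1-D Kalman filter sketching the fusion concept from the abstract:
    inertial-style prediction plus GNSS and vision altitude updates, with a
    covariance-based integrity check that can trigger a missed approach.
    All models and noise values are illustrative assumptions."""

    def __init__(self, alt0=50.0, vz0=-1.0):
        self.x = np.array([alt0, vz0])    # state: [altitude m, vertical speed m/s]
        self.P = np.diag([0.25, 0.1])     # initial covariance (assumed prior fix)
        self.Q = np.diag([0.05, 0.1])     # per-step process noise (IMU drift proxy)

    def predict(self, dt):
        F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update_altitude(self, z, r):
        """Scalar altitude update; r is the measurement noise variance."""
        H = np.array([[1.0, 0.0]])
        S = float(H @ self.P @ H.T) + r          # innovation variance
        K = (self.P @ H.T) / S                   # Kalman gain, shape (2, 1)
        self.x = self.x + (K * (z - float(H @ self.x))).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P

    def altitude_sigma(self):
        return float(np.sqrt(self.P[0, 0]))

MAP_SIGMA = 1.0   # hypothetical integrity threshold on altitude sigma [m]

def run(vision_until=6.0, t_end=12.0, dt=0.1, seed=0):
    """Simulate a descent; vision updates stop after `vision_until` seconds,
    and a missed approach is declared once the self-estimated uncertainty
    exceeds MAP_SIGMA (the check depends only on the covariance, so the
    outcome does not depend on the simulated measurement noise)."""
    rng = np.random.default_rng(seed)
    ekf, t = DescentEKF(), 0.0
    while t < t_end:
        ekf.predict(dt)
        true_alt = 50.0 - 1.0 * t
        ekf.update_altitude(true_alt + rng.normal(0, 5.0), r=25.0)         # GNSS, 5 m
        if t < vision_until:
            ekf.update_altitude(true_alt + rng.normal(0, 0.05), r=0.0025)  # vision, 5 cm
        if ekf.altitude_sigma() > MAP_SIGMA:
            return "missed_approach", t, ekf.altitude_sigma()
        t += dt
    return "landing", t, ekf.altitude_sigma()
```

With vision available throughout, the altitude uncertainty settles at the centimeter level; once vision updates stop, the covariance inflates toward the GNSS-only steady state and crosses the threshold, mirroring the triggering behavior described in the abstract.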