Tracking and Relative Localization of Drone Swarms with a Vision-based Headset / Pavliv, M.; Schiano, F.; Reardon, C. M.; Floreano, D.; Loianno, G.. - In: IEEE ROBOTICS AND AUTOMATION LETTERS. - ISSN 2377-3766. - 6:2(2021), pp. 1455-1462. [10.1109/LRA.2021.3051565]
Tracking and Relative Localization of Drone Swarms with a Vision-based Headset
Schiano, F.; Loianno, G.
2021
Abstract
We address the detection, tracking, and relative localization of the agents of a drone swarm from a human perspective, using a headset equipped with a single camera and an Inertial Measurement Unit (IMU). We train and deploy a deep neural network detector on image data to detect the drones. A joint probabilistic data association filter resolves ambiguities among the detections and couples this information with IMU data to track the agents. To estimate the drones' relative 3D poses with respect to the human, we use an additional deep neural network that processes the image regions of the drones provided by the tracker. Finally, to speed up training of the deep neural networks, we introduce an automated labeling process. The effectiveness of the proposed approach is validated by several experimental results. The approach runs in real time and does not rely on any communication between the human and the drones. It can be used to spatially task a swarm of drones, and can also be employed for formation control and coordination of terrestrial vehicles.
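The joint probabilistic data association (JPDA) step mentioned in the abstract resolves which detection belongs to which tracked drone by marginalizing over all feasible joint assignments. The sketch below is not the authors' implementation; it is a minimal, simplified illustration of that marginalization for a handful of tracks, with the detection probability and clutter density (`p_detect`, `clutter`) chosen as illustrative assumptions.

```python
import itertools
import numpy as np

def gaussian_likelihood(z, z_pred, S):
    """Likelihood of measurement z given predicted measurement z_pred
    with innovation covariance S."""
    d = z - z_pred
    return np.exp(-0.5 * d @ np.linalg.solve(S, d)) / np.sqrt(
        (2.0 * np.pi) ** len(z) * np.linalg.det(S))

def jpda_marginals(z_preds, S, zs, p_detect=0.9, clutter=1e-4):
    """Marginal probability beta[t, j] that measurement j originates from
    track t; column index len(zs) means 'missed detection' for that track."""
    n_t, n_z = len(z_preds), len(zs)
    # Likelihood of each measurement under each track's predicted position.
    L = np.array([[gaussian_likelihood(zs[j], z_preds[t], S)
                   for j in range(n_z)] for t in range(n_t)])
    beta = np.zeros((n_t, n_z + 1))
    # Enumerate joint events: each track takes a distinct measurement or -1 (miss).
    for event in itertools.product(range(-1, n_z), repeat=n_t):
        used = [j for j in event if j >= 0]
        if len(used) != len(set(used)):
            continue  # a measurement can feed at most one track
        w = 1.0
        for t, j in enumerate(event):
            w *= p_detect * L[t, j] if j >= 0 else (1.0 - p_detect)
        w *= clutter ** (n_z - len(used))  # unassigned measurements are clutter
        for t, j in enumerate(event):
            beta[t, j if j >= 0 else n_z] += w
    return beta / beta.sum(axis=1, keepdims=True)
```

For example, with two tracks predicted at (0, 0) and (5, 5) and two detections near those points, `jpda_marginals` assigns each detection predominantly to its nearby track while still reserving some probability mass for misses and clutter, which is what lets the filter tolerate occlusions and false positives.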