The work with the title "A ROS multi-tier UAV localization module based on GNSS, inertial and visual-depth data" by Antonopoulos Angelos, Lagoudakis Michail, and Partsinevelos Panagiotis is licensed under a Creative Commons Attribution 4.0 International license.
Bibliographic Citation
A. Antonopoulos, M. G. Lagoudakis, and P. Partsinevelos, “A ROS multi-tier UAV localization module based on GNSS, inertial and visual-depth data,” Drones, vol. 6, no. 6, May 2022, doi: 10.3390/drones6060135.
https://doi.org/10.3390/drones6060135
Uncrewed aerial vehicles (UAVs) are continuously gaining popularity in a wide spectrum of applications, while their positioning and navigation most often rely on Global Navigation Satellite Systems (GNSS). However, numerous conditions and practices require UAV operation in GNSS-denied environments, including confined spaces, urban canyons, vegetated areas and indoor spaces. For the purposes of this study, an integrated UAV navigation system was designed and implemented which utilizes GNSS, visual, depth and inertial data to provide real-time localization. The implementation is built as a package for the Robot Operating System (ROS) environment to allow ease of integration into various systems. The system autonomously adapts to the flight environment, providing spatial awareness to the aircraft. This system expands the functionality of UAVs, as it enables navigation even in GNSS-denied environments. This integrated positioning system provides the means to support fully autonomous navigation in mixed environments or under malfunction conditions. Experiments show the capability of the system to provide adequate results in open, confined and mixed spaces.
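To illustrate how such a multi-tier localization package might be structured as a ROS node, the following is a minimal sketch, assuming hypothetical topic names ("gnss/fix", "gnss/odom", "vio/odom", "localization/odom"), standard ROS message types, and an illustrative covariance threshold; it is not the authors' actual implementation, only an example of switching between a GNSS tier and a visual-depth/inertial tier when the GNSS fix degrades.

```python
#!/usr/bin/env python
# Hypothetical sketch: topic names, the covariance threshold and the
# switching rule are illustrative assumptions, not the paper's implementation.
import rospy
from sensor_msgs.msg import NavSatFix, NavSatStatus
from nav_msgs.msg import Odometry


class MultiTierLocalizer(object):
    """Selects between a GNSS-based pose and a visual-depth/inertial
    odometry pose, depending on GNSS fix quality."""

    def __init__(self):
        # Latest pose estimates produced by the two localization tiers.
        self.gnss_pose = None
        self.vio_pose = None
        self.gnss_ok = False

        rospy.Subscriber("gnss/fix", NavSatFix, self.on_gnss_fix)
        rospy.Subscriber("gnss/odom", Odometry, self.on_gnss_pose)
        rospy.Subscriber("vio/odom", Odometry, self.on_vio_pose)
        self.pub = rospy.Publisher("localization/odom", Odometry, queue_size=10)

    def on_gnss_fix(self, msg):
        # Treat GNSS as usable only with a valid fix and a bounded
        # horizontal covariance (the 4.0 m^2 bound is an arbitrary example).
        has_fix = msg.status.status >= NavSatStatus.STATUS_FIX
        self.gnss_ok = has_fix and msg.position_covariance[0] < 4.0

    def on_gnss_pose(self, msg):
        self.gnss_pose = msg
        self.publish()

    def on_vio_pose(self, msg):
        self.vio_pose = msg
        self.publish()

    def publish(self):
        # Prefer the GNSS tier outdoors; fall back to visual-depth/inertial
        # odometry when the GNSS tier is unavailable or degraded.
        pose = self.gnss_pose if (self.gnss_ok and self.gnss_pose) else self.vio_pose
        if pose is not None:
            self.pub.publish(pose)


if __name__ == "__main__":
    rospy.init_node("multi_tier_localizer")
    MultiTierLocalizer()
    rospy.spin()
```

Packaging the logic as a standalone ROS node in this way is what allows the module to be dropped into different UAV software stacks, since other components interact with it only through topics.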