Sensor Fusion for Autonomous Indoor UAV Navigation in Confined Spaces

International Conference on Sensing Technology (2023)

Abstract
In this paper, we address the challenge of navigating unknown indoor environments with autonomous aerial robots in confined spaces. The core of our system is the integration of key sensor technologies, including depth sensing from the ZED 2i camera, IMU data, and LiDAR measurements, facilitated by the Robot Operating System (ROS) and RTAB-Map. Through custom-designed experiments, we demonstrate the robustness and effectiveness of this approach. Our results show promising navigation accuracy, with errors as low as 0.4 m, and mapping quality characterized by a Root Mean Square Error (RMSE) of just 0.13 m. Notably, this performance is achieved while maintaining energy efficiency and balanced resource allocation, addressing a crucial concern in UAV applications. Flight tests further underscore the precision of our system in maintaining desired flight orientations, with an error rate of only 0.1%. This work represents a significant step in the development of autonomous indoor UAV navigation systems, with potential applications in search and rescue, facility inspection, and environmental monitoring within GPS-denied indoor environments.
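The abstract does not describe how the 0.13 m mapping RMSE is computed; as a minimal sketch under the standard definition, mapping or trajectory RMSE is typically evaluated over matched pairs of estimated and ground-truth 3D points (the function name `trajectory_rmse` and the point-pair interface are illustrative assumptions, not the authors' code):

```python
import math

def trajectory_rmse(estimated, ground_truth):
    """RMSE between matched 3D point pairs (standard definition;
    not the authors' evaluation code)."""
    if len(estimated) != len(ground_truth):
        raise ValueError("point lists must be the same length")
    # Squared Euclidean distance for each matched pair.
    squared_errors = [
        sum((e - g) ** 2 for e, g in zip(p_est, p_gt))
        for p_est, p_gt in zip(estimated, ground_truth)
    ]
    return math.sqrt(sum(squared_errors) / len(squared_errors))

# Example: every estimated point offset by 0.13 m along x
# yields an RMSE of exactly 0.13 m.
est = [(0.13, 0.0, 0.0), (1.13, 0.0, 0.0)]
gt = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
print(trajectory_rmse(est, gt))
```

In practice, SLAM evaluations first align the estimated map or trajectory to ground truth (e.g. with a rigid-body transform) before computing this metric.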
Keywords
Autonomous UAV Navigation, Multi-Sensor Fusion, SLAM, Deep Learning, Indoor Mapping