Laser-visual-inertial odometry and mapping with high robustness and low drift: ZHANG and SINGH

Journal of Field Robotics (2018)

Cited: 153 | Views: 168
Abstract
We present a data processing pipeline to estimate ego-motion online and build a map of the traversed environment, leveraging data from a 3D laser scanner, a camera, and an inertial measurement unit (IMU). Unlike traditional methods that use a Kalman filter or factor-graph optimization, the proposed method employs a sequential, multilayer processing pipeline, solving for motion from coarse to fine. Starting with IMU mechanization for motion prediction, a visual-inertial coupled method estimates motion; then a scan matching method further refines the motion estimates and registers maps. The resulting system enables high-frequency, low-latency ego-motion estimation, along with dense, accurate 3D map registration. Further, the method handles sensor degradation by automatic reconfiguration that bypasses failed modules. It can therefore operate in the presence of highly dynamic motion as well as in dark, texture-less, and structure-less environments. In experiments, the method demonstrates 0.22% relative position drift over 9.3 km of navigation and robustness to running, jumping, and even highway-speed driving (up to 33 m/s).
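The coarse-to-fine, module-bypassing control flow described in the abstract can be illustrated with a minimal sketch. All function names, the toy 1D state, and the degradation checks below are illustrative assumptions, not the authors' actual implementation: each stage refines the previous estimate, and a stage that detects degraded input simply passes the prior estimate through.

```python
# Hypothetical sketch of the sequential pipeline from the abstract:
# IMU prediction -> visual-inertial refinement -> scan-matching refinement.
# Names and the 1D toy model are assumptions for illustration only.
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class Pose:
    x: float
    yaw: float

def imu_predict(prev: Pose, accel_dx: float, gyro_dyaw: float) -> Pose:
    """Coarse motion prior from IMU mechanization (toy 1D integration)."""
    return Pose(prev.x + accel_dx, prev.yaw + gyro_dyaw)

def visual_inertial_refine(prior: Pose, features: Optional[List[float]]) -> Pose:
    """Refine the prior with visual features; bypass when vision degrades
    (e.g. dark or texture-less scenes yield too few features)."""
    if not features or len(features) < 3:
        return prior  # degraded: pass the IMU prediction through unchanged
    correction = sum(features) / len(features)
    return Pose(prior.x + correction, prior.yaw)

def scan_match_refine(prior: Pose, scan_correction: Optional[float]) -> Pose:
    """Final refinement by scan matching; bypass in structure-less scenes
    where the scan provides no usable constraint."""
    if scan_correction is None:
        return prior
    return Pose(prior.x + scan_correction, prior.yaw)

def estimate_motion(prev: Pose, accel_dx: float, gyro_dyaw: float,
                    features: Optional[List[float]],
                    scan_correction: Optional[float]) -> Pose:
    """Run the three layers in sequence, coarse to fine."""
    pose = imu_predict(prev, accel_dx, gyro_dyaw)
    pose = visual_inertial_refine(pose, features)
    return scan_match_refine(pose, scan_correction)
```

When every sensor is healthy, both refinement layers correct the IMU prior; when vision and lidar both degrade, the output gracefully falls back to the raw IMU prediction, which mirrors the automatic-reconfiguration behavior the abstract claims.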
Keywords
mapping, ego-motion estimation, 3D laser scanner, vision, inertial sensor