A Spline-Based Trajectory Representation for Sensor Fusion and Rolling Shutter Cameras

International Journal of Computer Vision (2015)

Cited by 104 | Views 164
Abstract
The use of multiple sensors for ego-motion estimation is a common way to obtain more accurate and robust results. However, when ego-motion is represented as a discrete series of poses, fusing information from unsynchronized sensors is not straightforward. The framework described in this paper aims to provide a unified solution to ego-motion estimation problems involving high-rate, unsynchronized devices. Instead of a discrete-time pose representation, we present a continuous-time formulation based on cumulative cubic B-splines parameterized in the Lie algebra of the group 𝕊𝔼(3). This trajectory representation has several advantages for sensor fusion: (1) it has local control, which enables sliding-window implementations; (2) it is C^2 continuous, allowing predictions of inertial measurements; (3) it closely matches torque-minimal motions; (4) it has no singularities when representing rotations; (5) it easily handles measurements from multiple sensors arriving at different times when timestamps are available; and (6) it deals with rolling shutter cameras naturally. We apply this continuous-time framework to visual–inertial simultaneous localization and mapping and show that it can also be used to calibrate the entire system.
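To make the representation concrete, here is a sketch of the cumulative cubic B-spline pose interpolation commonly used for such continuous-time trajectories (the notation is illustrative and may differ from the paper's exact conventions). With control poses T_{w,i} \in \mathbb{SE}(3) and normalized time u \in [0,1) within the current knot interval,

T_{w,s}(u) \;=\; T_{w,i-1} \prod_{j=1}^{3} \exp\!\big( \tilde{B}_j(u)\, \Omega_{i+j-1} \big),
\qquad
\Omega_i \;=\; \log\!\big( T_{w,i-1}^{-1}\, T_{w,i} \big) \in \mathfrak{se}(3),

where the cumulative basis functions are obtained from the standard uniform cubic B-spline basis:

\begin{pmatrix} \tilde{B}_0(u) \\ \tilde{B}_1(u) \\ \tilde{B}_2(u) \\ \tilde{B}_3(u) \end{pmatrix}
\;=\;
\frac{1}{6}
\begin{pmatrix}
6 & 0 & 0 & 0 \\
5 & 3 & -3 & 1 \\
1 & 3 & 3 & -2 \\
0 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix} 1 \\ u \\ u^2 \\ u^3 \end{pmatrix}.

Each interpolated pose depends on only four neighboring control poses, which gives the local control mentioned above, and the C^2 continuity yields closed-form angular velocity and linear acceleration for predicting inertial measurements.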
Keywords
Sensor fusion, Visual–inertial, SLAM, Rolling shutter, Calibration