Visual-Inertial Filtering for Human Walking Quantification

2021 IEEE International Conference on Robotics and Automation (ICRA 2021)

Citations: 4
Abstract
We propose a novel system to track human lower-body motion as part of a larger movement assessment system for clinical evaluation. Our system combines multiple wearable Inertial Measurement Unit (IMU) sensors with a single external RGB-D camera. We use a factor graph with a Sliding Window Filter (SWF) formulation that fuses 2-D joint data extracted from the RGB images by a Deep Neural Network, raw depth information, raw IMU gyroscope readings, and foot-contact estimates derived from IMU gyroscope and accelerometer data. The system uses an articulated model of human body motion defined on differential manifolds. We compare the results of our system against a gold-standard motion capture system and a vision-only alternative. Our proposed system qualitatively produces smoother 3-D joint trajectories than the noisy depth data, allowing for more realistic gait estimates. Compared to the vision-only baseline, it improves the median joint-trajectory estimate by around 2 cm while considerably reducing outliers, by up to 0.6 m.
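To make the sliding-window idea concrete, the sketch below shows a minimal batch least-squares smoother over a short window of noisy 3-D joint positions (e.g., keypoints back-projected from RGB-D), using a simple constant-velocity prior on a single joint. This is an illustration only, not the paper's implementation: the window size, the noise parameters `SIGMA_MEAS` and `SIGMA_SMOOTH`, and the per-joint smoothness factor are assumptions, and the actual system instead optimizes a factor graph over an articulated body model on differential manifolds with IMU gyroscope and foot-contact factors.

```python
# Minimal sliding-window smoothing sketch (illustrative only, not the paper's code).
# It pulls noisy 3-D joint positions toward their measurements while penalizing
# acceleration (a constant-velocity prior), solved as a batch least-squares
# problem over the window, in the spirit of a Sliding Window Filter.
import numpy as np
from scipy.optimize import least_squares

WINDOW = 10          # number of frames kept in the window (assumed value)
SIGMA_MEAS = 0.05    # assumed std. dev. of depth-derived joint positions [m]
SIGMA_SMOOTH = 0.01  # assumed std. dev. of the constant-velocity prior [m]

def residuals(x_flat, meas):
    """Stack measurement and smoothness residuals for one joint over the window."""
    x = x_flat.reshape(-1, 3)                      # estimated joint positions
    r_meas = (x - meas).ravel() / SIGMA_MEAS       # pull toward depth measurements
    r_smooth = np.diff(x, n=2, axis=0).ravel() / SIGMA_SMOOTH  # penalize acceleration
    return np.concatenate([r_meas, r_smooth])

def filter_window(meas):
    """meas: (WINDOW, 3) noisy joint positions; returns smoothed positions."""
    x0 = meas.ravel()                               # initialize at the measurements
    sol = least_squares(residuals, x0, args=(meas,))
    return sol.x.reshape(-1, 3)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = np.stack([np.linspace(0, 1, WINDOW),    # a joint moving along x
                      np.zeros(WINDOW),
                      np.full(WINDOW, 0.9)], axis=1)
    noisy = truth + rng.normal(0, SIGMA_MEAS, truth.shape)
    smoothed = filter_window(noisy)
    print("raw RMSE:     ", np.sqrt(np.mean((noisy - truth) ** 2)))
    print("smoothed RMSE:", np.sqrt(np.mean((smoothed - truth) ** 2)))
```

In the full system described by the abstract, each measurement term would correspond to a factor (2-D keypoints, depth, gyroscope, foot contacts) attached to the articulated-model states inside the window, and marginalization would slide the window forward; the sketch keeps only the fusion-and-smoothing aspect.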
Keywords
visual-Inertial filtering,human walking quantification,human lower-body motion,larger movement assessment system,clinical evaluation,multiple wearable Inertial Measurement Unit sensors,single external RGB-D camera,Sliding Window Filter formulation,RGB images,Deep Neural Network,raw depth information,raw IMU gyroscope readings,accelerometer data,noisy depth data,2D joint data,gait estimations,SWF formulation