3D-ARM-Gaze: a public dataset of 3D Arm Reaching Movements with Gaze information in virtual reality

Bianca Lento, Effie Segas, Vincent Leconte, Emilie Doat, Frederic Danion, Renaud Peteri, Jenny Benois-Pineau, Aymar de Rugy

bioRxiv (2024)

Abstract
3D-ARM-Gaze is a public dataset designed to provide natural arm movements together with visual and gaze information during object reaching across a wide reachable space, recorded from a precisely controlled, comfortably seated posture. Participants picked and placed objects at various positions and orientations in a virtual environment, following a specific procedure that maximized the explored workspace while ensuring a consistent seated posture. The dataset comprises more than 2.5 million samples recorded from 20 healthy participants performing 14,000 single pick-and-place movements (700 per participant). Although initially designed to explore novel prosthesis control strategies based on natural eye-hand and arm coordination, this dataset will also be useful to researchers interested in core sensorimotor control, humanoid robotics, and human-robot interaction, as well as for the development and testing of associated gaze-guided computer vision solutions.

Competing Interest Statement

The authors have declared no competing interest.
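To give a concrete sense of the dataset's scale and per-trial organization, below is a minimal Python sketch. Only the counts come from the abstract (20 participants, 700 movements each, 14,000 movements, over 2.5 million samples); the record fields and their names are illustrative assumptions, not the dataset's actual schema or file format.

```python
from dataclasses import dataclass, field
from typing import List

# Figures taken from the abstract: 20 healthy participants, each performing
# 700 single pick-and-place movements (14,000 movements, >2.5 million samples).
N_PARTICIPANTS = 20
TRIALS_PER_PARTICIPANT = 700


@dataclass
class TrialRecord:
    """Hypothetical per-movement record; field names are illustrative only."""
    participant_id: int                                            # 1..20
    trial_id: int                                                  # 1..700 within a participant
    arm_joint_angles: List[float] = field(default_factory=list)    # per-sample arm posture
    gaze_direction: List[float] = field(default_factory=list)      # per-sample 3D gaze vector
    object_pose: List[float] = field(default_factory=list)         # target object position/orientation


def expected_total_trials() -> int:
    """Total number of single pick-and-place movements implied by the abstract."""
    return N_PARTICIPANTS * TRIALS_PER_PARTICIPANT


if __name__ == "__main__":
    print(expected_total_trials())  # 14000
```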