WeARHand: Head-worn, RGB-D camera-based, bare-hand user interface with visually enhanced depth perception

ISMAR (2014)

Cited by 78
Abstract
We introduce WeARHand, which allows a user to manipulate virtual 3D objects with a bare hand in a wearable augmented reality (AR) environment. Our method uses no environmentally tethered tracking devices; it localizes a pair of near-range and far-range RGB-D cameras mounted on a head-worn display, together with the moving bare hand, in 3D space by exploiting depth input data. Depth perception is enhanced through egocentric visual feedback, including a semi-transparent proxy hand. We implement a virtual hand interaction technique and feedback approaches, and evaluate their performance and usability. The proposed method can be applied to many 3D interaction scenarios using hands in a wearable AR environment, such as AR information browsing, maintenance, design, and games.
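The paper does not include code here, but the pipeline the abstract describes (segmenting the bare hand from near-range depth input and rendering a semi-transparent proxy over it for depth feedback) can be illustrated with a minimal, hypothetical sketch. All parameter values, function names, and camera intrinsics below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy import ndimage

# Illustrative pinhole intrinsics for the near-range depth camera (assumed values).
FX, FY, CX, CY = 570.0, 570.0, 320.0, 240.0
NEAR_M, FAR_M = 0.15, 0.70  # assumed arm's-reach depth band for the bare hand

def segment_hand(depth_m: np.ndarray) -> np.ndarray:
    """Return a boolean mask of the largest connected blob inside the near-range band."""
    band = (depth_m > NEAR_M) & (depth_m < FAR_M)
    labels, n = ndimage.label(band)
    if n == 0:
        return np.zeros_like(band)
    sizes = ndimage.sum(band, labels, index=range(1, n + 1))
    return labels == (1 + int(np.argmax(sizes)))

def hand_centroid_3d(depth_m: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Back-project masked depth pixels to camera space and average them."""
    v, u = np.nonzero(mask)
    z = depth_m[v, u]
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x, y, z], axis=1).mean(axis=0)

def overlay_proxy_hand(rgb: np.ndarray, mask: np.ndarray,
                       color=(0, 255, 0), alpha=0.4) -> np.ndarray:
    """Alpha-blend a flat-colored, semi-transparent proxy over the hand region."""
    out = rgb.astype(np.float32)
    out[mask] = (1 - alpha) * out[mask] + alpha * np.array(color, np.float32)
    return out.astype(np.uint8)

if __name__ == "__main__":
    # Synthetic frame standing in for one head-mounted RGB-D capture.
    depth = np.full((480, 640), 2.0, np.float32)   # background at 2 m
    depth[200:320, 260:380] = 0.45                 # hand-sized blob at 0.45 m
    rgb = np.full((480, 640, 3), 128, np.uint8)
    mask = segment_hand(depth)
    print("hand centroid (m):", hand_centroid_3d(depth, mask))
    composited = overlay_proxy_hand(rgb, mask)
```

The depth-band threshold and single-blob assumption stand in for the paper's depth-based hand localization; the alpha blend mimics the semi-transparent proxy-hand feedback at a conceptual level only.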
Keywords
semitransparent proxy hand, egocentric visual feedback, wearable computing, helmet mounted displays, 3d user interfaces, ar information browsing, ar maintenance, ar environment, ar games, user interfaces, hand interaction, wearhand, depth perception, red-green-blue-depth camera, virtual 3d object manipulation, wearable augmented reality, ar design, head-worn display, augmented reality, cameras, tethered tracking devices, rgb-d camera, bare-hand user interface