
Multimodal Volume Data Exploration Through Mid-Air Haptics.

2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2022)

Abstract
We present a mid-air haptic rendering method for exploring volumetric data in augmented reality (AR) environments. Users directly interact with a volume-rendered hologram using their bare hands. Because volume rendering accumulates color according to transparency along the viewing ray, the depth of the user’s hand within the hologram is hard to perceive from visual feedback alone. Therefore, to improve the localization of internal structures within the volume data, we propose an AR system that gives boundaries and their associated information (e.g., texture) a tactile presence. Leveraging Low-High (LH) histograms, our system conveys boundary information to the user both visually and haptically. Specifically, when the user touches the volume data with their hands, the system computes a set of focal points or a set of tactile patterns using GPGPU-based estimation. Additionally, we developed mid-air haptic rendering methods based on amplitude modulation (AM) and spatiotemporal modulation (STM). Both methods were implemented on HoloLens 2 and run in real time. We evaluated the effectiveness of each proposed method on various volume data sets, including synthetic data and computed tomography (CT) scans. Our results show that while both haptic rendering methods produce tangible experiences from the hologram interaction, the spatiotemporal modulation method yields better shape discrimination than the amplitude modulation method when multiple regions of interest are present.
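The contrast between the two rendering strategies in the abstract can be illustrated with a minimal scheduling sketch. This is not the paper's implementation: the update rate, modulation frequency, sweep rate, and the `(point, amplitude)` command format are all illustrative assumptions. AM holds every focal point in place and varies its intensity over time; STM instead sweeps a single full-strength focal point rapidly along the set of target points.

```python
import math

def am_schedule(points, duration_s, am_freq_hz=200.0, rate_hz=1000.0):
    """Amplitude modulation (sketch): all focal points are emitted every
    frame, with intensity modulated sinusoidally at am_freq_hz.
    Returns a list of frames; each frame is a list of (point, amplitude)."""
    frames = []
    n = max(1, int(duration_s * rate_hz))
    for i in range(n):
        t = i / rate_hz
        # Intensity oscillates in [0, 1] at the (assumed) modulation frequency.
        a = 0.5 * (1.0 + math.sin(2.0 * math.pi * am_freq_hz * t))
        frames.append([(p, a) for p in points])
    return frames

def stm_schedule(points, duration_s, sweep_hz=100.0, rate_hz=1000.0):
    """Spatiotemporal modulation (sketch): a single focal point at full
    strength sweeps along the closed path of target points, completing
    sweep_hz loops per second."""
    frames = []
    n = max(1, int(duration_s * rate_hz))
    for i in range(n):
        t = i / rate_hz
        # Fractional position along the closed path, mapped to a point index.
        phase = (t * sweep_hz) % 1.0
        idx = int(phase * len(points)) % len(points)
        frames.append([(points[idx], 1.0)])
    return frames
```

Under this sketch, an AM frame addresses every region of interest at once at reduced, time-varying intensity, whereas an STM frame concentrates full output on one point at a time, which is one intuition for why STM can separate multiple regions of interest more crisply.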
Keywords
Human-centered computing → Interaction paradigms → Mixed / augmented reality; Human-centered computing → Interaction devices → Haptic devices; Human-centered computing → Visualization