
atmoSphere: mindfulness over haptic-audio cross modal correspondence.

UbiComp '17: The 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Maui, Hawaii, September 2017

Cited by 16
Abstract
We explore cross-modal correspondence between haptic and audio output for meditation support. To this end, we implement atmoSphere, a haptic ball used to prototype several haptic/audio designs. AtmoSphere consists of a sphere-shaped device that provides haptic feedback. Users can experience designs aimed at instructing them in breathing techniques shown to enhance meditation. The aim of the haptic/audio design is to guide the user into a particular rhythm of breathing. We detect this rhythm using smart eyewear (J!NS MEME) that estimates cardiac and respiratory parameters from embedded motion sensors. Once this rhythm is achieved, the feedback stops. If the user drops out of the rhythm, the haptic/audio feedback starts again.
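
The abstract describes a closed biofeedback loop: the eyewear estimates the breathing rhythm, and the ball's haptic/audio guidance is active only while the user is off the target rhythm. Below is a minimal sketch of that loop, assuming a hypothetical target rate, tolerance, and sensor/actuator callbacks; none of these names or values come from the paper itself.

```python
import time
from typing import Callable

# Hypothetical target; the paper only says feedback guides a particular breathing rhythm.
TARGET_BREATHS_PER_MIN = 6.0
TOLERANCE = 0.5  # acceptable deviation, in breaths per minute

def guidance_loop(read_rate: Callable[[], float],
                  set_feedback: Callable[[bool], None],
                  poll_interval: float = 1.0,
                  steps: int = 10) -> None:
    """Stop haptic/audio guidance while the measured rhythm matches the target;
    restart it when the user drops out of the rhythm."""
    for _ in range(steps):
        rate = read_rate()                      # e.g. respiration estimated from eyewear motion sensors
        in_rhythm = abs(rate - TARGET_BREATHS_PER_MIN) <= TOLERANCE
        set_feedback(not in_rhythm)             # guidance on only when the user is off-rhythm
        time.sleep(poll_interval)

if __name__ == "__main__":
    # Simulated sensor: the user gradually settles into the target rhythm.
    readings = iter([9.0, 8.0, 7.0, 6.4, 6.1, 6.0, 5.9, 6.2, 6.0, 6.0])
    guidance_loop(read_rate=lambda: next(readings),
                  set_feedback=lambda on: print("guidance ON" if on else "guidance OFF"),
                  poll_interval=0.0)
```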
Keywords
Attention, Psychophysiology, Eyewear, Tracking, Sensing, Haptics