Behavioral And Emotional Spoken Cues Related To Mental States In Human-Robot Social Interaction

ICMI-MLMI (2015)

Abstract
Understanding human behavioral and emotional cues occurring in interaction has become a major research interest due to the emergence of numerous applications such as social robotics. While different theories agree that some behavioral signals are involved in communicating information, there is a lack of consensus regarding their specificity, their universality, and whether they convey emotions, affective states, cognitive states, mental states, or all of these. Our goal in this study is to explore the relationship between behavioral and emotional cues extracted from speech (e.g., laughter, speech duration, negative emotions) and different communicative information about the human participant. This study is based on a corpus of audio/video data of humorous interactions between the Nao robot and 37 human participants. Participants filled in three questionnaires about their personality, sense of humor, and mental states regarding the interaction. This work reveals the existence of many links between behavioral and emotional cues and the mental states reported by human participants through self-report questionnaires. However, we have not found a clear connection between reported mental states and participants' profiles.
Keywords
Social Robots, Humor, Real-time Emotion Detection, Affective Interaction, Human-Robot Interaction