GraphFlow+: Exploiting Conversation Flow in Conversational Machine Comprehension with Graph Neural Networks

Machine Intelligence Research (2024)

Abstract
The conversational machine comprehension (MC) task aims to answer questions over a single passage in a multi-turn conversation. However, recent approaches do not exploit information from the conversation history effectively, so coreferences and ellipses in the current question cannot be resolved. In addition, these methods do not consider the rich semantic relationships between words when reasoning over the passage text. In this paper, we propose a novel model, GraphFlow+, which constructs a context graph for each conversation turn and uses a recurrent graph neural network (GNN) to model the temporal dependencies between the context graphs of consecutive turns. Specifically, we explore three different ways of constructing the context graph: a dynamic graph, a static graph, and a hybrid graph that combines the two. Our experiments on CoQA, QuAC and DoQA show that GraphFlow+ outperforms state-of-the-art approaches.
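The sketch below is a minimal, hypothetical illustration of the idea summarized above, not the authors' implementation: it builds one context graph per conversation turn, runs a single message-passing step on each graph, and uses a GRU cell (assumed here as the recurrence) to carry node states from one turn's graph to the next. The class name, dimensions, normalization, and the random adjacency matrices are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F


class RecurrentGNNFlow(nn.Module):
    """Hypothetical per-turn GNN with a GRU carrying node states across turns."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.gnn_weight = nn.Linear(hidden_dim, hidden_dim, bias=False)
        # Recurrence that links node states at turn t-1 to turn t (assumption).
        self.turn_gru = nn.GRUCell(hidden_dim, hidden_dim)

    def gnn_layer(self, node_states, adj):
        # One message-passing step: aggregate neighbors with a row-normalized
        # adjacency matrix, then apply a linear transform and nonlinearity.
        adj_norm = adj / adj.sum(dim=-1, keepdim=True).clamp(min=1e-6)
        return F.relu(self.gnn_weight(adj_norm @ node_states))

    def forward(self, turn_node_features, turn_adjacencies):
        # turn_node_features: list of (num_nodes, hidden_dim) tensors, one per turn.
        # turn_adjacencies: list of (num_nodes, num_nodes) context graphs, one per turn.
        outputs = []
        prev_states = torch.zeros_like(turn_node_features[0])
        for feats, adj in zip(turn_node_features, turn_adjacencies):
            updated = self.gnn_layer(feats, adj)
            # Temporal dependency: fuse each node's updated state with its
            # state from the previous turn via the GRU cell.
            prev_states = self.turn_gru(updated, prev_states)
            outputs.append(prev_states)
        return outputs


if __name__ == "__main__":
    num_nodes, hidden_dim, num_turns = 6, 16, 3
    model = RecurrentGNNFlow(hidden_dim)
    feats = [torch.randn(num_nodes, hidden_dim) for _ in range(num_turns)]
    # Random 0/1 adjacencies with self-loops stand in for a "static" graph;
    # a "dynamic" graph would instead be induced from learned attention scores.
    adjs = [torch.rand(num_nodes, num_nodes).round() + torch.eye(num_nodes)
            for _ in range(num_turns)]
    per_turn_states = model(feats, adjs)
    print(per_turn_states[-1].shape)  # torch.Size([6, 16])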
Keywords
Conversational machine comprehension (MC), reading comprehension, question answering, graph neural networks (GNNs), natural language processing (NLP)