Learning Multi-Level Information for Dialogue Response Selection by Highway Recurrent Transformer
Computer Speech & Language (2020)
Abstract
• A new variant of attention mechanisms focuses on modeling cross-sentence attention.
• A novel model integrates highway attention in the Transformer for modeling dialogues.
• Our model is capable of modeling complex dialogue-level information.
• The results on two response selection datasets show consistent performance.
Keywords
Response selection, Transformer, Attention mechanism, Dialogue, DSTC
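The highlights describe integrating highway attention into a Transformer. The paper's exact formulation is not given on this page, so the following is only a minimal sketch of the generic idea, assuming a highway-network-style transform gate that mixes standard scaled dot-product attention output with the query input; the function and parameter names (`highway_attention`, `Wt`, `bt`) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def highway_attention(Q, K, V, Wt, bt):
    """Hypothetical highway-gated attention (a sketch, not the paper's
    exact method): standard scaled dot-product attention, whose output
    is mixed with the query input through a learned sigmoid transform
    gate, in the style of highway networks."""
    d = Q.shape[-1]
    attn = softmax(Q @ K.T / np.sqrt(d)) @ V        # scaled dot-product attention
    gate = 1.0 / (1.0 + np.exp(-(Q @ Wt + bt)))     # sigmoid transform gate T(x)
    return gate * attn + (1.0 - gate) * Q           # T(x)*H(x) + (1-T(x))*x
```

When the gate saturates toward zero the layer passes the input through unchanged, which is the property highway connections use to ease training of deep stacks.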