Abstractive document summarization via multi-template decoding

Applied Intelligence (2022)

Abstract
Most previous abstractive summarization models generate the summary in a left-to-right manner without making full use of target-side global information. Recently, many researchers have sought to alleviate this issue by retrieving target-side templates from a large-scale training corpus, yet the quality of the retrieved templates remains a limitation. To overcome the problem of template selection bias, one promising direction is to obtain better target-side global information from multiple high-quality templates. Hence, this paper extends the encoder-decoder framework by introducing a multi-template decoding mechanism, which can exploit multiple templates retrieved from the training corpus based on semantic distance. In addition, we introduce a multi-granular attention mechanism that simultaneously accounts for the importance of words within each template and the importance of the different templates. Extensive experimental results on CNN/Daily Mail and English Gigaword show that our proposed model significantly outperforms several state-of-the-art abstractive and extractive baseline models.
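The abstract does not include an implementation, but a minimal sketch may help clarify the multi-granular attention idea: word-level attention scores the tokens within each retrieved template, template-level attention weighs whole templates against each other, and the two are combined into a single context vector for the decoder. The function name, tensor shapes, and the dot-product scoring below are assumptions for illustration, not the authors' actual formulation.

import torch
import torch.nn.functional as F

def multi_granular_attention(decoder_state, template_word_embs, template_mask=None):
    # Hypothetical sketch of multi-granular attention over retrieved templates.
    # decoder_state:      (batch, hidden)                       current decoder hidden state
    # template_word_embs: (batch, n_templates, n_words, hidden) encoded template words
    # template_mask:      (batch, n_templates, n_words)         1 for real tokens, 0 for padding
    # Returns a (batch, hidden) context vector aggregated over all templates.

    # Word-level attention: score each template word against the decoder state.
    word_scores = torch.einsum("bh,btwh->btw", decoder_state, template_word_embs)
    if template_mask is not None:
        word_scores = word_scores.masked_fill(template_mask == 0, float("-inf"))
    word_attn = F.softmax(word_scores, dim=-1)

    # Per-template context: weighted sum of words within each template -> (batch, n_templates, hidden).
    template_ctx = torch.einsum("btw,btwh->bth", word_attn, template_word_embs)

    # Template-level attention: weigh whole templates against the decoder state -> (batch, n_templates).
    template_scores = torch.einsum("bh,bth->bt", decoder_state, template_ctx)
    template_attn = F.softmax(template_scores, dim=-1)

    # Final context: combine per-template contexts by their template-level weights -> (batch, hidden).
    return torch.einsum("bt,bth->bh", template_attn, template_ctx)

In this sketch, the word-level weights capture which tokens inside a template matter at the current decoding step, while the template-level weights capture which retrieved templates matter overall, mirroring the two granularities described in the abstract.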
Keywords
Abstractive document summarization, Multiple templates, Target-side global information, Multi-granular attention