Boundary-Aware Abstractive Summarization with Entity-Augmented Attention for Enhancing Faithfulness

Jiuyi Li, Junpeng Liu, Jianjun Ma, Wei Yang, Degen Huang

ACM Transactions on Asian and Low-Resource Language Information Processing (2022)

Abstract
With the successful application of deep learning, document summarization systems can produce more readable results. However, abstractive summarization still suffers from unfaithful outputs and factual errors, especially in named entities. Current approaches tend to employ external knowledge to improve model performance while neglecting the boundary information and the semantics of entities. In this paper, we propose an entity-augmented method (EAM) that encourages the model to make full use of entity boundary information and to pay more attention to critical entities. Experimental results on three Chinese and English summarization datasets show that our method outperforms several strong baselines and achieves state-of-the-art performance on the CLTS dataset. Our method also improves the faithfulness of generated summaries and generalizes well to different pre-trained language models. Moreover, we propose a method to evaluate the integrity of generated entities. In addition, we adapt the data augmentation method of the FactCC model to the grammatical differences between Chinese and English, and train a new evaluation model for factual consistency evaluation of Chinese summaries.
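The abstract does not spell out the EAM formulation, so the following is only a minimal illustrative sketch of the general idea behind entity-augmented attention: adding a bias to attention logits at source positions that fall inside named-entity spans, so the decoder attends to them more strongly. The function name `entity_biased_attention` and the arguments `scores`, `entity_mask`, and `bias` are hypothetical and not taken from the paper.

```python
import numpy as np

def entity_biased_attention(scores, entity_mask, bias=1.0):
    """Bias attention logits toward entity tokens, then renormalize.

    scores      : (tgt_len, src_len) raw attention logits
    entity_mask : (src_len,) array, 1.0 where a source token lies
                  inside a named-entity span, 0.0 elsewhere
    bias        : strength of the entity preference (hyperparameter)
    """
    biased = scores + bias * entity_mask          # broadcast over decoder steps
    biased -= biased.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(biased)
    return weights / weights.sum(axis=-1, keepdims=True)

# Toy example: 2 decoder steps over 4 source tokens; tokens 1-2 form an entity.
scores = np.zeros((2, 4))
mask = np.array([0.0, 1.0, 1.0, 0.0])
print(entity_biased_attention(scores, mask, bias=2.0))
```

In practice such a bias would be applied inside the cross-attention of a pre-trained encoder-decoder model; the sketch above only shows the re-weighting step in isolation.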
Keywords
Abstractive text summarization, factual consistency, entity-augmented
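The entity-integrity evaluation mentioned in the abstract is not specified on this page; below is a hedged sketch of one plausible reading: the fraction of entities in a generated summary whose full surface form also appears among the entities of the source document, so that truncated or corrupted entity mentions are penalized. The function name, its arguments, and the example entities are all hypothetical illustrations, not the paper's metric.

```python
def entity_integrity(summary_entities, source_entities):
    """Fraction of summary entities that appear intact in the source.

    Both arguments are lists of entity strings produced by any NER tool;
    an entity counts as intact only on an exact surface-form match.
    """
    if not summary_entities:
        return 1.0  # no entities generated, nothing to corrupt
    source_set = set(source_entities)
    intact = sum(1 for e in summary_entities if e in source_set)
    return intact / len(summary_entities)

# Example: "Alice" is a truncated entity mention, so integrity is 2/3.
print(entity_integrity(
    ["Beijing", "World Health Organization", "Alice"],
    ["Beijing", "World Health Organization", "Alice Zhang"]))
```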