A Neural Attention Model for Abstractive Sentence Summarization

Conference on Empirical Methods in Natural Language Processing (2015)

Cited by 3419
Abstract
Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build. In this work, we propose a fully data-driven approach to abstractive sentence summarization. Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows significant performance gains on the DUC-2004 shared task compared with several strong baselines.
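The abstract describes a local attention-based model that generates each summary word conditioned on the input sentence. The following is a minimal numpy sketch of that idea: score each input word against the recent summary context, form an attention-weighted average of the input, and condition the next-word distribution on it. All names, dimensions, and the bilinear scoring form are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

rng = np.random.default_rng(0)
vocab_size, emb_dim, ctx_len = 10, 4, 3

# Embedded input sentence: one vector per source word (5 words here).
x = rng.normal(size=(5, emb_dim))
# Embedded summary context: the last ctx_len generated words, flattened.
y_ctx = rng.normal(size=(ctx_len, emb_dim)).reshape(-1)

# Attention: score each input word against the summary context,
# then take a weighted ("soft") average of the input words.
P = rng.normal(size=(emb_dim, ctx_len * emb_dim))  # assumed scoring matrix
scores = x @ P @ y_ctx        # one alignment score per input word
weights = softmax(scores)     # attention distribution over the input
context = weights @ x         # attention-weighted input representation

# Next-word distribution conditioned on the context and summary prefix.
W = rng.normal(size=(vocab_size, emb_dim))
U = rng.normal(size=(vocab_size, ctx_len * emb_dim))
p_next = softmax(W @ context + U @ y_ctx)
```

Because every step is differentiable, such a model can be trained end-to-end by maximizing the likelihood of reference summary words, which is what makes the fully data-driven setup described above feasible.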
Keywords
Probability distribution, Sentence, Automatic summarization, Indicator vector, Speech recognition, Computer science