Summarization of scholarly articles using BERT and BiGRU: Deep learning-based extractive approach

Journal of King Saud University - Computer and Information Sciences (2023)

Abstract
Extractive text summarization selects and combines key sentences directly from the original text rather than generating new content. Although various statistical and graph-based methods have been explored for this purpose, accurately capturing the intended meaning remains a challenge. To address this, researchers have turned to deep learning models such as BERT (Bidirectional Encoder Representations from Transformers); however, BERT's input length constraints limit its ability to summarize lengthy documents. We propose a novel approach that combines BERT, a transformer network pre-trained with self-supervision on extensive corpora, with BiGRU (Bidirectional Gated Recurrent Units), a recurrent neural network that captures sequential dependencies within the text for extracting salient information. Our method uses BERT to generate sentence-level embeddings, which are then fed into the BiGRU network, yielding a comprehensive representation of the complete document's context. In experiments on the arXiv and PubMed datasets, the proposed approach outperformed several state-of-the-art models, achieving ROUGE F-scores of (46.7, 19.4, 35.4) and (47.0, 21.3, 39.7), respectively. The proposed fusion of BERT and BiGRU significantly enhances extractive text summarization and shows promise for summarizing lengthy documents across domains that require concise and informative summaries.
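To make the described pipeline concrete, below is a minimal sketch in PyTorch, assuming the HuggingFace transformers library. The class name, hidden size, and the use of the [CLS] vector as the sentence embedding are illustrative assumptions, not the authors' published implementation. Each sentence is encoded by BERT independently (so no single input exceeds BERT's length limit), and the BiGRU then contextualizes the sentence embeddings across the whole document before a linear layer scores each sentence for extraction.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertBiGRUExtractor(nn.Module):
    """Scores each sentence of a document for inclusion in an extractive summary."""

    def __init__(self, bert_name="bert-base-uncased", hidden_size=256):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        # The BiGRU reads the sequence of sentence embeddings in both
        # directions, giving every sentence document-level context.
        self.bigru = nn.GRU(self.bert.config.hidden_size, hidden_size,
                            batch_first=True, bidirectional=True)
        self.scorer = nn.Linear(2 * hidden_size, 1)  # one relevance score per sentence

    def forward(self, input_ids, attention_mask):
        # input_ids: (num_sentences, seq_len) -- each row is one tokenized
        # sentence, so BERT's per-input length constraint is never exceeded.
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        sent_emb = out.last_hidden_state[:, 0, :]       # [CLS] vector per sentence
        ctx, _ = self.bigru(sent_emb.unsqueeze(0))      # (1, num_sentences, 2*hidden)
        return self.scorer(ctx).squeeze(-1).squeeze(0)  # (num_sentences,)

# Usage: score the sentences of a toy document and pick the top one.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
sentences = ["Extractive summarization selects key sentences.",
             "The BiGRU captures cross-sentence dependencies.",
             "Top-scoring sentences form the summary."]
enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

model = BertBiGRUExtractor()
with torch.no_grad():
    scores = model(enc["input_ids"], enc["attention_mask"])
summary = [sentences[i] for i in scores.topk(k=1).indices.tolist()]
print(summary)
```

In practice the scorer would be trained on binary sentence labels (e.g., derived from ROUGE overlap with reference summaries), and the top-k scoring sentences would be concatenated in document order to form the summary.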
Keywords
Text summarization, Attention mechanism, BERT, BiGRU