Improving Factual Error Correction for Abstractive Summarization via Data Distillation and Conditional-generation Cloze
CoRR (2024)
Abstract
Improving factual consistency in abstractive summarization has been a focus
of current research. One promising approach is the post-editing method.
However, previous works have yet to make sufficient use of factual factors in
summaries and suffer from the negative effects of their training datasets. In
this paper, we first propose a novel factual error correction model, FactCloze,
based on a conditional-generation cloze task. FactCloze can construct
causality among factual factors while also determining whether a blank
can be answered. We then propose a data distillation method that generates
a more faithful summarization dataset, SummDSC, via multi-dimensional
evaluation. We experimentally validate the effectiveness of our approach, which
yields improvements on multiple factual consistency metrics compared to
baselines.
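As a rough illustration of the cloze idea described in the abstract, the sketch below masks factual factors (e.g., entities and numbers) in a summary and fills each blank with a candidate supported by the source document, or marks it unanswerable. This is a hypothetical toy stand-in, not the authors' FactCloze model: the `make_cloze`, `fill_blanks`, and `<unanswerable>` names are assumptions for illustration only, and a simple substring check replaces the actual conditional-generation model.

```python
MASK = "<mask_{}>"
UNANSWERABLE = "<unanswerable>"

def make_cloze(summary, factors):
    """Replace each factual factor with a numbered blank (hypothetical helper)."""
    cloze = summary
    for i, factor in enumerate(factors):
        cloze = cloze.replace(factor, MASK.format(i), 1)
    return cloze

def fill_blanks(cloze, source, candidates):
    """Toy stand-in for a conditional-generation model: answer each blank
    with the first candidate supported by the source document, or mark it
    unanswerable when no candidate is supported."""
    filled = cloze
    for i, cands in enumerate(candidates):
        answer = next((c for c in cands if c in source), UNANSWERABLE)
        filled = filled.replace(MASK.format(i), answer)
    return filled

source = "The company was founded in 1998 in Seattle."
summary = "The company was founded in 2008 in Seattle."

# Mask the factual factors, then fill each blank from source-supported candidates.
cloze = make_cloze(summary, ["2008", "Seattle"])
corrected = fill_blanks(cloze, source, [["2008", "1998"], ["Seattle", "Portland"]])
print(corrected)  # → The company was founded in 1998 in Seattle.

# A blank with no source-supported candidate is flagged rather than guessed.
flagged = fill_blanks(make_cloze("The CEO is Alice.", ["Alice"]),
                      source, [["Alice", "Bob"]])
print(flagged)  # → The CEO is <unanswerable>.
```

The unanswerable marker mirrors the abstract's point that the model should decide whether a blank can be answered at all, rather than always producing a substitution.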