Transfer Learning from Transformers to Fake News Challenge Stance Detection (FNC-1) Task.

Valeriya Slovikovskaya, Giuseppe Attardi

LREC (2020)

Abstract
In this paper, we report improved results on the Fake News Challenge Stage 1 (FNC-1) stance detection task. This gain in performance is due to the generalization power of large pre-trained language models based on the Transformer architecture, invented, trained, and publicly released over the last two years. Specifically, (1) we improved the FNC-1 best-performing model by adding BERT sentence embeddings of the input sequences as a model feature, and (2) we fine-tuned the BERT, XLNet, and RoBERTa transformers on an extended FNC-1 dataset and obtained state-of-the-art results on the FNC-1 task.
Keywords
fake news detection, transfer learning, fine-tuning
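
The fine-tuning approach summarized in the abstract can be sketched roughly as follows with the HuggingFace transformers library: the headline and article body are encoded as a sequence pair and classified into the four FNC-1 stance classes. This is a minimal sketch under assumed settings (model checkpoint, maximum length, and label order are illustrative), not the authors' exact configuration.

```python
# Minimal sketch (not the authors' code): fine-tuning-style stance classification
# for FNC-1 with a pretrained Transformer. Checkpoint, max length, and label
# ordering below are assumptions for illustration.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["agree", "disagree", "discuss", "unrelated"]  # FNC-1 stance classes

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=len(LABELS)
)

# Headline and article body are fed as a sentence pair, the standard setup for
# sequence-pair classification fine-tuning.
headline = "Example headline"
body = "Example article body text ..."
inputs = tokenizer(headline, body, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
print(LABELS[int(logits.argmax(dim=-1))])
```

In a full training run, the same pair encoding would be applied to the whole dataset and the model optimized with cross-entropy over the four stance labels; the snippet above only shows the inference path for a single headline-body pair.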