Leveraging Pre-trained Checkpoints for Sequence Generation Tasks.

Trans. Assoc. Comput. Linguistics (2020)

Cited 443 | Viewed 315
Abstract
Unsupervised pre-training of large neural models has recently revolutionized Natural Language Processing. By warm-starting from the publicly released checkpoints, NLP practitioners have pushed the ...