DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization

Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Short Papers, Vol. 2 (2022)

Cited by 40 | Views 245
Keywords
joint distillation,model,dq-bart,sequence-to-sequence