MixCE: Training Autoregressive Language Models by Mixing Forward and Reverse Cross-Entropies.
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Volume 1: Long Papers (2023)
Keywords: Language Modeling, Machine Translation, Topic Modeling, Statistical Language Modeling, Syntax-based Translation Models
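
The title refers to an objective that mixes the forward cross-entropy H(P, Q) with the reverse cross-entropy H(Q, P), where P is the data distribution and Q is the model. Below is a minimal, illustrative sketch of such a mixture for two categorical distributions given as logits; the function name `mix_ce`, the mixing weight `eta`, and the toy tensors are assumptions made for illustration, not the paper's actual training objective, which cannot evaluate the data distribution directly and must approximate the reverse term.

```python
import torch
import torch.nn.functional as F

def mix_ce(p_logits: torch.Tensor, q_logits: torch.Tensor, eta: float = 0.5) -> torch.Tensor:
    """Interpolate forward CE H(P, Q) and reverse CE H(Q, P) for two
    categorical distributions defined by logits over the same vocabulary."""
    p = F.softmax(p_logits, dim=-1)           # "data" distribution P (illustrative)
    q = F.softmax(q_logits, dim=-1)           # model distribution Q
    log_p = F.log_softmax(p_logits, dim=-1)
    log_q = F.log_softmax(q_logits, dim=-1)
    forward_ce = -(p * log_q).sum(dim=-1)     # H(P, Q): the standard MLE-style term
    reverse_ce = -(q * log_p).sum(dim=-1)     # H(Q, P): penalizes Q's mass where P is low
    return (eta * forward_ce + (1.0 - eta) * reverse_ce).mean()

# Toy usage: random distributions over a vocabulary of 10 tokens.
p_logits = torch.randn(4, 10)
q_logits = torch.randn(4, 10, requires_grad=True)
loss = mix_ce(p_logits, q_logits, eta=0.7)
loss.backward()
```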