Coverage Embedding Models for Neural Machine Translation

EMNLP (2016)

Abstract
In this paper, we enhance the attention-based neural machine translation (NMT) by adding explicit coverage embedding models to alleviate issues of repeating and dropping translations in NMT. For each source word, our model starts with a full coverage embedding vector to track the coverage status, and then keeps updating it with neural networks as the translation goes. Experiments on the large-scale Chinese-to-English task show that our enhanced model improves the translation quality significantly on various test sets over the strong large vocabulary NMT system.
Keywords
neural machine translation, embedding, coverage, model
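
As a rough illustration of the mechanism the abstract describes, the sketch below keeps one coverage embedding per source word, initialized to a "full" vector and updated at each decoding step by a recurrent cell. This is a minimal sketch assuming PyTorch; the dimensions, the choice of a GRU cell, its inputs (the attention weight concatenated with the decoder state), and all names (`CoverageEmbedding`, `init_state`, `step`) are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class CoverageEmbedding(nn.Module):
    """Per-source-word coverage embeddings, loosely following the paper:
    each source word starts from a 'full' coverage vector and is updated
    after every decoding step. All dimensions are hypothetical."""

    def __init__(self, cov_dim: int, dec_dim: int, vocab_size: int):
        super().__init__()
        # One initial ("full") coverage embedding per source word type.
        self.init_cov = nn.Embedding(vocab_size, cov_dim)
        # GRU cell consuming [attention weight; decoder state] per word.
        self.update = nn.GRUCell(1 + dec_dim, cov_dim)

    def init_state(self, src_ids: torch.Tensor) -> torch.Tensor:
        # src_ids: (batch, src_len) -> (batch, src_len, cov_dim)
        return self.init_cov(src_ids)

    def step(self, cov: torch.Tensor, attn: torch.Tensor,
             dec_state: torch.Tensor) -> torch.Tensor:
        # cov: (batch, src_len, cov_dim), attn: (batch, src_len),
        # dec_state: (batch, dec_dim), broadcast to every source position.
        b, n, d = cov.shape
        inp = torch.cat(
            [attn.unsqueeze(-1), dec_state.unsqueeze(1).expand(b, n, -1)],
            dim=-1,
        )
        # Run the GRU cell over all (batch, position) pairs at once.
        new_cov = self.update(inp.reshape(b * n, -1), cov.reshape(b * n, d))
        return new_cov.view(b, n, d)


if __name__ == "__main__":
    cov_model = CoverageEmbedding(cov_dim=8, dec_dim=16, vocab_size=100)
    src = torch.randint(0, 100, (2, 5))          # batch of 2, 5 source words
    cov = cov_model.init_state(src)              # "full" coverage per word
    attn = torch.softmax(torch.randn(2, 5), -1)  # one step's attention weights
    dec = torch.randn(2, 16)
    cov = cov_model.step(cov, attn, dec)         # coverage after one step
    print(cov.shape)                             # torch.Size([2, 5, 8])
```

In the paper's setting, the updated coverage vector for each source word would additionally feed the attention scoring function at the next step, so that words whose coverage is depleted attract less attention, which is the intended remedy for repeated and dropped translations.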