Study of Neural Machine Translation With Long Short-Term Memory Techniques

Deep Learning Research Applications for Natural Language Processing, Advances in Computational Intelligence and Robotics (2022)

Abstract
The growing demand for conversation among people from different regions across the globe, driven by globalization, has led to the development of machine translation systems. Techniques such as statistical and Bayesian models were used for machine translation earlier. However, with rising expectations of better accuracy, neural-network-aided translation systems, termed neural machine translation (NMT), have emerged. Models proposed by several organizations, such as Google NMT (G-NMT), are widely accepted and implemented, and several machine translation systems are also based on RNN models. This work studies neural machine translation with respect to long short-term memory (LSTM) networks and compares models on the basis of several widely accepted accuracy metrics: BLEU score, precision, recall, and F1 score. Further, a combination of two LSTM models is implemented for better accuracy, and the various LSTM models are analyzed on the basis of these metrics.
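The metrics named in the abstract (BLEU, precision, recall, F1) are standard and can be illustrated without reference to the paper's models. The sketch below implements them from their textbook definitions for a single candidate/reference sentence pair; the sentences, whitespace tokenization, and smoothing constant are illustrative assumptions, not details from the paper.

```python
# Hedged sketch: token-level BLEU, precision, recall, and F1 for one
# candidate translation against one reference, from standard definitions.
import math
from collections import Counter

def ngrams(tokens, n):
    # All contiguous n-grams of the token list.
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU: geometric mean of modified n-gram
    precisions (uniform weights) times a brevity penalty.
    Zero counts are smoothed with a small epsilon (an assumption)."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(ngrams(candidate, n))
        ref = Counter(ngrams(reference, n))
        overlap = sum((cand & ref).values())   # clipped n-gram matches
        total = max(sum(cand.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)
    if len(candidate) >= len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

def token_prf(candidate, reference):
    """Unigram precision, recall, and F1 with clipped token counts."""
    cand, ref = Counter(candidate), Counter(reference)
    tp = sum((cand & ref).values())
    precision = tp / max(sum(cand.values()), 1)
    recall = tp / max(sum(ref.values()), 1)
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Illustrative sentence pair (not from the paper).
reference = "the cat sat on the mat".split()
candidate = "the cat is on the mat".split()
print("BLEU:", round(bleu(candidate, reference), 3))
print("P/R/F1:", [round(x, 3) for x in token_prf(candidate, reference)])
```

In practice, toolkit implementations (e.g. NLTK's `sentence_bleu` or sacreBLEU) apply corpus-level aggregation and standardized tokenization, so scores from this sketch will not match them exactly.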
Keywords
neural machine translation, long short-term memory