List: Machine Translation (4)
while (1): study();

Source: https://dblp.org/rec/conf/aaai/WangXZBQLL18.html
dblp: Dual Transfer Learning for Neural Machine Translation with Marginal Distribution Regularization (AAAI 2018).

Source: https://arxiv.org/abs/1611.00179
Dual Learning for Machine Translation. From the abstract: while neural machine translation (NMT) has made good progress in the past two years, tens of millions of bilingual sentence pairs are needed for its training, and human labeling is very costly; to tackle this training-data bottleneck, the authors develop a dual-learning mechanism. Now that the machine translator implementation is finished, the plan is to try fine-tuning with dual learning in a Colab environment… (a minimal sketch of the dual-learning game follows below).
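Below is a minimal, self-contained sketch of the two-agent game described in the paper. Every class here is a hypothetical stub: a real setup needs two trained NMT models and a pretrained language model of the intermediate language, updated by policy gradient (REINFORCE). This only illustrates the reward flow, not a working trainer.

    import random

    ALPHA = 0.5  # weight between language-model reward and reconstruction reward

    class StubTranslator:
        """Hypothetical stand-in for a trained NMT model."""
        def sample_translation(self, sentence):
            return sentence[::-1]      # fake "translation", for illustration only
        def log_prob(self, target, given):
            return -random.random()    # fake log P(target | given)
        def policy_update(self, src, hyp, reward):
            pass                       # a REINFORCE gradient step would go here

    class StubLM:
        """Hypothetical stand-in for a pretrained language model of language B."""
        def log_prob(self, sentence):
            return -random.random()

    def dual_learning_step(sentence_a, fwd, bwd, lm_b):
        """One round of the game on a single monolingual sentence of language A."""
        mid_b = fwd.sample_translation(sentence_a)     # agent 1: translate A -> B
        r_lm = lm_b.log_prob(mid_b)                    # reward 1: fluency of mid_b in B
        r_rec = bwd.log_prob(sentence_a, given=mid_b)  # reward 2: can we reconstruct A?
        reward = ALPHA * r_lm + (1 - ALPHA) * r_rec
        fwd.policy_update(sentence_a, mid_b, reward)   # update A->B on the total reward
        bwd.policy_update(mid_b, sentence_a, r_rec)    # update B->A on reconstruction
        return reward

    if __name__ == "__main__":
        fwd, bwd, lm_b = StubTranslator(), StubTranslator(), StubLM()
        print(dual_learning_step("monolingual source sentence", fwd, bwd, lm_b))

The appeal for fine-tuning is that the loop consumes only monolingual data: each round trip A → B → A produces its own training signal.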

Source: https://arxiv.org/abs/1808.09381
Understanding Back-Translation at Scale. From the abstract: an effective method to improve neural machine translation with monolingual data is to augment the parallel training corpus with back-translations of target-language sentences; this work broadens the understanding of back-translation and investigates a number of methods to generate synthetic source sentences. Published in 2018 as joint research by Facebook and Google; the linked post walks through it starting from the introduction. A sketch of the augmentation loop follows below.
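The augmentation loop itself is short. Here is a minimal sketch assuming the Helsinki-NLP Marian checkpoints on the Hugging Face Hub; the checkpoint name and the de-en direction are illustrative choices, not something from the paper or the post.

    from transformers import MarianMTModel, MarianTokenizer

    # Reverse model: target language (en) -> source language (de),
    # used to manufacture synthetic sources for a de->en system.
    REVERSE = "Helsinki-NLP/opus-mt-en-de"
    tok = MarianTokenizer.from_pretrained(REVERSE)
    model = MarianMTModel.from_pretrained(REVERSE)

    # Monolingual target-language sentences (normally millions of them).
    mono_en = [
        "Back-translation augments the parallel corpus.",
        "Synthetic source sentences are paired with real targets.",
    ]

    batch = tok(mono_en, return_tensors="pt", padding=True, truncation=True)
    # The paper reports that sampled back-translations beat beam search at
    # scale, hence do_sample=True instead of greedy or beam decoding.
    ids = model.generate(**batch, do_sample=True, top_k=10, max_new_tokens=64)
    synthetic_de = tok.batch_decode(ids, skip_special_tokens=True)

    # (synthetic source, real target) pairs get appended to the de->en
    # parallel training set alongside the genuine bitext.
    for src, tgt in zip(synthetic_de, mono_en):
        print(f"{src}\t{tgt}")

Note the asymmetry: the synthetic side is always the source, so the decoder still only ever sees real target-language sentences during training.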

arXiv link: https://arxiv.org/abs/1609.08144
Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation. From the abstract: Neural Machine Translation (NMT) is an end-to-end learning approach for automated translation, with the potential to overcome many of the weaknesses of conventional phrase-based translation systems; unfortunately, NMT systems are known to be computationally expensive both in training and in translation inference.
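As a reminder of what "end-to-end" means here, below is a toy single-layer encoder-decoder with dot-product attention in PyTorch. It only sketches the general architecture family: GNMT itself stacks eight LSTM layers with residual connections and operates on wordpieces, none of which is reproduced here.

    import torch
    import torch.nn as nn

    class TinySeq2Seq(nn.Module):
        def __init__(self, src_vocab, tgt_vocab, dim=64):
            super().__init__()
            self.src_emb = nn.Embedding(src_vocab, dim)
            self.tgt_emb = nn.Embedding(tgt_vocab, dim)
            self.encoder = nn.LSTM(dim, dim, batch_first=True)
            self.decoder = nn.LSTM(dim, dim, batch_first=True)
            self.out = nn.Linear(2 * dim, tgt_vocab)  # [decoder state; context]

        def forward(self, src_ids, tgt_ids):
            enc_out, state = self.encoder(self.src_emb(src_ids))
            dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)
            # Dot-product attention: each decoder step attends over all
            # encoder states and mixes them into a context vector.
            scores = torch.bmm(dec_out, enc_out.transpose(1, 2))  # (B, T_tgt, T_src)
            context = torch.bmm(torch.softmax(scores, dim=-1), enc_out)
            return self.out(torch.cat([dec_out, context], dim=-1))  # vocab logits

    if __name__ == "__main__":
        model = TinySeq2Seq(src_vocab=100, tgt_vocab=100)
        src = torch.randint(0, 100, (2, 7))   # batch of source token ids
        tgt = torch.randint(0, 100, (2, 5))   # shifted target token ids
        print(model(src, tgt).shape)          # torch.Size([2, 5, 100])

The whole pipeline from source token ids to target-vocabulary logits is one differentiable module trained on sentence pairs, which is what lets NMT dispense with the separate components of phrase-based systems.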