
Neural Machine Translation Model with a Large Vocabulary Selected by Branching Entropy

Abstract · Apr 14, 2017 19:35

nmt tsukuba vocabulary patent phrases luong words japan translation cs-cl

Arxiv Abstract

  • Zi Long
  • Takehito Utsuro
  • Tomoharu Mitsuhashi
  • Mikio Yamamoto

Neural machine translation (NMT), a new approach to machine translation, has achieved promising results comparable to those of traditional approaches such as statistical machine translation (SMT). Despite its recent success, NMT cannot handle a large vocabulary, because both training and decoding complexity increase proportionally with the number of target words. This problem becomes even more serious when translating patent documents, which contain many technical terms that are observed infrequently. In this paper, we propose to select phrases that contain out-of-vocabulary words using the statistical measure of branching entropy. This allows the proposed NMT system to be applied to the translation task of any language pair without language-specific knowledge for identifying technical terms. The selected phrases are replaced with placeholder tokens during training and are post-translated using the phrase translation table of SMT. Evaluation on Japanese-to-Chinese and Chinese-to-Japanese patent sentence translation demonstrates the effectiveness of phrases selected with branching entropy: the proposed NMT system achieves a substantial improvement over a baseline NMT system without the proposed technique.
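To make the phrase-selection idea concrete, here is a minimal Python sketch of right branching entropy, H(x) = -Σ_c P(c|x) log₂ P(c|x), where c ranges over tokens observed immediately after an n-gram x. This is not the authors' exact procedure; the `select_phrase` helper, the threshold value, and the `<TT_1>` placeholder name are all hypothetical illustrations of the general approach (low entropy suggests we are still inside a phrase; a spike suggests a boundary).

```python
from collections import defaultdict
from math import log2

def branching_entropy(corpus, ngram):
    """Right branching entropy H(x) = -sum_c P(c|x) log2 P(c|x),
    where c ranges over tokens seen immediately after the n-gram x."""
    followers = defaultdict(int)
    n = len(ngram)
    for sent in corpus:
        for i in range(len(sent) - n):
            if tuple(sent[i:i + n]) == ngram:
                followers[sent[i + n]] += 1
    total = sum(followers.values())
    if total == 0:
        return 0.0
    return -sum((k / total) * log2(k / total) for k in followers.values())

def select_phrase(sent, oov_idx, corpus, threshold=1.0):
    """Hypothetical helper: grow a phrase rightward from an OOV token
    while branching entropy at the candidate boundary stays low; stop
    when entropy spikes (many possible continuations = likely boundary)."""
    end = oov_idx + 1
    while end < len(sent):
        if branching_entropy(corpus, tuple(sent[oov_idx:end])) > threshold:
            break
        end += 1
    return sent[oov_idx:end]

# Toy patent-style corpus: "fuel injection" behaves as a cohesive phrase.
corpus = [["the", "fuel", "injection", "valve", "opens"],
          ["a", "fuel", "injection", "pump", "is", "used"],
          ["fuel", "injection", "timing", "is", "controlled"]]

sent = corpus[0]
i = sent.index("fuel")                      # pretend "fuel" is out-of-vocabulary
phrase = select_phrase(sent, i, corpus)     # -> ["fuel", "injection"]
masked = sent[:i] + ["<TT_1>"] + sent[i + len(phrase):]
print(phrase, masked)                       # the masked sentence is fed to NMT
```

In the paper's pipeline, sentences with such placeholder tokens are used for NMT training and decoding, and the placeholders are afterwards translated back via an SMT phrase translation table (not shown here).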

Read the paper (pdf) »