Zero-Shot Translation with Google Neural Machine Translation System

Google announced late last year that it had applied machine learning to its Google Translate service, resulting in a neural network capable of "zero-shot" translation.

Zero-shot translation means translating phrases between language pairs for which no explicit training data or mapping exists. The trained neural network surprised researchers when evidence of an interlingua emerged as a path for translating previously unpaired languages and phrases. The researchers indicated that data visualizations of the new system in action provide early evidence of a shared semantic representation, or interlingua, across languages. This is presented as evidence that the neural network developed its own internal representation for more efficient translation.

Over the past 10 years, Google Translate has grown from a handful of languages to 103 supported languages, translating over 140 billion words per day. The motivation for applying neural networks to improve accuracy and efficiency comes from the many successes of neural networks in other fields.

A key question presented in the findings is whether the system can translate between a language pair it was never explicitly trained on, but which has some secondary path connecting it, for example training on English-to-Korean and Korean-to-Japanese and then inferring English-to-Japanese.
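That bridging idea can be pictured as a graph over the language pairs seen in training. The sketch below is purely illustrative (the bridging_path helper and the language codes are not part of Google's system); it simply checks whether a secondary path exists between two languages that were never paired directly.

from collections import deque

def bridging_path(trained_pairs, src, tgt):
    """Return a chain of languages linking src to tgt using only
    language pairs seen during training, if one exists."""
    # Build an undirected graph of trained language pairs.
    graph = {}
    for a, b in trained_pairs:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)

    # Breadth-first search for the shortest bridge from src to tgt.
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == tgt:
            return path
        for nxt in graph.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Trained on English<->Korean and Korean<->Japanese only;
# English-to-Japanese is the zero-shot pair.
pairs = [("en", "ko"), ("ko", "ja")]
print(bridging_path(pairs, "en", "ja"))  # ['en', 'ko', 'ja']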

The team claimed it could implement zero-shot translation without changing the core Translate model, which comprises an encoder, a decoder, and an attention mechanism. Published details of GNMT indicate the use of a bidirectional recurrent neural network that encodes source words and passes them to a decoder predicting target-language words, but it is not immediately clear how this model differs from the zero-shot translation model detailed in a recent research publication. Google claimed that GNMT:

...on the WMT'14 English-to-French and English-to-German benchmarks, GNMT achieves competitive results to state-of-the-art. Using a human side-by-side evaluation on a set of isolated simple sentences, it reduces translation errors by an average of 60% compared to Google's phrase-based production system.

Input sentences are tagged with the target language, but not the source language. Google noted that:

Not specifying the source language has the potential disadvantage that words with the same spelling but different meaning from different source languages can be ambiguous to translate, but the advantage is that it is simpler and we can handle input with code-switching. We find that in almost all cases, context provides enough language evidence to produce the correct translation.
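In practice, this tagging amounts to a small preprocessing step: an artificial token naming the desired target language is prepended to the source text. The sketch below is a minimal illustration of that idea, assuming a <2xx> token format similar to the one described in the research paper; the helper name is an assumption, not Google's exact code.

def tag_for_target(source_sentence: str, target_lang: str) -> str:
    """Prepend an artificial target-language token to the source text.
    Only the desired *target* language is encoded; the source language
    is left for the model to infer from context."""
    return f"<2{target_lang}> {source_sentence}"

# The same English input, routed to different target languages.
print(tag_for_target("How are you?", "es"))  # <2es> How are you?
print(tag_for_target("How are you?", "ja"))  # <2ja> How are you?

# Code-switched input is handled the same way, since no source tag is needed.
print(tag_for_target("Let's meet mañana", "ko"))  # <2ko> Let's meet mañana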

The zero-shot network:

...consists of a deep LSTM network with 8 encoder and 8 decoder layers using attention and residual connections ... [and] learn(s) to perform implicit bridging between language pairs never seen explicitly during training, showing that transfer learning and zero-shot translation is possible for neural translation.
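For readers unfamiliar with those terms, the following is a rough PyTorch sketch of what a stack of LSTM layers with residual connections looks like. The layer count and sizes are made up for illustration, and the bidirectional first layer, the attention mechanism, and the decoder described in the paper are omitted; this is not Google's implementation.

import torch
import torch.nn as nn

class ResidualLSTMEncoder(nn.Module):
    """Stack of single-layer LSTMs with residual connections between
    layers, loosely echoing the 8-layer encoder described above."""
    def __init__(self, vocab_size=32000, hidden_size=512, num_layers=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.layers = nn.ModuleList(
            nn.LSTM(hidden_size, hidden_size, batch_first=True)
            for _ in range(num_layers)
        )

    def forward(self, token_ids):
        x = self.embed(token_ids)                # (batch, seq, hidden)
        for i, lstm in enumerate(self.layers):
            out, _ = lstm(x)
            # Residual connection: add the layer input to its output
            # (commonly skipped for the earliest layers).
            x = out + x if i > 0 else out
        return x                                 # per-token encoder states

# Encode a toy batch of token ids; a decoder with attention would
# consume these states to predict target-language tokens.
tokens = torch.randint(0, 32000, (2, 7))
states = ResidualLSTMEncoder()(tokens)
print(states.shape)  # torch.Size([2, 7, 512])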

While an immediate improvement in translation quality was evident, scalability remains the focus of continued efforts to have the solution support all 103 languages. The publicly available production version of the GNMT system serves 10 of the 16 recent language additions.
