Facebook Publishes New Neural Machine Translation Algorithm

Facebook’s Artificial Intelligence Research team has published results for a new approach to neural machine translation (NMT). Their algorithm scores higher than any other system on three established machine-translation tasks, and runs nine times faster than Google's NMT system.

Facebook’s approach uses convolutional neural networks, a technique popular in the field of computer vision. These networks process sentences hierarchically, which lets them capture complex relations within a sentence. Facebook trains the network to build a representation of the meaning of each part of a sentence (spans of two, three, four, or even more adjacent words). By processing the sentence with these networks, the computer gets a notion of what every part of the sentence means. A second neural network then turns this representation of meaning back into another language.
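As a rough illustration of the idea (a minimal sketch with toy random embeddings, not Facebook's actual model), a single convolution can turn a sequence of word vectors into one vector per span of three adjacent words:

```python
import numpy as np

# Hedged sketch: one 1-D convolution over word embeddings, producing a
# hidden representation for every 3-word span of the sentence.
rng = np.random.default_rng(0)

sentence = ["the", "cat", "sat", "on", "the", "mat"]
embed_dim, hidden_dim, width = 8, 4, 3

# Toy embedding table: each distinct word maps to a random vector.
vocab = {w: rng.standard_normal(embed_dim) for w in set(sentence)}
embedded = np.stack([vocab[w] for w in sentence])  # shape (6, 8)

# Convolution kernel spanning `width` consecutive word embeddings.
kernel = rng.standard_normal((width * embed_dim, hidden_dim))

# Slide the window over the sentence; each span becomes one hidden vector.
spans = [embedded[i:i + width].reshape(-1)
         for i in range(len(sentence) - width + 1)]
hidden = np.tanh(np.stack(spans) @ kernel)  # shape (4, 4): one row per span

print(hidden.shape)  # (4, 4)
```

Stacking further convolutional layers on top of these span vectors is what gives the hierarchical view the article describes: higher layers see wider and wider stretches of the sentence.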

The main advantage of the convolutional method is that it can be applied to multiple parts of a sentence at the same time. Traditional NMT methods read a sentence word by word, remembering the meaning of the sentence up to that point, so their speed is limited by this sequential reading no matter how fast the hardware is. As a result, Facebook’s algorithm is up to nine times faster than the sequential reading methods.
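The contrast can be sketched in a few lines (toy random weights, not either system's real code): a recurrent step depends on the hidden state from the previous word and so must run in order, while a convolution-style transform touches every position in one matrix operation:

```python
import numpy as np

# Hedged illustration of sequential vs. parallel processing.
rng = np.random.default_rng(2)

seq_len, dim = 6, 4
inputs = rng.standard_normal((seq_len, dim))  # one row per word
W_in = rng.standard_normal((dim, dim))
W_h = rng.standard_normal((dim, dim))

# Sequential (RNN-style): each step needs the hidden state from the
# previous step, so the loop cannot be parallelized across positions.
h = np.zeros(dim)
for x in inputs:
    h = np.tanh(x @ W_in + h @ W_h)

# Parallel (convolution-style): every position is transformed at once
# in a single matrix multiply, with no dependency between positions.
parallel = np.tanh(inputs @ W_in)

print(parallel.shape)  # (6, 4)
```

On GPUs, which excel at large batched matrix operations, this independence between positions is what translates into the reported speedup.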

They also introduced a new technique called "multi-hop." Instead of reading the whole sentence and then writing out the whole translated sentence, the network chooses which words of the original text to focus on while translating word by word. Multi-hop is a smarter, more elaborate alternative to standard "attention" mechanisms. Attention mechanisms are the key to the "multiple meanings of words" problem: depending on its context, a word can have different translations. Attention solves this by focusing, while translating each word, on the relevant parts of the source sentence to determine a good translation.
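A basic attention step of the kind described above can be sketched as follows (a toy dot-product attention with random vectors; the paper's multi-hop variant is a more elaborate version of this mechanism, not shown here):

```python
import numpy as np

# Hedged sketch of one dot-product attention step while emitting a
# single target word.
rng = np.random.default_rng(1)

src_len, dim = 5, 8
source_states = rng.standard_normal((src_len, dim))  # encoded source words
decoder_state = rng.standard_normal(dim)             # state for current target word

# Score each source position against the decoder state, then normalize
# the scores into a probability distribution (softmax).
scores = source_states @ decoder_state
weights = np.exp(scores - scores.max())
weights /= weights.sum()

# The context vector is a weighted average of the source states; the
# positions with the highest weight are the words the model "focuses on".
context = weights @ source_states

print(weights.shape, context.shape)  # (5,) (8,)
```

Because the weights depend on the current decoder state, the same source word can receive a different emphasis for each target word, which is how context disambiguates words with multiple translations.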

Facebook plans to use the new approach for other text-processing tasks as well, such as using neural networks to answer questions: the approach makes it possible to focus on distinct parts of a conversation simultaneously. They describe their complete approach in a blog post and a freely accessible paper, and people who want to try the algorithm can download the code for free from GitHub.

The new algorithm performed better than any other algorithm on the standard English-French, English-German, and English-Romanian tasks, beating Google’s neural network for machine translation. Google's algorithm is available to everyone through the Google Translate SDK, which supports 20 language pairs. People who want to see the difference between NMT and the older methods can pick the translations they like best using Microsoft’s Bing translation.

