A little over a month after it began learning to translate more languages beyond Spanish, Google's recently announced Neural Machine Translation system has used deep learning to develop its own internal language. TechCrunch reports:

GNMT's creators were curious about something. If you teach the translation system to translate English to Korean and vice versa, and also English to Japanese and vice versa… could it translate Korean to Japanese without resorting to English as a bridge between them? The researchers call this idea "zero-shot translation." As it turns out, yes: the system produces "reasonable" translations between two languages that it has never explicitly linked in any way. Remember, no English allowed.

But this raised a second question. If the computer is able to make connections between concepts and words that have not been formally linked, does that mean it has formed a concept of shared meaning for those words, a meaning at a deeper level than simply that one word or phrase is the equivalent of another? In other words, has the computer developed its own internal language to represent the concepts it uses to translate between other languages? Based on how various sentences are related to one another in the memory space of the neural network, Google's language and AI boffins think that it has. The paper describing the researchers' work (primarily on efficient multi-language translation, but touching on the mysterious interlingua) can be read at arXiv.
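For readers wondering how a single model can be told which language to produce at all, the paper's mechanism is simple: an artificial token prepended to the source sentence (e.g. "<2es>" for "translate to Spanish"), with no explicit source-language marker. The sketch below illustrates that data convention in Python; the helper function, toy sentence pairs, and print statement are illustrative assumptions, not Google's code, and the model itself is omitted.

```python
# A minimal sketch of the target-token scheme from the multilingual
# NMT paper (Johnson et al., arXiv:1611.04558). One shared model is
# trained on all language pairs at once; the only signal for the
# desired output language is a token like "<2ja>" prepended to the
# source text. Everything below is a hypothetical illustration.

def make_example(source_text: str, target_lang: str) -> str:
    """Prepend the artificial target-language token to a source sentence."""
    return f"<2{target_lang}> {source_text}"

# Training data covers English<->Korean and English<->Japanese only;
# no Korean<->Japanese pairs ever appear.
training_pairs = [
    (make_example("How are you?", "ko"), "잘 지내세요?"),
    (make_example("잘 지내세요?", "en"), "How are you?"),
    (make_example("How are you?", "ja"), "お元気ですか?"),
    (make_example("お元気ですか?", "en"), "How are you?"),
]

# A zero-shot request: Korean in, Japanese out. Since ko->ja was never
# trained, the shared internal representation must bridge the gap.
zero_shot_input = make_example("잘 지내세요?", "ja")
print(zero_shot_input)  # "<2ja> 잘 지내세요?"
```

Because every pair flows through the same encoder and decoder, sentences with the same meaning end up near one another in the network's internal representation regardless of language, which is what makes the zero-shot Korean-to-Japanese request answerable at all.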