Imagine visiting Japan and being able to read and understand signs with the same level of comprehension as a native speaker—without having to take years of Japanese lessons. That vision is now a reality with the latest update of Google’s Translate app.
Google Translate’s Word Lens augmented reality (AR) feature now allows users to point their smartphone cameras at Japanese text and instantly see English translations on the screen, as reported by Mac Rumors. The feature also works in reverse, allowing Japanese visitors to understand signs and other text in English-speaking countries.
Before the update, users could take a picture of Japanese text with their smartphone camera and see an English translation. However, the AR capabilities of Word Lens provide a much faster and more convenient way to translate signs, helping visitors orient themselves more easily, according to Google.
The Word Lens feature allows instant text translation in 30 languages. In camera mode, users can take pictures to get higher-quality translations in 37 languages, the company added.
The Google Translate app is available on both Android and iOS.
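Word Lens itself is available only inside the Translate app, but developers who want similar Japanese-to-English text translation in their own software can reach Google’s translation capability through the separate Google Cloud Translation REST API. The sketch below, under the assumption that you have an API key (the `API_KEY` value is a placeholder), builds a v2 translation request without sending it:

```python
# Sketch: constructing a Japanese-to-English request for the
# Google Cloud Translation API (v2). API_KEY is a placeholder --
# a real key from the Google Cloud console would be required to
# actually send this request.
import json
from urllib.parse import urlencode

API_ENDPOINT = "https://translation.googleapis.com/language/translate/v2"
API_KEY = "YOUR_API_KEY"  # placeholder, not a real credential

def build_translate_request(text, source="ja", target="en"):
    """Return the URL and JSON body for a text-translation call."""
    url = f"{API_ENDPOINT}?{urlencode({'key': API_KEY})}"
    body = json.dumps({
        "q": text,          # the text to translate
        "source": source,   # source language code (ja = Japanese)
        "target": target,   # target language code (en = English)
        "format": "text",   # plain text, not HTML
    })
    return url, body

# Example: a word commonly seen on Japanese signage ("exit")
url, body = build_translate_request("出口")
print(url)
print(body)
```

Sending the body as an HTTP POST to the returned URL (for example with the `requests` library) would yield the translated text in the JSON response.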
Google recently completed a major upgrade of its Translate app using neural network technology. The company’s neural network took a fresh approach to translation, developing its own internal shared representation of language—a kind of lingua franca—to provide a more sophisticated form of interpretation. This shared representation allowed the neural network to identify commonalities between sentences and perform translations between multiple languages, even language pairs it wasn’t explicitly trained to support.
With this advance, it seems likely that Google will upgrade Word Lens to provide instant translations for more languages in the future.
Pratik Dhebri is senior director of product management at Veritone. He works with AI developers and cognitive engine providers to architect and monetize algorithms and applications. Learn more about our platform and join the Veritone developer ecosystem today.