Innovation: Welocalize NLP Experts Customize NMT with Google AutoML Translate

Welocalize | September 5, 2018

At the Google Cloud Next ’18 conference, the Google AutoML product team featured Welocalize’s experiments with training and customizing Google’s AutoML Translate engines for numerous, disparate domains and languages. AutoML Translate is a solution for customizing neural machine translation engines for specific industries and domains. Welocalize evaluated AutoML Translate for scale, speed, and accuracy against generic neural engines, as well as customized neural and statistical engines. The results indicate immediate, practical application for Welocalize clients who see machine translation (MT) as a key component of their enterprise localization program.
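To illustrate how a customized engine fits into a translation workflow, the sketch below calls the Cloud Translation API with a custom AutoML model. It is a minimal example only: the project ID, model ID, and sample sentence are placeholders, and it assumes a custom model has already been trained in AutoML Translate.

```python
# Minimal sketch: translating text with a custom AutoML model through the
# Cloud Translation client library. Project ID and model ID are hypothetical
# placeholders; a custom model must already exist in the project.
from google.cloud import translate_v3 as translate

project_id = "my-gcp-project"   # placeholder project
location = "us-central1"        # custom AutoML models are hosted in us-central1
model_id = "TRL1234567890"      # placeholder trained model ID

client = translate.TranslationServiceClient()
parent = f"projects/{project_id}/locations/{location}"

response = client.translate_text(
    request={
        "parent": parent,
        "contents": ["The restaurant was cosy and the staff were friendly."],
        "mime_type": "text/plain",
        "source_language_code": "en",
        "target_language_code": "fr",
        # Omitting "model" falls back to the generic pre-trained NMT engine,
        # which is how a custom engine can be compared against the baseline.
        "model": f"{parent}/models/{model_id}",
    }
)

for translation in response.translations:
    print(translation.translated_text)
```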

EARLY ACCESS PARTNER FOR GOOGLE AUTOML TRANSLATE

“Welocalize brings a great deal of neural machine translation experience, helping enterprise customers localize their content for global audiences. We’re delighted to work with Welocalize as an early access partner on Google AutoML Translate. They’ve helped us develop a better understanding of the NMT space, specifically with the process of gathering translated sentence pairs for training models and analyzing the quality of models after training. We look forward to continuing to work with the Welocalize team and advancing Google’s NMT and natural language processing offerings.” – Sara Robinson, Developer Advocate, Google
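To give a sense of the sentence-pair gathering step mentioned above, here is a minimal sketch of writing aligned source and target segments to the tab-separated format that AutoML Translate accepts as training data. The sentence pairs and file name are illustrative placeholders.

```python
# Minimal sketch: writing aligned sentence pairs to a tab-separated file
# suitable for uploading as an AutoML Translate training dataset.
# The pairs below are illustrative placeholders, not real training data.
import csv

sentence_pairs = [
    ("Submit your assignment before the deadline.",
     "Envoyez votre devoir avant la date limite."),
    ("The instructor posted new course materials.",
     "L'instructeur a publié de nouveaux supports de cours."),
]

with open("training_pairs.tsv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f, delimiter="\t")
    for source, target in sentence_pairs:
        writer.writerow([source, target])  # one source/target pair per line
```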

Welocalize evaluated AutoML Translate for two clients: Blackboard and OpenTable.

Results of these evaluations were presented at Google Cloud Next ’18 by Olga Beregovaya, Welocalize VP of Language Services, and Alex Yanishevsky, Senior Manager of MT and NLP Deployments.

Video: Welocalize Presentation at Google Cloud Next ’18 (hosted on Vimeo).

OPENTABLE EVALUATES NMT FOR USER GENERATED CONTENT (UGC)

OpenTable’s global audiences rely on restaurant reviews in their preferred language, whether it’s a French visitor choosing a cafe in Berlin or a UK diner searching for a Paris-based bistro. Training an NMT engine on the unique terminology and jargon particular to UGC produces translations with higher contextual accuracy and relevance. This means multilingual content is published more quickly, helping OpenTable keep pace with ever-increasing content volumes.

BLACKBOARD’S NMT EXPERIMENT

Blackboard develops online learning management systems used by enterprises like ADP, Bayer, and United Educators. Welocalize developed and has been managing Blackboard’s localization program since 2015.

The evaluation tested whether a customized NMT engine could reach a level of quality that minimizes post-editing. For Blackboard, reducing post-editing effort ultimately lowers translation costs through higher post-editing discounts. The customized MT output is much closer to the desired quality, primarily due to better fluency and correct lexical choices, which means lower post-editing costs. The initial results, both for the time it takes to train NMT engines using AutoML Translate and for the quality of the engine output, are promising.


Welocalize is recognized as a thought leader and innovation partner in the area of MT, managing some of the most advanced enterprise MT programs in the industry. As these clients work with Welocalize to advance their programs to include neural MT and other machine learning applications, Welocalize continuously evaluates partner solutions like Google AutoML Translate.

“Our collaboration with Google AutoML went way beyond bug fixing. We were able to share our in-depth expertise and help shape the product roadmap. Welocalize is a trusted advisor for enterprise organizations looking to integrate NMT into their commercial localization programs.” – Olga Beregovaya, VP of Language Services, Welocalize

Click here for more information or contact us directly to discuss your MT requirements.

Hear how Welocalize supports multilingual e-discovery with MT in this recent webinar: ‘Accelerating Global E-Discovery: MT + Litigation’. Click here to view.