Earlier Google Translate techniques, such as neural machine translation, rewriting-based paradigms, and on-device processing, drove the platform's growth, but gains had recently stalled. Given the magnitude of the problem, Google has made efforts that go beyond these earlier techniques.
Google is now rolling out enhancements aimed at higher-quality translation and a better user experience. It states that Translate has improved by an average of 5 BLEU points across all languages, and by 7 points across the lowest-resource languages. Furthermore, Translate has become more robust to machine-translation hallucinations, cases where the AI model produces strange, unrelated translations.
The new hybrid architecture is more stable than the four-year-old RNN-based models it replaces, and higher in translation quality.
Google also upgraded the old crawler that compiled training data. The new embedding-based miner, applied to 14 large language pairs, focuses more on precision than recall and increased the number of sentences extracted by 29%.
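To illustrate the precision-over-recall idea, here is a minimal sketch of embedding-based sentence-pair mining: candidate source/target pairs are scored by cosine similarity of their sentence embeddings, and only pairs above a strict threshold are kept. This is not Google's implementation; the function names, the toy embeddings, and the threshold value are all assumptions for illustration.

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def mine_pairs(src_embs, tgt_embs, threshold=0.9):
    """Keep only source/target sentence pairs whose embeddings are
    highly similar. A strict threshold trades recall for precision:
    fewer pairs are extracted, but they are cleaner."""
    pairs = []
    for i, u in enumerate(src_embs):
        # find the best-matching target sentence for this source sentence
        j, score = max(
            ((j, cosine(u, v)) for j, v in enumerate(tgt_embs)),
            key=lambda t: t[1],
        )
        if score >= threshold:  # strict cutoff: precision over recall
            pairs.append((i, j))
    return pairs
```

Raising `threshold` shrinks the mined corpus but raises its quality, which is the trade-off the new miner makes.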
Google implemented curriculum learning, an AI training approach for handling noisy data: the model is first trained on all of the data, then gradually on smaller, cleaner subsets.
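A minimal sketch of that schedule, assuming each example carries a hypothetical quality score in [0, 1] (not part of the original post), could look like this:

```python
def curriculum(examples, n_phases=3):
    """Yield training subsets phase by phase: start with everything
    (noisy examples included), then progressively restrict training
    to the cleanest examples.

    `examples` is a list of (sentence_pair, quality_score) tuples,
    where quality_score is an assumed cleanliness estimate."""
    ranked = sorted(examples, key=lambda e: e[1], reverse=True)
    for phase in range(n_phases):
        # phase 0 keeps all data; each later phase keeps a cleaner prefix
        keep = max(1, round(len(ranked) * (1 - phase / n_phases)))
        yield [pair for pair, _ in ranked[:keep]]
```

Each yielded subset would drive one training phase, so noisy data still contributes early on while the final phases see only clean pairs.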
For low-resource languages, Google deployed a back-translation scheme that augments parallel training data with synthetic examples generated from monolingual text.
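The core of back-translation is simple enough to sketch: real sentences in the target language are machine-translated back into the source language, and each (synthetic source, real target) pair is added to the training set. The toy word-for-word "reverse model" below is purely illustrative, not a real translation system.

```python
def back_translate(monolingual_tgt, reverse_translate):
    """Create synthetic parallel data: pair each real target-language
    sentence with a machine translation of itself back into the
    source language. `reverse_translate` stands in for any
    target-to-source model."""
    return [(reverse_translate(t), t) for t in monolingual_tgt]

# Toy reverse "model": a word-for-word dictionary lookup (illustration only).
TGT2SRC = {"hola": "hello", "mundo": "world"}

def toy_reverse(sentence):
    return " ".join(TGT2SRC.get(w, w) for w in sentence.split())
```

Because the target side of every synthetic pair is genuine human text, the forward model still learns to produce fluent output even though the source side is machine-generated.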
A single giant M4 model, trained on many languages jointly, enables transfer learning in Translate.
The tech giant now provides translations for even the lowest-resource of its 108 supported languages, and gathers user feedback through the Google Translate Community.
#AIMonks #AI #Google #GoogleTranslate #Translate #Languages #Data