OK, you have published valuable content on your website. How do you maximize its reach? One of the easiest ways is to go after new languages and foreign markets. How do you most efficiently get your web content to speak the language of foreign audiences while making sure you don't alienate them? Here are some key terms and tips for getting started with smart translation and localization. Let’s start with the obvious: there is a whole industry of professional website translation services that exists for exactly this purpose of globalizing valuable content. My own agency is in this field, and my experience as
An interesting blog post from Google AI/Google Translate about machine translation that draws on multiple languages at the same time, which can produce better translations, especially for languages with fewer resources to draw on. Excerpt:
Over the last few years there has been enormous progress in the quality of machine translation (MT) systems, breaking language barriers around the world thanks to the developments in neural machine translation (NMT). The success of NMT, however, owes largely to the great amounts of supervised training data. But what about languages where data is scarce, or even absent? Multilingual NMT, with the inductive bias that “the learning signal from one language should benefit the quality of translation to other languages”, is a potential remedy. Multilingual machine translation processes multiple languages using a single translation model. The success of multilingual training for data-scarce languages has been demonstrated for automatic speech recognition and text-to-speech systems, and by prior research on multilingual translation [1,2,3].

We previously studied the effect of scaling up the number of languages that can be learned in a single neural network, while controlling the amount of training data per language. But what happens once all constraints are removed? Can we train a single model using all of the available data, despite the huge differences across languages in data size, scripts, complexity and domains?

In “Massively Multilingual Neural Machine Translation in the Wild: Findings and Challenges” and follow-up papers [4,5,6,7], we push the limits of research on multilingual NMT by training a single NMT model on 25+ billion sentence pairs, from 100+ languages to and from English, with 50+ billion parameters. The result is an approach for massively multilingual, massive neural machine translation (M4) that demonstrates large quality improvements on both low- and high-resource languages and can be easily adapted to individual domains/languages, while showing great efficacy on cross-lingual downstream transfer tasks.
Read the whole post.
Today, a growing number of businesses are expanding into international markets and approaching foreign consumers who speak different languages, hoping to turn them into loyal customers and brand ambassadors. In a globalised world,
Over two decades ago, Boyd Tonkin founded Britain's first prize for literature in translation. In this extract from the introduction to his new book, he argues that we are always translating, even if we have never learned a single word of another tongue: from regions and dialects to communicating across classes, communities, genders and subcultures.
The chapter investigates the behavior of multi-word patterns in legal translation. Its objective is to explore the patterning of translator-mediated multilingual legislation with a view to gaining a better understanding of how the translation process