Top translation memory tips from localization professionals — 3 days ago

We asked our customers and colleagues for their top translation memory (TM) tips — things from their own experience that have helped make translating more effortless and more efficient. Download this short document and discover useful advice from translation professionals on how to get the best use out of your all-important TMs.

Translation is about making choices: Daniel Hahn — Delhi Post — 7 days ago

British writer, editor and translator Daniel Hahn translates mostly literary fiction from Portuguese, Spanish and French. He is known for his translation of ‘The Book of Chameleons’ by Portuguese-Angolan author José Eduardo Agualusa, which won the 2007 Independent Foreign Fiction Prize. Agualusa’s ‘A General Theory of Oblivion’, also translated by Hahn, was shortlisted for …

That Chinese "BreedReady" Database – Check The Translation | The Continental Telegraph — 9 days ago

A basic truism is that languages don't map exactly onto one another, and that's the most likely explanation for this database from China detailing "BreedReady" women. That languages don't map exactly should be obvious even to the most monolingual of English speakers. We all know that "Let's have lunch sometime", when said by an American, means "Hope to see you never, and definitely not while eating". Similarly, "That's lovely", when said by a Brit, does not necessarily mean it is lovely, and "How quaint" isn't praise for the cuteness of the thing. A Californian invocation to meet Tuesday is in fact a rumination on the possible non-existence of Tuesday. Thus we shouldn't be taking the naming of this database all that seriously. That they're "ready to breed" isn't quite what the database is recording; "presumed fertile" might be a more reasonable English version of the term. And there's no particular reason to think that it must be about breeding either, at least not directly. If

Better Language Models and Their Implications — 12 days ago

We’ve trained a large-scale unsupervised language model which generates coherent paragraphs of text, achieves state-of-the-art performance on many language modeling benchmarks, and performs rudimentary reading comprehension, machine translation, question answering, and summarization — all without task-specific training. Our model, called GPT-2 (a successor to GPT), was