WIT3

Web Inventory of Transcribed and Translated Talks

2011-01 MT results

For the language pairs of the IWSLT 2011 evaluation campaign, results of baseline SMT systems are provided below.

The baseline systems were built with the open-source MT toolkit Moses in a fairly standard configuration; see the WIT3 paper for details.
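As a rough guide to what such a baseline looks like, the Python sketch below drives a typical phrase-based Moses pipeline (shown for Arabic-English). All paths, the 5-gram language model, tuning on dev2010 and the remaining settings are illustrative assumptions, not the exact WIT3 configuration described in the paper.

"""Rough sketch of a phrase-based Moses baseline (shown for ar -> en).

Paths, the 5-gram language model, tuning on dev2010 and every other setting
are illustrative assumptions, not the exact WIT3 configuration.
"""
import subprocess

MOSES = "/opt/mosesdecoder"        # assumed location of a Moses checkout
SCRIPTS = MOSES + "/scripts"

def sh(cmd):
    """Run a shell command and stop on the first error."""
    subprocess.run(cmd, shell=True, check=True)

# 1. Tokenize the parallel training data (TED talks of the 2011-01 release).
for lang in ("ar", "en"):
    sh(f"{SCRIPTS}/tokenizer/tokenizer.perl -l {lang} "
       f"< train.{lang} > train.tok.{lang}")

# 2. Build translation and lexicalized-reordering models on top of GIZA++
#    alignments; lm.en.arpa is an already-trained target language model.
sh(f"{SCRIPTS}/training/train-model.perl -root-dir work "
   f"-corpus train.tok -f ar -e en "
   f"-alignment grow-diag-final-and -reordering msd-bidirectional-fe "
   f"-lm 0:5:$PWD/lm.en.arpa:8 -external-bin-dir {MOSES}/tools")

# 3. Tune the log-linear weights with MERT on the dev set.
sh(f"{SCRIPTS}/training/mert-moses.pl dev2010.tok.ar dev2010.tok.en "
   f"{MOSES}/bin/moses work/model/moses.ini --mertdir {MOSES}/bin")

# 4. Translate the tst2010 evaluation set with the tuned configuration.
sh(f"{MOSES}/bin/moses -f mert-work/moses.ini "
   f"< tst2010.tok.ar > tst2010.hyp.en")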

Performance in terms of BLEU% and TER was computed with the MultEval software and refers to the tst2010 evaluation set. The automatic translations are linked to the entries below; just click an entry to download them. A sketch of the standard BLEU computation is given after the table.


Language pair    BLEU%    TER
ar → en          22.13    59.38
en → fr          28.46    51.69
zh-cn → en       11.12    76.39
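For reference, the BLEU% figures above are corpus-level BLEU scores scaled by 100. The sketch below implements the standard unsmoothed, single-reference corpus BLEU; it is not the MultEval code, and the file names are hypothetical placeholders.

"""Minimal corpus-level BLEU, for reference only.

Not the MultEval implementation: a single reference, no smoothing and the
hypothetical tst2010.* file names are simplifying assumptions.
"""
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hyp_lines, ref_lines, max_n=4):
    matches = [0] * max_n    # clipped n-gram matches, one entry per order
    totals = [0] * max_n     # hypothesis n-grams, one entry per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hyp_lines, ref_lines):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            h_ngrams, r_ngrams = ngrams(h, n), ngrams(r, n)
            matches[n - 1] += sum(min(c, r_ngrams[g]) for g, c in h_ngrams.items())
            totals[n - 1] += max(len(h) - n + 1, 0)
    precisions = [m / t if t else 0.0 for m, t in zip(matches, totals)]
    if min(precisions) == 0.0:
        return 0.0
    # Brevity penalty, then geometric mean of the n-gram precisions.
    bp = 1.0 if hyp_len > ref_len else math.exp(1.0 - ref_len / hyp_len)
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

if __name__ == "__main__":
    with open("tst2010.hyp.en") as f:
        hyps = [line.strip() for line in f]
    with open("tst2010.ref.en") as f:
        refs = [line.strip() for line in f]
    print(f"BLEU% = {100.0 * corpus_bleu(hyps, refs):.2f}")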