Infoscience
Report

Translation Error Spotting from a User's Point of View

Meyer, Thomas  
2012

Evaluating the errors made by Machine Translation (MT) systems still requires human effort, despite the availability of automated MT evaluation tools such as the BLEU metric. Moreover, even if tools supported humans in this translation quality-checking task, for example by automatically marking some errors found in the MT system output, there is no guarantee that this actually leads to a more correct or faster human evaluation. This paper presents a user study that found statistically significant interaction effects for the task of finding MT errors under two conditions, non-annotated and automatically pre-annotated errors, in terms of the time needed to complete the task and the number of correctly found errors.
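For context on the automated metric the abstract contrasts with human evaluation: BLEU combines modified n-gram precisions with a brevity penalty. The following is a minimal illustrative sketch of sentence-level BLEU in pure Python (with simple add-one smoothing so short sentences do not score zero); it is not the tool used in the study, and the example sentences are made up.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU: geometric mean of modified n-gram
    precisions (orders 1..max_n, add-one smoothed) times a
    brevity penalty for candidates shorter than the reference."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # clipped overlap: each candidate n-gram counts at most
        # as often as it appears in the reference
        overlap = sum((cand_counts & ref_counts).values())
        total = max(sum(cand_counts.values()), 1)
        precisions.append((overlap + 1) / (total + 1))  # add-one smoothing
    # brevity penalty: no penalty if candidate is at least as long
    bp = 1.0 if len(candidate) >= len(reference) else \
        math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

# Hypothetical example pair (not from the paper)
hyp = "the cat sat on the mat".split()
ref = "the cat is on the mat".split()
score = bleu(hyp, ref)
```

A candidate identical to its reference scores 1.0 under this sketch, and any mismatch lowers the smoothed precisions and hence the score; production metrics such as corpus-level BLEU aggregate counts over many sentence pairs instead.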

Files

  • Name: Meyer_Idiap-RR-31-2012.pdf
  • Access type: openaccess
  • Size: 821.15 KB
  • Format: Adobe PDF
  • Checksum (MD5): 6d98fb6fe79fed102ec84e342d61ff40


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.