[apologies for cross-posting]

CALL FOR PARTICIPATION

in the

Second Automatic Post-Editing (APE) shared task

at the First Conference on Machine Translation (WMT16)

--------------------------------------------------------------------

OVERVIEW

The second round of the APE shared task
(http://www.statmt.org/wmt16/ape-task.html) follows the pilot round
organised in 2015. The aim is to examine automatic methods for correcting
errors produced by an unknown machine translation (MT) system, by
exploiting knowledge acquired from human post-edits, which are provided as
training material.

This year the task focuses on the Information Technology (IT) domain, in
which English source sentences have been translated into German by an
unknown MT system and then manually post-edited by professional
translators. At the training stage, the collected human post-edits have to
be used to learn correction rules for the APE systems. At the test stage,
they will be used for system evaluation with automatic metrics (TER and
BLEU).

--------------------------------------------------------------------

GOALS

The aim of this task is to improve MT output in black-box scenarios, in
which the MT system is used "as is" and cannot be modified. From the
application point of view, APE components would make it possible to:

   - Cope with systematic errors of an MT system whose decoding process is
   not accessible

   - Provide professional translators with improved MT output quality to
   reduce (human) post-editing effort

   - Adapt the output of a general-purpose system to the lexicon/style
   requested in a specific application domain


--------------------------------------------------------------------

DATA & EVALUATION

Training, development and test data consist of English-German triplets
(source, target and post-edit) belonging to the IT domain. The training and
development sets contain 12,000 and 1,000 triplets respectively (available
soon), while the test set contains 2,000 instances. All data is provided by
the EU project QT21 (http://www.qt21.eu/).
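
As an illustration, here is a minimal sketch of how the triplets could be
loaded, assuming the release follows the usual WMT APE convention of three
line-aligned plain-text files (the train.src/train.mt/train.pe file names
are an assumption for illustration, not the confirmed release format):

    # load_triplets.py: read line-aligned (source, MT output, post-edit) files
    def load_triplets(src_path, mt_path, pe_path):
        with open(src_path, encoding="utf-8") as src, \
             open(mt_path, encoding="utf-8") as mt, \
             open(pe_path, encoding="utf-8") as pe:
            # one triplet per line across the three parallel files
            return [(s.strip(), m.strip(), p.strip())
                    for s, m, p in zip(src, mt, pe)]

    triplets = load_triplets("train.src", "train.mt", "train.pe")
    print(len(triplets))  # 12,000 for the training set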

Systems' performance will be evaluated with respect to their capability to
reduce the distance between an automatic translation and its human-revised
version. This distance will be measured in terms of TER, computed between
automatic and human post-edits in case-sensitive mode. BLEU will also be
taken into consideration as a secondary evaluation metric.
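
For development purposes, both metrics can be computed with off-the-shelf
tools; the following is a minimal sketch using the sacrebleu Python package
(an assumption for illustration; the official scoring scripts for the task
may differ):

    # score.py: corpus-level TER and BLEU between APE outputs and post-edits
    from sacrebleu.metrics import BLEU, TER

    hyps = ["das ist ein Test"]    # automatic post-edits (system outputs)
    refs = [["das ist der Test"]]  # human post-edits (one reference stream)

    # case_sensitive=True mirrors the task's case-sensitive TER setting
    ter = TER(case_sensitive=True)
    bleu = BLEU()

    print(ter.corpus_score(hyps, refs))   # primary metric: lower is better
    print(bleu.corpus_score(hyps, refs))  # secondary metric: higher is better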

To gain further insight into final output quality, a subset of the outputs
of the submitted systems will also be manually evaluated.

--------------------------------------------------------------------

DIFFERENCES FROM THE FIRST PILOT ROUND

Compared to the pilot round, the main differences are:

   - the domain specificity (from news to IT);

   - the target language (from Spanish to German);

   - the post-editors (from crowdsourced workers to professional
   translators);

   - the evaluation metrics (from case-sensitive/insensitive TER to
   case-sensitive TER and BLEU);

   - the performance analysis (from automatic metrics to automatic metrics
   plus manual evaluation).

--------------------------------------------------------------------

IMPORTANT DATES

Release of training data: February 22, 2016

Test set distributed: April 18, 2016

Submission deadline: April 24, 2016

Paper submission deadline: May 8, 2016

Manual evaluation: May 2016

Notification of acceptance: June 5, 2016

Camera-ready deadline: June 22, 2016

For any information or questions about the task, please send an email to:
wmt-ape at fbk.eu. To stay updated about the APE task, you can also join
the wmt-ape group:
http://groups.google.com/a/fbk.eu/forum/#!forum/wmt-ape

--------------------------------------------------------------------

ORGANIZERS

Rajen Chatterjee (Fondazione Bruno Kessler)

Matteo Negri (Fondazione Bruno Kessler)

Raphael Rubino (Saarland University)

Marco Turchi (Fondazione Bruno Kessler)

Marcos Zampieri (Saarland University)

-- 
-Regards,
 Rajen Chatterjee.