[Apologies for multiple postings]

*THIRD CALL FOR PAPERS*

REPROLANG 2020
Shared Task on the Reproduction of Research Results in Science and Technology 
of Language (part of LREC 2020 conference)
Marseille, France
May 11-16, 2020
http://wordpress.let.vupr.nl/lrec-reproduction

REPROLANG 2020 is the Shared Task on the Reproduction of Research Results in Science and 
Technology of Language, organized by ELRA <http://www.elra.info/en/> with the technical 
support of CLARIN <https://www.clarin.eu/>, as part of the LREC 2020 
conference <https://lrec2020.lrec-conf.org/>.


BACKGROUND

Scientific knowledge is grounded in falsifiable predictions, and thus its 
credibility and raison d’être rely on the possibility of repeating 
experiments and obtaining results similar to those originally reported.  
In many young scientific areas, including ours, the reproduction of research 
results needs far greater acknowledgement and promotion.

For this reason, a special track on reproducibility is included in the regular LREC 2020 
conference program (alongside sessions on other topics) for papers on the 
reproduction of research results, and the present community-wide shared task is 
launched to elicit and motivate the spread of scientific work on reproduction. This 
initiative builds on the pioneering LREC workshops on reproducibility, 4REAL 
2016 <http://4real.di.fc.ul.pt/> and 4REAL 2018 <http://4real2018.di.fc.ul.pt/>.

SHARED TASK

The shared task is of a new type. It is partly similar to the usual competitive 
shared tasks, in that all participants share a common goal; but it differs from 
previous shared tasks in that its primary focus is on seeking support and 
confirmation of previous results, rather than on surpassing them with superior 
ones. Thus, instead of a competitive shared task, in which each participant 
strives for an individual top system that scores as far above a rough baseline 
as possible, this will be a cooperative shared task, in which participants 
strive for systems that reproduce an original, complex research experiment as 
closely as possible, thereby reinforcing the reliability of its results through 
convergent outcomes. At the same time, as with competitive shared tasks, the 
process of participating in this collaborative shared task offers excellent 
ground for igniting new ideas for improvement and new advances beyond the 
reproduced results.

We invite researchers to reproduce the results of a selected set of articles, which 
have been offered for this shared task with the consent of their respective 
authors: see the selected 
tasks <http://wordpress.let.vupr.nl/lrec-reproduction/shared-task-on-the-reproduction-of-research-results-in-science-and-technology-of-language/selected-tasks/>.
Papers submitted for this task are expected to report on reproduction findings, to 
document how the results of the original paper were reproduced, to discuss 
reproducibility challenges, to report on the time, space, or data requirements found 
for training and testing, to reflect on lessons learned, to elaborate on 
recommendations for best practices, etc.

Submissions that, in addition to the reproduction exercise, also report on the 
replication of the selected tasks with other languages, domains, data sets, 
models, methods, algorithms, downstream tasks, etc. are also encouraged. These 
should provide further insight into the robustness of the replicated approaches, 
their learning curves and potential for incremental performance, their capacity 
for generalization, their transferability across experimental circumstances and 
to real-life usage scenarios, their suitability to support further progress, etc.

PUBLICATION

LREC conferences have one of the top h5-index scores of research impact among 
the world class venues for research on Human Language Technology.

Papers accepted for the shared task will be published in the Proceedings of the LREC 2020 main 
conference. LREC Proceedings are freely available from 
ELRA <http://www.elra.info/en/lrec/proceedings/> and the ACL 
Anthology <https://aclanthology.info/>. They are indexed in Scopus (Elsevier) and in 
DBLP <https://dblp.uni-trier.de/db/conf/lrec/>. The LREC 2010, LREC 2012, and LREC 2014 
Proceedings are included in the Thomson Reuters Conference Proceedings Citation 
Index <http://thomsonreuters.com/conference-proceedings-citation-index/> (the other 
editions are being processed).

Substantially extended versions of papers selected by reviewers as the most 
appropriate will be considered for publication in special issues of the Language 
Resources and Evaluation 
Journal <https://link.springer.com/journal/10579>, published by Springer (an 
SCI-indexed journal).


SUBMISSION

Papers must be submitted through START 
at <https://www.softconf.com/lrec2020/REPROLANG2020/> and will be peer-reviewed.


IMPORTANT DATES

 * November 25, 2019: deadline for paper submission (aligned with LREC 2020)
 * November 27, 2019: deadline for projects on gitlab.com to go public
 * February 14, 2020: notification of acceptance
 * May 11-16, 2020: LREC conference takes place


PRESENTATION

Papers accepted for publication will be presented in a specific session of the 
LREC main conference. There is no difference in quality between oral and poster 
presentations. Only the appropriateness of the type of communication (more or 
less interactive) to the content of the paper will be considered. The format of 
the presentations will be decided by the Program Committee. The proceedings 
will include both oral and poster papers in the same format.


REGISTRATION

For a selected paper to be included in the programme and published in the 
proceedings, at least one of its authors must register for the LREC 2020 
conference by the early-bird registration deadline. A single registration 
covers only one paper, following the general LREC policy on registration. 
The registration service is available on the LREC 2020 website.


CONTACTS

About the shared task:
Piek Vossen<http://vossen.info>
piek.vos...@vu.nl  <mailto:p.t.j.m.vos...@vu.nl>

About the preparation and submission of materials:
reprolang...@clarin.eu  <mailto:reprolang...@clarin.eu>
