[Apologies for cross-postings]

Shared Task on Customer Feedback Analysis (IJCNLP 2017)
https://sites.google.com/view/customer-feedback-analysis/

BRIEF: Training Data Available Upon Registration

For registration, please send an email to [email protected] with the
following information:
1. Team:
Institution(s)
Name(s)
2. Contact person:
Title
Last Name
First Name
Email address

Understanding and being able to react to customer feedback is the most
fundamental task in providing good customer service.  The shared task will
provide a corpus annotated with a five-class categorization (comment,
request, bug, complaint and meaningless) from a pilot study of the
ADAPT-Microsoft joint research project.  Participants will train
classifiers to detect these meanings in customer feedback in English,
Spanish and Japanese.
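As a minimal sketch of the task setup, the snippet below classifies a
feedback sentence into the five labels named above using hand-picked cue
words.  The cue words and example sentences are purely hypothetical (they
are not from the shared task corpus); real submissions would train
classifiers on the released data.

```python
# The five labels from the shared task annotation scheme.
LABELS = ["comment", "request", "bug", "complaint", "meaningless"]

# Hypothetical cue words per class -- illustrative only, not from the corpus.
CUES = {
    "request": {"please", "could", "would", "add"},
    "bug": {"crash", "crashes", "error", "broken", "fails"},
    "complaint": {"terrible", "worst", "slow", "disappointed"},
}

def classify(text):
    """Keyword baseline: pick the label whose cue words match the most
    tokens, falling back to 'comment' when nothing matches."""
    tokens = set(text.lower().split())
    scores = {label: len(tokens & cues) for label, cues in CUES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "comment"
```

For example, `classify("the app crashes with an error on startup")` falls
into the "bug" class under these cues.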

Chao-Hong Liu <[email protected]>
Declan Groves <[email protected]>
Alberto Poncelas <[email protected]>
Akira Hayakawa <[email protected]>
Yasufumi Moriya <[email protected]>
CONTACT: [email protected]


On Mon, May 29, 2017 at 12:46 PM, Chao-Hong Liu <[email protected]> wrote:

> [Apologies for cross-postings]
>
> ==================================================================
> IJCNLP 2017 SHARED TASKS: JOINT CALL FOR PARTICIPATION
> December 1, 2017, Taipei, Taiwan
>
> Registration NOW Open!
> http://ijcnlp2017.org/site/page.aspx?pid=158&sid=1133&lang=en
>
> The 8th International Joint Conference on Natural Language Processing
> (IJCNLP 2017)
> http://ijcnlp2017.org/
> ==================================================================
>
> IJCNLP 2017 Shared Tasks will be held in Taipei, Taiwan, at the Taipei
> Nangang Exhibition Hall on December 1st, 2017.  The topics of the shared
> tasks cover
> 1) grammatical error diagnosis,
> 2) sentiment analysis,
> 3) product review summarization and customer feedback analysis and
> 4) question-answering.
>
> Please visit each task website as listed below for more information.
>
> 1. Shared Task on Chinese Grammatical Error Diagnosis (CGED)
>
> Participants will build systems to automatically detect errors (their
> existence, type and position) in Chinese sentences written by learners of
> Chinese as a second language.  Four error types are annotated in the
> training corpus: redundant word, missing word, word selection and word
> ordering.  The shared task was previously held in 2014, 2015 and 2016.
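To make one of the four error types concrete, the toy function below flags
the "redundant word" case by marking positions where a token immediately
repeats.  This is only an illustration; real CGED systems use sequence
labeling over all four error types, and the example sentence is made up.

```python
def detect_redundant(tokens):
    """Toy detector for the 'redundant word' error type: return the
    positions where a token is an immediate repeat of the previous one."""
    return [i for i in range(1, len(tokens)) if tokens[i] == tokens[i - 1]]
```

For instance, on the character list of a sentence that doubles its first
word, the duplicated position is reported.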
>
> http://www.cged.science/
>
> Gaoqi Rao <[email protected]>
> Baolin Zhang <[email protected]>
> Endong Xun <[email protected]>
> CONTACT: [email protected]
>
> 2. Dimensional Sentiment Analysis for Chinese Phrases
>
> Given a word or phrase, participants are asked to provide a real-valued
> score from 1 to 9 for both valence and arousal dimensions, indicating the
> degree from most negative to most positive for valence, and from most calm
> to most excited for arousal.
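A minimal sketch of producing such real-valued (valence, arousal) scores on
the 1-to-9 scale: average the scores of known words in a seed lexicon.  The
lexicon entries and their numbers below are invented for illustration; the
task itself supplies annotated words.

```python
# Hypothetical seed lexicon with (valence, arousal) on the 1-9 scale.
# These numbers are made up, not from the shared task data.
LEXICON = {
    "happy": (7.5, 6.0),
    "calm": (6.5, 2.0),
    "angry": (2.0, 7.5),
}

def score_phrase(phrase):
    """Average the (valence, arousal) of known words; return the neutral
    midpoint (5.0, 5.0) when no word is in the lexicon."""
    pairs = [LEXICON[w] for w in phrase.lower().split() if w in LEXICON]
    if not pairs:
        return (5.0, 5.0)
    valence = sum(p[0] for p in pairs) / len(pairs)
    arousal = sum(p[1] for p in pairs) / len(pairs)
    return (round(valence, 2), round(arousal, 2))
```

Averaging a seed lexicon is a common baseline for dimensional sentiment;
stronger systems extend it with embeddings or regression models.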
>
> http://nlp.innobic.yzu.edu.tw/tasks/dsa_p/
>
> Liang-Chih Yu <[email protected]>
> Lung-Hao Lee <[email protected]>
> Jin Wang <[email protected]>
> Kam-Fai Wong <[email protected]>
> CONTACT: [email protected]
>
> 3. Review Opinion Diversification
>
> Participants will build systems that select and rank the top-k reviews as
> a summary of the opinions expressed in product reviews, in three different
> ways.  The shared task will use a subset of the Amazon SNAP product review
> dataset for experiments.  It contains reviews of products in different
> categories, e.g., 22,000,000+ reviews of books and 7,000,000+ reviews of
> electronics products.
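One simple way to think about selecting a diverse top-k is greedy
redundancy avoidance: repeatedly pick the review sharing the fewest words
with those already chosen.  The sketch below is a toy stand-in for opinion
diversification; the seeding heuristic (longest review first) and the
word-overlap measure are assumptions, not the task's prescribed method.

```python
def top_k_diverse(reviews, k):
    """Greedily pick k reviews, each time choosing the candidate with the
    least word overlap with the reviews already selected."""
    chosen = []
    pool = sorted(reviews, key=len, reverse=True)  # hypothetical seeding
    while pool and len(chosen) < k:
        def overlap(review):
            words = set(review.split())
            return sum(len(words & set(c.split())) for c in chosen)
        best = min(pool, key=overlap)
        chosen.append(best)
        pool.remove(best)
    return chosen
```

With three reviews where two discuss the battery and one the screen,
asking for k=2 picks one battery review and the screen review.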
>
> https://sites.google.com/itbhu.ac.in/revopid-2017
>
> Anil Kumar Singh <[email protected]>
> Julian McAuley <[email protected]>
> Avijit Thawani <[email protected]>
> Mayank Panchal <[email protected]>
> Anubhav Gupta <[email protected]>
> Rajesh Kumar Mundotiya <[email protected]>
> CONTACT: [email protected]
>
> 4. Shared Task on Customer Feedback Analysis
>
> Understanding and being able to react to customer feedback is the most
> fundamental task in providing good customer service.  The shared task will
> provide a corpus annotated with a five-class categorization (comment,
> request, bug, complaint and meaningless) from a pilot study of the
> ADAPT-Microsoft joint research project.  Participants will train
> classifiers to detect these meanings in customer feedback in English,
> Spanish and Japanese.
>
> https://sites.google.com/view/customer-feedback-analysis/
>
> Chao-Hong Liu <[email protected]>
> Declan Groves <[email protected]>
> Alberto Poncelas <[email protected]>
> Akira Hayakawa <[email protected]>
> Yasufumi Moriya <[email protected]>
> CONTACT: [email protected]
>
> 5. Multi-choice Question Answering in Examinations
>
> Participants will build systems to choose the correct option for each
> multiple-choice question.  The dataset comprises two parts, English and
> Chinese, with a total of 14,447 questions (along with their answers).
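A naive baseline for this setup is to pick the option with the highest word
overlap with the question.  The function and example question below are
illustrative assumptions, not the task's intended approach or data.

```python
def answer(question, options):
    """Word-overlap baseline: return the option sharing the most
    (lowercased, whitespace-split) words with the question."""
    q_words = set(question.lower().split())
    return max(options, key=lambda opt: len(q_words & set(opt.lower().split())))
```

Such overlap baselines are weak on exam questions, which is precisely what
makes the task challenging for participants' systems.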
>
> http://www.nlpr.ia.ac.cn/cip/ijcnlp/Multi-choice_Question_Answering_in_Exams.html
>
> Jun Zhao <[email protected]>
> Kang Liu <[email protected]>
> Shizhu He <[email protected]>
> Zhuoyu Wei <[email protected]>
> Shangmin Guo <[email protected]>
> CONTACT: [email protected]
>
> IMPORTANT DATES
> Shared Task Website Ready: 1-May-17
> Release of Training Data: 15-May-17
> Dryrun: Release of Development Set: 12-Jul-17
> Release of Test Set: 14-Aug-17
> Submission of Systems: 21-Aug-17
> System Description Paper Due: 15-Sep-17
> Camera-Ready Deadline: 10-Oct-17
>
> SHARED TASK CO-CHAIRS
> Chao-Hong Liu, ADAPT Centre, Dublin City University, [email protected]
> Preslav Nakov, Qatar Computing Research Institute, [email protected]
> Nianwen Xue, Brandeis University, [email protected]
>
> *Chao-Hong Liu* | Postdoctoral Researcher
> ADAPT Centre, School of Computing
> Dublin City University, Dublin 9, Ireland
> m: +353 (0) 89 247 3035
> e: [email protected]
> www.adaptcentre.ie
>
_______________________________________________
Mt-list site list
[email protected]
http://lists.eamt.org/mailman/listinfo/mt-list
