*Timelines:*

- May 1st: 2023 queries released to participants for all tasks
- August 1st: Submissions close for all tasks
- November 12-14th: TREC conference

*Website and task details:* https://trec-product-search.github.io/

*TREC task registration:*
https://ir.nist.gov/trecsubmit.open/application.html

*Introduction:*

The Product Search Track studies information retrieval in the field of
product search: the setting in which a user searches a corpus of many
products with the goal of finding the product that suits their needs.

The main goal of the TREC Product Search Track is to study how end-to-end
retrieval systems can be built and evaluated given a large set of products.

*Track Tasks:*

The product search track has three tasks: *ranking*, *end-to-end
retrieval*, and *multi-modal end-to-end retrieval*. You can submit up to
three runs for each of these tasks.

Each task uses the same training data, originating from the ESCI Challenge
for Improving Product Search, and shares the same set of evaluation queries.

The three tasks are described in more detail below.

*Product Ranking Task*

The first task focuses on product ranking. In this task we provide an
initial ranking of 1000 documents from a BM25 baseline, and you are
expected to re-rank the products by their relevance to the user's given
intent.

Ranking is a focused task: the candidate sets are fixed and there is no
need to implement complex end-to-end systems, which makes experimentation
quick and runs easy to compare.

*Product Retrieval Task*

The second task focuses on end-to-end product retrieval. In this task we
provide a large collection of products, and participants need to design
end-to-end retrieval systems that leverage whatever information they find
relevant or useful.

Unlike the ranking task, the focus here is on understanding the interplay
between retrieval and re-ranking systems.
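A common shape for such a system is a two-stage pipeline: a cheap
first-stage retriever selects candidates from the full collection, and a
second stage re-orders only those candidates. The sketch below is a toy
illustration under that assumption; both scoring functions are
placeholders, not anything prescribed by the track:

```python
def first_stage(query, corpus, k=3):
    """Cheap candidate generation: score every product title by
    query-term overlap and keep the top k docids."""
    q_terms = set(query.lower().split())
    scored = sorted(corpus.items(),
                    key=lambda kv: len(q_terms & set(kv[1].lower().split())),
                    reverse=True)
    return [docid for docid, _ in scored[:k]]

def second_stage(query, candidates, corpus):
    """Placeholder re-ranker over the small candidate set; a real
    system would use a stronger (e.g. learned) relevance model here."""
    return sorted(candidates, key=lambda d: len(corpus[d]))
```

The interplay the task studies lives in the trade-off between the two
stages: how many candidates the first stage passes on, and how much the
second stage can recover or lose.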

*Multi-Modal Product Retrieval Task*

The third task focuses on end-to-end product retrieval using multiple
modalities. In this task we provide a large collection of products, where
each product features additional attributes and information such as
related clicks and images, and participants need to design end-to-end
retrieval systems that leverage whatever information they find relevant
or useful.

The focus of this task is to understand the interplay between different
modalities and the value that additional, potentially weak, data provides.
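One simple way to combine modalities is late fusion: score each product
separately per modality, normalize each modality's scores, and take a
weighted sum. The modality names and weights below are illustrative
assumptions, not part of the track definition:

```python
def minmax(scores):
    """Min-max normalize a {docid: score} map into [0, 1]."""
    lo, hi = min(scores.values()), max(scores.values())
    span = (hi - lo) or 1.0  # avoid division by zero when all scores tie
    return {d: (s - lo) / span for d, s in scores.items()}

def late_fusion(per_modality, weights):
    """per_modality: {modality: {docid: score}}; weights: {modality: float}.
    Returns docids sorted by the weighted sum of normalized scores."""
    fused = {}
    for mod, scores in per_modality.items():
        for d, s in minmax(scores).items():
            fused[d] = fused.get(d, 0.0) + weights.get(mod, 0.0) * s
    return sorted(fused, key=fused.get, reverse=True)
```

Tuning the weights is one way to probe how much a weak modality (say,
click signals) actually contributes over text alone.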

*Coordinators*
- Daniel Campos (University of Illinois)
- Surya Kallumadi (Lowe's)
- Corby Rosset (Microsoft)
- ChengXiang Zhai (University of Illinois)
- Alessandro Magnani (Walmart)

For any questions, comments, or suggestions please email
[email protected]
