We are pleased to announce a new joint track between the CLEF NEWSREEL and CLEF Living Labs for Information Retrieval labs. This track invites ideas on how the living labs methodology might be developed further: discussion of open challenges, ideas relating to the existing living labs tasks, and proposals for new tasks with new use cases and designs. Submissions may be speculative, opening up new challenges in the space, or concrete proposals for future tasks the authors would like to run.

For this track, ideas should be presented in the form of a position statement (max. 2 pages).

Submission deadline: June 27th, 2016.

See New Ideas Track for full details.

June 11th, 2016

Posted In: News

The training phase for the CLEF LL4IR 2016 lab starts next week. The test phase will run from 5-19 May.

This year the CLEF lab runs with one use case, product search. See CLEF LL4IR’16 for full details.

April 22nd, 2016

Posted In: challenge, CLEF

ECIR 2016 in Padova will feature LiLa 2016, the first edition of our tutorial on Living Labs for Online Evaluation.

October 21st, 2015

Posted In: News

Our CLEF LL4IR lab will have a sister track running at TREC: the TREC OpenSearch track has been accepted. More info will follow soon; we'll be running TREC and CLEF in parallel.

October 21st, 2015

Posted In: News

Living Labs for IR Evaluation will be running again at CLEF.

September 11th, 2015

Posted In: Uncategorized

The results for LL4IR Round #4 have just been made available on this page.

The next evaluation round will begin on the 15th of September. See our Challenge page for details.

We hope to see many of you at CLEF 2015 next week. If you are looking for some reading in the meantime, here is our Extended Lab Overview paper.

September 3rd, 2015

Posted In: challenge

The results for LL4IR Round #3 have just been made available on this page.

Also, you can (still) join us at any moment! We’ll be running evaluation rounds the last two weeks of every month, please find details on our Challenge page.

July 31st, 2015

Posted In: challenge

We just received some good news from ESF ELIAS: we have been awarded funding to host our LL4IR API, as well as funding to hire someone for its development and maintenance. So keep your ideas and bug reports coming in through our issue tracker.

Stay tuned, things will be faster soon!

July 2nd, 2015

Posted In: challenge, LL14

The results for LL4IR Round #2 have just been made available on this page.

Also, you can (still) join us at any moment! We’ll be running evaluation rounds the last two weeks of every month, please find details on our Challenge page.

July 1st, 2015

Posted In: LLC

Our CLEF lab overview paper is now available online.

Abstract
In this paper we report on the first Living Labs for Information Retrieval Evaluation (LL4IR) CLEF Lab. Our main goal with the lab is to provide a benchmarking platform for researchers to evaluate their ranking systems in a live setting with real users in their natural task environments. For this first edition of the challenge we focused on two specific use-cases: product search and web search. Ranking systems submitted by participants were experimentally compared using interleaved comparisons to the production system from the corresponding use-case. In this paper we describe how these experiments were performed, what the resulting outcomes are, and conclude with some lessons learned.
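The abstract above mentions interleaved comparisons between a participant's ranker and the production system. One widely used variant of this methodology is team-draft interleaving: the two rankings take turns contributing their next unseen document, each shown result is tagged with the "team" that contributed it, and user clicks are credited to that team. The sketch below is purely illustrative, assuming toy document IDs, and is not the lab's actual implementation.

```python
import random

def team_draft_interleave(ranking_a, ranking_b, rng=random):
    """Merge two rankings into one list, recording which system
    ("A" or "B") contributed each shown document."""
    interleaved, teams = [], []
    seen = set()
    count = {"A": 0, "B": 0}
    rankings = {"A": ranking_a, "B": ranking_b}
    total = len(set(ranking_a) | set(ranking_b))
    while len(seen) < total:
        # The team with fewer picks so far goes next; ties broken randomly.
        if count["A"] < count["B"]:
            team = "A"
        elif count["B"] < count["A"]:
            team = "B"
        else:
            team = rng.choice(["A", "B"])
        # Take that team's highest-ranked document not yet shown.
        doc = next((d for d in rankings[team] if d not in seen), None)
        if doc is None:
            # This team is exhausted; let the other team fill in.
            team = "B" if team == "A" else "A"
            doc = next((d for d in rankings[team] if d not in seen), None)
            if doc is None:
                break
        interleaved.append(doc)
        teams.append(team)
        seen.add(doc)
        count[team] += 1
    return interleaved, teams

def credit(teams, clicked_positions):
    """Credit each click to the team that contributed that result;
    the team with more clicks wins the comparison."""
    wins = {"A": 0, "B": 0}
    for i in clicked_positions:
        wins[teams[i]] += 1
    return wins
```

Aggregated over many impressions, the fraction of comparisons won by the experimental system against the production system gives the kind of online outcome measure the lab reports on.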

Bibtex
@inproceedings{schuth_2015_overview,
  title     = {Overview of the Living Labs for Information Retrieval Evaluation (LL4IR) CLEF Lab 2015},
  author    = {Anne Schuth and Krisztian Balog and Liadh Kelly},
  year      = {2015},
  date      = {2015-09-08},
  booktitle = {CLEF 2015},
  publisher = {Springer},
}

June 29th, 2015

Posted In: Uncategorized
