CLEF Lab Overview Paper Available

Our CLEF lab overview paper is now available online.

In this paper we report on the first Living Labs for Information Retrieval Evaluation (LL4IR) CLEF Lab. Our main goal with the lab is to provide a benchmarking platform for researchers to evaluate their ranking systems in a live setting with real users in their natural task environments. For this first edition of the challenge we focused on two specific use-cases: product search and web search. Ranking systems submitted by participants were experimentally compared using interleaved comparisons to the production system from the corresponding use-case. In this paper we describe how these experiments were performed, what the resulting outcomes are, and conclude with some lessons learned.
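The interleaved comparisons mentioned above can be illustrated with a minimal sketch of team-draft interleaving, one commonly used interleaving method. This is an illustrative implementation only — the function names, the tie-breaking, and the click-credit scheme below are assumptions, and the lab's production implementation may differ in its details:

```python
import random

def team_draft_interleave(run_a, run_b, length=None, rng=random):
    """Team-draft interleaving: the team with fewer picks goes next
    (ties broken by coin flip), adding its highest-ranked document
    that is not yet in the interleaved list."""
    length = length or max(len(run_a), len(run_b))
    interleaved, teams, seen = [], [], set()
    picks = {"A": 0, "B": 0}
    while len(interleaved) < length:
        if picks["A"] < picks["B"]:
            team = "A"
        elif picks["B"] < picks["A"]:
            team = "B"
        else:
            team = rng.choice(["A", "B"])
        source = run_a if team == "A" else run_b
        doc = next((d for d in source if d not in seen), None)
        if doc is None:
            # This team has no unseen documents left; let the other fill in.
            team = "B" if team == "A" else "A"
            other = run_a if team == "A" else run_b
            doc = next((d for d in other if d not in seen), None)
            if doc is None:
                break  # both rankings exhausted
        interleaved.append(doc)
        teams.append(team)
        seen.add(doc)
        picks[team] += 1
    return interleaved, teams

def credit(teams, clicked_positions):
    """Assign each click to the team that contributed the clicked
    document; the team with more clicks wins the comparison."""
    wins = {"A": 0, "B": 0}
    for pos in clicked_positions:
        wins[teams[pos]] += 1
    return wins
```

In a living-labs setting, `run_a` would be the participant's ranking and `run_b` the production system's; the interleaved list is shown to a real user, and clicks decide the per-impression outcome.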

@inproceedings{schuth2015overview,
  title     = {Overview of the Living Labs for Information Retrieval Evaluation (LL4IR) CLEF Lab 2015},
  author    = {Anne Schuth and Krisztian Balog and Liadh Kelly},
  year      = {2015},
  date      = {2015-09-08},
  booktitle = {CLEF 2015},
  publisher = {Springer},
}

Updates to API and Website

We’ve just made some improvements to our API and website:

  • The API is now deployed using Tornado, which should give a performance boost
  • Errors and warnings are now tracked using Rollbar, which will allow us to act faster when something goes wrong
  • If there is an error, you will receive a traceback with a link to the code on Bitbucket, making it easier for you to understand what went wrong
  • Our website, documentation, and dashboard now each have their own color scheme, making it easier to navigate between them

Please let us know what you think!

Test period begins for CLEF 2015 Lab

Today, we officially enter the test phase of the LL4IR CLEF 2015 Lab.

There are some changes, detailed on this page; below is an executive summary:

  • Only 2 of the 3 use-cases will be evaluated: product search and web search (local-domain search will not run this year). These two use-cases are fully operational; see this page for updated technical details on the product search use-case.
  • The test period will run from May 1 to 15. Rankings for test queries may be uploaded (and changed) until Apr 30, 23:59 CET.
  • Training queries will continue to run during the test phase and feedback will be made available for them; however, it will not be possible to upload new rankings (either for training or for test queries) during the test period.
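For participants preparing their submissions, uploading a ranking amounts to a single JSON request per query. The sketch below is illustrative only: the base URL, endpoint path, participant key, and field names (`runid`, `doclist`, `docid`) are assumptions made for this example, so please check them against the official API documentation before use:

```python
import json
import urllib.request

# All names below are illustrative assumptions, not the documented API:
API_BASE = "http://example.living-labs.net/api"  # hypothetical base URL
API_KEY = "my-participant-key"                   # hypothetical key

def build_run_payload(docids, runid="baseline"):
    """Assemble a ranking as a JSON-serializable payload."""
    return {"runid": runid,
            "doclist": [{"docid": d} for d in docids]}

def upload_run(qid, docids, runid="baseline"):
    """PUT a ranking for one query; the endpoint path is an assumption."""
    body = json.dumps(build_run_payload(docids, runid)).encode("utf-8")
    req = urllib.request.Request(
        f"{API_BASE}/participant/run/{API_KEY}/{qid}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="PUT")
    return urllib.request.urlopen(req)  # raises on non-2xx responses
```

Remember that uploads for test queries are only accepted until the deadline above; after that, the submitted rankings are frozen for the duration of the test period.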

If you are planning on submitting a run, please let us know by sending us a short message.