Evaluation Methodologies in Information Retrieval (Dagstuhl Seminar 13441)

Authors: Maristella Agosti, Norbert Fuhr, Elaine Toms, Pertti Vakkari, and all authors of the abstracts in this report




File

DagRep.3.10.92.pdf
  • Filesize: 0.8 MB
  • 35 pages



Cite As

Maristella Agosti, Norbert Fuhr, Elaine Toms, and Pertti Vakkari. Evaluation Methodologies in Information Retrieval (Dagstuhl Seminar 13441). In Dagstuhl Reports, Volume 3, Issue 10, pp. 92-126, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2014)
https://doi.org/10.4230/DagRep.3.10.92

Abstract

This report documents the program and the outcome of Dagstuhl Seminar 13441 "Evaluation Methodologies in Information Retrieval", which brought together 42 participants from 11 countries. The seminar was motivated by the fact that today's information retrieval (IR) applications can hardly be evaluated with the classic test collection paradigm, so new evaluation approaches are needed. The event started with five introductory talks on evaluation frameworks, user modeling for evaluation, evaluation criteria and measures, evaluation methodology, and new trends in IR evaluation. The seminar participants then formed working groups addressing specific aspects of IR evaluation, such as reliability and validity, task-based IR, learning as a search outcome, searching for fun, IR and social media, graph search, domain-specific IR, interaction measures and models, and searcher-aware information access systems.
Keywords
  • evaluation
  • testbeds
  • user studies
  • measures
