Evaluating Focused Retrieval Tasks - INRIA - Institut National de Recherche en Informatique et en Automatique
Conference paper - Year: 2007

Evaluating Focused Retrieval Tasks

Abstract

Focused retrieval, exemplified by question answering, passage retrieval, and XML element retrieval, is becoming increasingly important within the broad task of information retrieval. In this paper, we present a taxonomy of text retrieval tasks based on the structure of the answers required by a task. Of particular importance are the in-context tasks of focused retrieval, where not only must relevant documents be retrieved but the relevant information within each document must also be correctly identified. Answers containing relevant information could be, for example, best entry points, or non-overlapping passages or elements. Our main research question is: how should the effectiveness of focused retrieval be evaluated? We propose an evaluation framework in which different aspects of in-context focused retrieval tasks can be consistently evaluated and compared, and we use fidelity tests on simulated runs to show what is measured. Results from our fidelity experiments demonstrate the usefulness of the proposed evaluation framework, and show its ability to measure different aspects and model different evaluation assumptions of focused retrieval.
Main file: jovanp-Focused.pdf (165.36 KB)
Origin: Files produced by the author(s)

Dates and versions

inria-00166790, version 1 (09-08-2007)

Identifiers

  • HAL Id: inria-00166790, version 1

Cite

Jovan Pehcevski, James A. Thom. Evaluating Focused Retrieval Tasks. SIGIR 2007 Workshop on Focused Retrieval, Jul 2007, Amsterdam, Netherlands. ⟨inria-00166790⟩
195 Views
121 Downloads
