Evaluating success in search systems

Edie Rasmussen, Elaine Toms, Bernard James Jansen, Gheorghe Muresan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The focus of this panel is on methodologies and measures for evaluating information retrieval (IR) systems from a human-centred perspective. Current research, especially with regard to search engines, is challenged by “Internet time” – the need for near-instantaneous results that are also reliable and valid. The session will begin with an assessment of the current status of IR evaluation, followed by presentations on emerging methods used in recent evaluations, as well as on the types of data collected and the measures used for analysis. A discussion among the panelists and the audience will critique current methods and suggest how those methods may be enhanced. The outcome of this panel will be a fresh critical examination of IR evaluation methods.
Original language: English
Title of host publication: Proceedings of the American Society for Information Science and Technology
Publication status: Published - 2005
Externally published: Yes
