Please use this identifier to cite or link to this item: http://dx.doi.org/10.14279/depositonce-7170
Main Title: Report on the SIGIR 2013 workshop on benchmarking adaptive retrieval and recommender systems
Author(s): Castells, Pablo
Hopfgartner, Frank
Said, Alan
Lalmas, Mounia
Type: Article
Language Code: en
Abstract: In recent years, immense progress has been made in the development of recommendation, retrieval, and personalisation techniques. The evaluation of these systems is still based on traditional information retrieval and statistical metrics, e.g., precision, recall, and RMSE, often without taking the use case and situation of the actual system into consideration. However, the rapid evolution of recommender and adaptive IR systems, in both their goals and their application domains, fosters the need for new evaluation methodologies and environments. In the Workshop on Benchmarking Adaptive Retrieval and Recommender Systems, we aimed to provide a platform for discussion of novel evaluation and benchmarking approaches.
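For reference, the traditional metrics named in the abstract (precision, recall, RMSE) can be computed as in the minimal Python sketch below; the item sets and rating values are hypothetical toy data used only for illustration and do not come from the workshop report.

import math

# Hypothetical toy data: items recommended to a user vs. items the user
# actually found relevant, plus predicted vs. true ratings.
recommended = {"a", "b", "c", "d"}
relevant = {"b", "d", "e"}

# Precision: fraction of recommended items that are relevant.
precision = len(recommended & relevant) / len(recommended)

# Recall: fraction of relevant items that were recommended.
recall = len(recommended & relevant) / len(relevant)

# RMSE: root mean squared error between predicted and true ratings.
predicted = [3.5, 4.0, 2.0]
actual = [4.0, 3.5, 2.5]
rmse = math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

print(f"precision={precision:.2f} recall={recall:.2f} rmse={rmse:.2f}")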
URI: https://depositonce.tu-berlin.de//handle/11303/8007
http://dx.doi.org/10.14279/depositonce-7170
Issue Date: 2013
Date Available: 10-Jul-2018
DDC Class: 004 Data processing; Computer science
Subject(s): recommender systems
information retrieval
adaptive retrieval
benchmarking
CLEF
License: http://rightsstatements.org/vocab/InC/1.0/
Journal Title: ACM SIGIR Forum
Publisher: Association for Computing Machinery (ACM)
Publisher Place: New York, NY
Volume: 47
Issue: 2
Publisher DOI: 10.1145/2568388.2568398
Page Start: 64
Page End: 67
EISSN: 0163-5840
Appears in Collections:FG Agententechnologien in betrieblichen Anwendungen und der Telekommunikation (AOT) » Publications

Files in This Item:
File: castells_etal_2014.pdf (975.73 kB, Adobe PDF)

