[Asis-l] CFP SIGIR 2009 Workshop on the Future of IR Evaluation - New Deadline

Nicholas Belkin belkin at rutgers.edu
Thu May 21 10:05:11 EDT 2009


SIGIR 2009 Workshop on the Future of IR Evaluation
July 23, Boston
http://staff.science.uva.nl/~kamps/ireval/

Submissions due: June 15

Call for Papers

Evaluation is at the core of information retrieval: virtually all
progress owes directly or indirectly to test collections built within
the so-called Cranfield paradigm.  In recent years, however, IR
researchers have routinely pursued tasks outside this traditional
paradigm, taking a broader view of tasks, users, and context.
Content is evolving rapidly from traditional static text to diverse
forms of dynamic, collaborative, and multilingual information
sources.  Industry, too, is embracing "operational" evaluation based
on the analysis of endless streams of queries and clicks.

We invite submissions that think outside the box:

- Are you working on an interesting new retrieval task or aspect?  Or
    on its broader task or user context?  Or on a complete system with
    a novel interface?  Or on interactive/adaptive search?  Or ...?
    Please explain why this is of interest, and what would be an
    appropriate way of evaluating.

- Do you feel that the current evaluation tools fail to do justice to
    your research?  Is there a crucial aspect missing?  Or are you
    interested in specific, rare, phenomena that have little impact on
    the average scores?  Or ...?  Please explain why this is of
    interest, and what would be an appropriate way of evaluating.

- Do you have concrete ideas on how to evaluate such a novel IR task?  Or
    ideas for new types of experimental or operational evaluation?  Or
    new measures or ways of re-using existing data?  Or ...?  Please
    explain why this is of interest, and what would be an appropriate
    way of evaluating.

The workshop brings together all stakeholders, ranging from those with
novel evaluation needs, such as a PhD candidate pursuing a new
IR-related problem, to senior IR evaluation experts.  Desired outcomes
are insights into how to make IR evaluation more "realistic," and at
least one concrete idea for a retrieval track or task (at CLEF, INEX,
NTCIR, TREC) that would not have happened otherwise.

Help us shape the future of IR evaluation!

- Submit a short 2-page poster or position paper explaining your key
    wishes or key points,

- and take an active part in the discussion at the workshop.

The *revised* deadline is Monday, June 15, 2009.  Further submission
details are at http://staff.science.uva.nl/~kamps/ireval/


Shlomo Geva, INEX & QUT, Australia
Jaap Kamps, INEX & University of Amsterdam, The Netherlands
Carol Peters, CLEF & ISTI-CNR, Italy
Tetsuya Sakai, NTCIR & Microsoft Research Asia, China
Andrew Trotman, INEX & University of Otago, New Zealand
Ellen Voorhees, TREC/TAC & NIST, USA



-- 
Nicholas J. Belkin
Professor II
Department of Library and Information Science
School of Communication, Information & Library Studies
Rutgers University
4 Huntington Street
New Brunswick, NJ 08901-1071, USA
email: belkin at rutgers.edu
tel: +1 732 932 7500 x8271
url: http://scils.rutgers.edu/~belkin/belkin.html



