[Asis-l] CFP: SIGIR 2011 Workshop on Crowdsourcing for Information Retrieval (DEADLINE EXTENSION)

Matt Lease ml at ischool.utexas.edu
Tue May 31 19:05:15 EDT 2011


Crowdsourcing for Information Retrieval: A SIGIR 2011 Workshop
https://sites.google.com/site/cir2011ws/

**Deadline extended to Thursday, June 9th.**

CALL FOR PAPERS

The advent of crowdsourcing is driving a disruptive shift in IR areas
such as evaluation, learning to rank, and the development of new hybrid
human+machine systems which blend automation and crowd computation
(potentially in real time) to deliver innovative functionality and
search experiences. Traditionally labor-intensive tasks such as judging
relevance and annotating training data can now be accomplished more
quickly, more accurately, and at a fraction of traditional costs.
Despite this potential and the early successes of crowdsourcing in IR,
significant challenges remain with regard to theory, methodology,
policy, and best practices that currently limit our ability to realize
this potential in practice.

We invite submissions of papers describing novel, unpublished research
addressing one or more of the following areas:

* General: theoretical, experimental, and/or methodological developments
     advancing state-of-the-art knowledge of crowdsourcing for IR

* Applications: search systems blending automation with the crowd,
     especially real-time systems which must model dynamic and temporal
     properties of crowd behavior

* New functionality: use of crowdsourcing to realize innovative
     search features (e.g. using geographic dispersion of the crowd for
     local search or to detect geo-specific intents)

* Machine learning: consensus labeling for vote aggregation, active
     learning strategies for efficient labeling, and learning to rank
     with noisy crowd labels and multi-labeling (a majority-vote baseline
     is sketched after this list)

* Evaluation: evaluating systems with noisy and multi-labeled
     relevance judgments

* Infrastructure: new software packages and toolkits which simplify
     or otherwise improve general support for crowdsourcing or particular
     tasks (e.g. TurKit, Get Another Label)

* Human factors and task design: how to design effective interfaces
     and interaction mechanisms for the crowd; how to enable effective
     crowd performance on tasks traditionally requiring scarce and
     expensive domain experts; how different forms of crowdsourcing or
     crowd motivations (fun, socialization, prestige, economic, etc.)
     might be selected or tailored for different IR tasks (e.g. Page Hunt)

* Vision: reflective or forward-looking position papers on the use of
     crowdsourcing for IR
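
To make the consensus-labeling topic above concrete, below is a minimal
sketch of the simplest baseline, majority voting over redundant crowd
labels, in Python. This is only an illustration, not any particular
system's method, and the item and worker identifiers are hypothetical;
submissions would typically compare such a baseline against weighted or
model-based aggregation (e.g. as implemented in Get Another Label).

    from collections import Counter

    def majority_vote(labels_by_item):
        # labels_by_item: dict mapping item id -> list of (worker_id, label) pairs
        # Returns a dict mapping item id -> consensus label (ties broken arbitrarily)
        consensus = {}
        for item, votes in labels_by_item.items():
            counts = Counter(label for _worker, label in votes)
            consensus[item] = counts.most_common(1)[0][0]
        return consensus

    # Hypothetical toy data: three workers judge two query-document pairs
    labels = {
        "q1-d7": [("w1", "relevant"), ("w2", "relevant"), ("w3", "nonrelevant")],
        "q1-d9": [("w1", "nonrelevant"), ("w2", "nonrelevant"),
                  ("w3", "nonrelevant")],
    }
    print(majority_vote(labels))
    # {'q1-d7': 'relevant', 'q1-d9': 'nonrelevant'}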


IMPORTANT DATES

     Workshop papers
       Submissions: June 9, 2011
       Notification of acceptance: July 5, 2011
       Camera-ready: July 12, 2011

     Workshop: July 28, 2011


WORKSHOP ORGANIZERS

Vaughn Hester, CrowdFlower, USA
Matthew Lease, University of Texas at Austin, USA
Alex Sorokin, CrowdFlower, USA
Emine Yilmaz, Microsoft, UK

You can email the organizers at cir2011-org *at* googlegroups *dot* com.

PROGRAM COMMITTEE

Omar Alonso          Microsoft Bing
Paul Bennett         Microsoft Research
Adam Bradley         Amazon.com
Ben Carterette       University of Delaware
Charlie Clarke       University of Waterloo
Harry Halpin         University of Edinburgh
Jaap Kamps           University of Amsterdam
Gabriella Kazai      Microsoft Research
Mounia Lalmas        University of Glasgow
Martha Larson        Delft University of Technology
Edith Law            Carnegie Mellon University
Don Metzler          University of Southern California
Stefano Mizzaro      University of Udine
Stefanie Nowak       Fraunhofer IDMT
Iadh Ounis           University of Glasgow
Mark Sanderson       RMIT University
Mark Smucker         University of Waterloo
Ian Soboroff         National Institute of Standards and Technology
Siddharth Suri       Yahoo! Research



