[Asis-l] CFP: Special Issue of Information Retrieval, a Springer journal, on Crowdsourcing for Information Retrieval

Matt Lease ml at ischool.utexas.edu
Sun Nov 14 09:49:01 EST 2010


Information Retrieval, a Springer journal
Special Issue on Crowdsourcing for Information Retrieval

Call for papers: http://ir.ischool.utexas.edu/irj-crowd-cfp.pdf

Submissions due: March 31, 2011
============================

Introduction
------------
The advent of crowdsourcing is revolutionizing information technology by 
enabling a host of new methodologies for data collection, annotation, 
and processing, as well as for system design and evaluation. Given the 
massive datasets that are both ubiquitous and rapidly growing in today’s 
digital age, the field of information retrieval (IR) stands to benefit 
particularly from such advances, as academic researchers and industrial 
practitioners alike leverage the wisdom of crowds to more effectively 
cope with and exploit massive datasets. The novelty of crowdsourcing 
makes this an especially exciting time to focus the IR community’s 
attention toward it and to provide an opportunity for discussion and 
education on this key emerging area. Processes that have traditionally 
been labor-intensive have been particularly affected, with crowdsourcing 
dramatically reducing the time, cost, and effort involved. In IR, this 
has driven a disruptive shift in areas like:

Evaluation: The Cranfield paradigm for evaluating search engines 
requires manually assessing the relevance of documents to search 
queries. Recent work on stochastic evaluation has reduced but not 
removed this dependence on manual assessment (a minimal illustration of 
crowd-based assessment follows this list).

Supervised Learning: While traditional costs associated with data 
annotation have driven recent machine learning work (e.g. Learning to 
Rank) toward greater use of unsupervised and semi-supervised methods, 
the emergence of crowdsourcing has made labeled data far easier to 
acquire, thereby driving a potential resurgence in supervised or 
semi-supervised methods.

Applications: Crowdsourcing has introduced exciting new opportunities to 
integrate human labor into automated systems: handling difficult cases 
where automation fails, exploiting the crowd's breadth of backgrounds, 
geographic dispersion, and capacity for real-time response, etc.
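
As a minimal illustration of the evaluation point above, the following 
Python sketch (hypothetical data and function names, not part of this 
call) aggregates redundant worker judgments by majority vote and uses 
the result as relevance assessments for computing precision. It sketches 
one common quality-control technique, not a prescribed method.

from collections import Counter

def majority_vote(labels):
    # Aggregate redundant 0/1 worker labels for one query-document pair.
    return Counter(labels).most_common(1)[0][0]

def precision_at_k(ranking, qrels, k):
    # Fraction of the top-k ranked documents judged relevant.
    return sum(qrels.get(doc, 0) for doc in ranking[:k]) / k

# Hypothetical judgments: three workers labeled each query-document pair.
worker_labels = {
    ("q1", "d1"): [1, 1, 0],
    ("q1", "d2"): [0, 0, 1],
    ("q1", "d3"): [1, 1, 1],
}
qrels = {doc: majority_vote(votes) for (_, doc), votes in worker_labels.items()}
print(precision_at_k(["d3", "d1", "d2"], qrels, k=2))  # prints 1.0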

While existing studies on crowdsourcing for IR have been encouraging, a 
variety of important questions remain open with regard to theory, 
methodology, policy, and best practices. See the full call for 
additional background (http://ir.ischool.utexas.edu/irj-crowd-cfp.pdf).


Call for Papers
---------------
The special issue welcomes novel, high-quality manuscripts on the 
development, evaluation, and theoretical analysis of crowdsourcing for 
IR. Submissions to the special issue should not be under consideration 
in any other journal or conference and will be evaluated according to 
the Information Retrieval Journal reviewing criteria and appropriateness 
to the special issue. If the submission is a revision of a previous 
conference paper, the revision must contain significant amplification or 
clarification of the original material or there is some significant 
additional benefit to be gained. For more details, please refer to 
"manuscript submission" on the journal homepage. Submissions should use 
the Information Retrieval Journal style templates available from the 
Journal's homepage and should be submitted through the IR Journal online 
submission page, selecting the “S.I.: Crowdsourcing for Info. Ret.” 
article type.

Topics of interest include but are not limited to the following areas:

* Theoretical, experimental, and/or methodological developments 
advancing state-of-the-art knowledge of crowdsourcing for IR

* Novel applications of IR enabled by crowdsourcing (e.g. use of 
real-time crowd response, using human computation to “close the loop” 
with automation for hybrid IR systems, etc.)

* Tutorials or best practices on employing crowdsourcing to support 
different IR scenarios

* Innovative ideas for better using crowdsourcing platforms such as 
Amazon’s Mechanical Turk, Crowdflower, etc.

* Descriptions of novel software packages improving the ease and/or 
quality of crowdsourcing (e.g. TurkIt, qmturk, Turk Surveyor, etc.)

* Reflective or forward-looking visions of the use of crowdsourcing in IR


Important dates
---------------
Submissions due: March 31, 2011
Reviewer feedback to authors: May 15, 2011
Revised submissions due: May 31, 2011
Notification of acceptance and final reviewer feedback: June 30, 2011
Final submissions due: July 15, 2011


Guest Editors
-------------
All questions regarding submissions should be directed to the special 
issue Guest Editors:

Vitor Carvalho, Microsoft Bing
http://www.cs.cmu.edu/~vitor

Matthew Lease, University of Texas at Austin
http://www.ischool.utexas.edu/~ml

Emine Yilmaz, Microsoft Research
http://research.microsoft.com/en-us/people/eminey

-- 
Matt Lease
Assistant Professor
School of Information and Department of Computer Science
University of Texas at Austin
Voice: (512) 471-9350 · Fax: (512) 471-3971 · Office: UTA 5.450
http://www.ischool.utexas.edu/~ml


