[Asis-l] Forwarded mail....
armstroc at ULV.EDU
Wed Apr 10 16:04:43 EDT 2002
> > From: norbert_fuhr [mailto:fuhr at cs.uni-dortmund.de]
> > Sent: 05 March 2002 17:08
> >
> >
> > Evaluation initiative for XML document retrieval
> > Call for Participation
> >
> > April 2002 - December 2002
> >
> > The DELOS Network of Excellence for Digital Libraries invites
> > participation in an evaluation initiative for XML document
> > retrieval. The widespread use of XML in digital libraries,
> > product catalogues, scientific data repositories and across
> > the Web prompted the development of appropriate searching and
> > browsing methods for XML documents. This initiative provides
> > an opportunity for participants to evaluate their retrieval
> > methods using uniform scoring procedures and a forum for
> > participating organisations to compare their results. The
> > invitation is open to all research groups with an interest in
> > XML retrieval.
> >
> > As part of a large-scale effort to improve the efficiency of
> > research in information retrieval and digital libraries, this
> > project initiates an international, coordinated effort to
> > promote evaluation procedures for content-based XML
> > retrieval. Participating organisations will contribute to the
> > construction of a large testbed (test collection) of XML
> > documents. The test collection will provide participants with
> > a means for future comparative and quantitative experiments.
> > Due to copyright issues, only participating organisations
> > will have access to the constructed test collection.
> > Participants will be expected to work with approximately 1
> > gigabyte of data (over 10,000 XML documents). A central
> > infrastructure, called the clearinghouse, will provide
> > guidelines and Help Desk support to all participants, and
> > will distribute and collect all data required and produced
> > throughout the project. It will also analyse and evaluate the
> > results of the participants' retrieval systems using uniform
> > scoring procedures. Participants will present their
> > approaches and final results at the final workshop in
> > December. All results will be published in the workshop
> > proceedings and on the Web.
> >
> > Overview
> >
> > The aim of this initiative is to provide the means, in the
> > form of a test collection and appropriate scoring methods,
> > for the evaluation of XML document retrieval. A test
> > collection consists of a document collection, tasks/queries,
> > and relevance judgments given by (potential) users with these
> > tasks. We plan a collaborative effort to
> > derive the tasks/queries and relevance judgments for a large
> > collection of XML documents. Based on the constructed test
> > collection and using uniform scoring procedures, the
> > retrieval methods of participating organisations will be
> > evaluated and compared against each other.
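> >
> > As a minimal illustrative sketch only (not part of the
> > initiative's specification), the three components of such a
> > test collection might be represented as:
> >
> >     # Illustrative sketch of a test collection's three components.
> >     from dataclasses import dataclass, field
> >
> >     @dataclass
> >     class TestCollection:
> >         documents: dict[str, str] = field(default_factory=dict)  # doc id -> XML text
> >         topics: dict[str, str] = field(default_factory=dict)     # topic id -> query
> >         # (topic id, element id) -> relevant or not
> >         judgments: dict[tuple[str, str], bool] = field(default_factory=dict)
> >
> >     tc = TestCollection()
> >     tc.topics["1"] = "retrieval of XML elements"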
> >
> > Documents
> >
> > The initiative is supported by the IEEE Computer Society
> > (http://www.computer.org). The set of documents for the test
> > collection is made up of scientific articles from journals
> > and proceedings of the IEEE Computer Society, covering a range
> > of topics in the field of computer science. The collection
> > contains approximately 10-15 thousand articles from over 20
> > different journals/proceedings from the seven-year period of
> > 1995-2001.
> >
> > The document collection will be made available to all
> > participating parties (on CD and via FTP). Prior to its
> > release the clearinghouse will prepare the document
> > collection by assigning unique element identification numbers
> > to each and every retrievable element, as well as ensuring a
> > proper format of the data. Due to copyright issues,
> > participants will have to sign a data handling agreement
> > before the collection is released to them.
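> >
> > Purely as an illustration (the actual identification scheme
> > will be defined by the clearinghouse), assigning a unique
> > identifier to every retrievable element of an XML document
> > could look as follows:
> >
> >     # Illustrative sketch: give every element of an XML document a
> >     # unique id in document order. The "eid" attribute name and the
> >     # numbering scheme are assumptions, not the official scheme.
> >     import xml.etree.ElementTree as ET
> >
> >     def assign_element_ids(xml_text: str) -> str:
> >         root = ET.fromstring(xml_text)
> >         for number, element in enumerate(root.iter(), start=1):
> >             element.set("eid", str(number))
> >         return ET.tostring(root, encoding="unicode")
> >
> >     print(assign_element_ids(
> >         "<article><title>XML retrieval</title><sec><p>...</p></sec></article>"))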
> >
> > Topics/Queries
> >
> > The queries/topics will be created by the participating
> > parties. Each participant will be asked to create a set of
> > candidate topics/queries that form a representative range of
> > real user needs over the XML collection. Due to the nature of
> > the collection, the topics should be created by persons with
> > expertise in computer science or computer engineering. The
> > queries may be content-only or content-and-structure queries,
> > and broad or narrow topic queries. All candidate topics will
> > be sent to the clearinghouse. From this set the clearinghouse
> > will derive a final set of approximately 50 topics and
> > redistribute them to all participants.
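> >
> > For illustration only (the topic format and query syntax will
> > be specified by the clearinghouse), a content-only query might
> > simply list keywords, while a content-and-structure query also
> > names the type of element to be retrieved. A minimal sketch of
> > the latter:
> >
> >     # Illustrative sketch of a content-and-structure query: return
> >     # <sec> elements whose text mentions all given keywords. Element
> >     # names and matching rules are assumptions made for this example.
> >     import xml.etree.ElementTree as ET
> >
> >     def cas_query(xml_text: str, target_tag: str, keywords):
> >         root = ET.fromstring(xml_text)
> >         def text_of(el):
> >             return "".join(el.itertext()).lower()
> >         return [el for el in root.iter(target_tag)
> >                 if all(kw.lower() in text_of(el) for kw in keywords)]
> >
> >     doc = ("<article><sec><p>Ranking XML elements by content.</p></sec>"
> >            "<sec><p>Unrelated text.</p></sec></article>")
> >     print(len(cas_query(doc, "sec", ["ranking", "xml"])))   # -> 1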
> >
> > Tasks
> >
> > The task, to be performed with the data and the final
> > topics/queries, will be the ad-hoc retrieval of XML
> > documents. The answer to a query should consist of a ranked
> > list of XML elements. The top 100 elements in the ranking
> > will be submitted to the clearinghouse. The clearinghouse
> > will then merge the results obtained from all participants
> > into a pool for relevance assessments. These pools of
> > retrieved elements will then be given back to the
> > participants for relevance judgment.
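> >
> > As a rough sketch of this pooling step (run and identifier
> > formats are assumptions; the clearinghouse will prescribe the
> > actual ones):
> >
> >     # Illustrative sketch of result pooling for one topic: merge the
> >     # top 100 element ids of each participant's ranked run into a
> >     # single, duplicate-free pool to be judged for relevance.
> >     def build_pool(runs, depth=100):
> >         pool = set()
> >         for ranked_elements in runs:
> >             pool.update(ranked_elements[:depth])
> >         return pool
> >
> >     run_a = ["article[1]/sec[2]", "article[1]/sec[1]"]
> >     run_b = ["article[1]/sec[1]", "article[2]"]
> >     print(sorted(build_pool([run_a, run_b])))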
> >
> > In addition to the ad-hoc task, each participating group may
> > consider other specific tasks (e.g. interactive retrieval,
> > possibly with browsing and/or relevance feedback) and may
> > submit a second run for a task they would like to investigate.
> >
> > Relevance assessments
> >
> > The relevance judgments are of critical importance to a test
> > collection. For each topic it is necessary to compile a
> > comprehensive list of relevant elements. Relevance
> > assessments will be provided by the participating groups and
> > should be made by the persons who originally created that
> > topic. Each assessor will judge approximately 5 topics,
> > either the topics that they originally created or, if these
> > were removed from the final set of topics, topics close to
> > their original queries.
> >
> > Evaluation
> >
> > Evaluation of the retrieval effectiveness of the search
> > engines used by the participants will be based on the
> > constructed test collection and uniform scoring techniques,
> > including recall/precision measures, which take into account
> > the structural nature of XML documents. Other measures, which
> > give credit for "near misses" (cases where an element close to
> > one assessed as relevant is retrieved), will also be used.
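> >
> > As a reminder of the standard measures (the structural
> > extensions and the exact near-miss scoring are not defined in
> > this call, so the sketch below covers only the exact-match
> > case):
> >
> >     # Illustrative sketch: precision = relevant retrieved / retrieved,
> >     # recall = relevant retrieved / all relevant, over XML element ids.
> >     def precision_recall(retrieved, relevant):
> >         hits = sum(1 for eid in retrieved if eid in relevant)
> >         precision = hits / len(retrieved) if retrieved else 0.0
> >         recall = hits / len(relevant) if relevant else 0.0
> >         return precision, recall
> >
> >     print(precision_recall(["e1", "e2", "e3"], {"e2", "e4"}))  # -> (0.33..., 0.5)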
> >
> > The results will be returned to all participants.
> > Participating organisations will present their approaches and
> > compare their results at the workshop in December. All
> > results will be published in the proceedings of the workshop
> > and on the Web.
> >
> > Schedule
> >
> > April 15: Deadline for the submission of "Application for
> > Participation" (described below).
> >
> > April 15 - May 1: The collection of XML documents will be
> > distributed to all participants on the receipt of their
> > signed data handling agreement. Participants will also be
> > provided with detailed instructions and formatting criteria
> > for candidate topics/queries.
> >
> > May 6: Submission deadline for candidate topics.
> >
> > May 13: Distribution of final set of topics/queries to
> > participants along with detailed information on the
> > formatting requirements of the search results.
> >
> > August 1: Submission deadline of search results.
> >
> > August 19: Distribution of merged results to participants for
> > relevance assessments.
> >
> > October 1: Submission deadline for relevance assessments.
> >
> > November 1: Distribution of XML test collection and
> > evaluation scores to participants.
> >
> > December 9-11: Workshop in Schloss Dagstuhl (http://www.dagstuhl.de/).
> >
> > Organisers
> >
> > DELOS Network of Excellence for Digital Libraries
> > http://delos-noe.iei.pi.cnr.it/
> >
> > Project Leader
> >
> > Professor Norbert Fuhr
> > University of Dortmund
> > Computer Science VI
> > August-Schmidt-Straße 12
> > 44227 Dortmund (Germany)
> > http://ls6.cs.uni-dortmund.de/ir
> > Email: fuhr at ls6.informatik.uni-dortmund.de
> >
> > Contact person (clearinghouse)
> >
> > Gabriella Kazai
> > Department of Computer Science
> > Queen Mary University of London
> > Mile End Road
> > London, E1 4NS
> > Tel: (+44) 20 7882 5256
> > Fax: (+44) 20 8980 6533
> > http://www.dcs.qmul.ac.uk/research/imc/
> > http://qmir.dcs.qmw.ac.uk
> > Email: gabs at dcs.qmul.ac.uk
> >
> > Mounia Lalmas
> > Department of Computer Science
> > Queen Mary University of London
> > Mile End Road
> > London, E1 4NS
> > Tel: (+44) 20 7882 5200
> > Fax: (+44) 20 8980 6533
> > http://www.dcs.qmul.ac.uk/research/imc/
> > http://qmir.dcs.qmw.ac.uk
> > Email: mounia at dcs.qmul.ac.uk
> >
> > Application to participate
> >
> > Organisations wishing to participate should respond to this
> > call by submitting their application on-line at
> > http://qmir.dcs.qmw.ac.uk/XMLEval.html#App or by filling in
> > the application below and sending it via email to Gabriella
> > Kazai at gabs at dcs.qmul.ac.uk. All responses should be
> > submitted by the 15th of April. Confirmation of the receipt
> > of your application will be sent via email within 3 working
> > days. Any questions should also be sent to the same address.
> >
> > ---Top of Form---
> > Contact Information
> > Main Contact
> > Name:
> >
> > Affiliation:
> >
> > Tel. Number(s):
> >
> > Fax Number(s):
> >
> > Email Address(es):
> >
> > Full Postal Address:
> >
> > Additional Contact
> > Name:
> >
> > Affiliation:
> >
> > Tel. Number(s):
> >
> > Fax Number(s):
> >
> > Email Address(es):
> >
> > Full Postal Address:
> >
> > Description of retrieval approach:
> > (Please provide details of your retrieval approach and interest in XML
> > retrieval)
> >
> >
> > Other Tasks:
> > (Please indicate here what additional tasks you would be
> > interested in)
> >
> >