[Asis-l] MIREX Grand Challenge '14 User Experience (GC14UX): Volunteer graders sought

Downie, J Stephen jdownie at illinois.edu
Mon Oct 13 14:04:52 EDT 2014


Dear Colleagues:

The Music Information Retrieval Evaluation eXchange (MIREX) Grand Challenge '14 User Experience (GC14UX) task requires volunteer graders to evaluate system outputs. 

The goals of this grand challenge are to inspire the development of complete MIR systems and to promote the notion of user experience as a first-class research objective in the MIR community. Your assistance will help future researchers and developers build more useful music retrieval systems that better meet the needs of real users.

We expect each grader to spend approximately 20-30 minutes exploring each submitted system. Three systems are to be evaluated this year.

Your participation as an evaluator is completely voluntary. 

For more information about MIREX, please see:
http://www.music-ir.org/mirex/wiki/MIREX_HOME


For more information about GC14UX and its task instructions:
http://www.music-ir.org/mirex/gc14ux/

We are opening evaluations today, 13 October 2014, and will close evaluations on Tuesday, 21 October 2014. So, if you are kind enough to sign up to be a grader, PLEASE understand that we REALLY need you to complete your assigned grading by 21 October. If you are a GC14UX participant, we ask that you do what you can to encourage adults over 18 years of age to be graders.

To register as a grader, please go to:
http://www.music-ir.org/mirex/gc14ux/register.php

If you have any questions, please contact <mirex at imirsel.org> or <jdownie at illinois.edu>.


Thank you very much, everybody.

Cheers,
Stephen

**********EVALUATION INSTRUCTIONS****************************
To evaluate the assigned MIR system, we ask you to imagine that you have a personal video you are editing, and you wish to find a suitable audio track to add to the video.

In performing this task, please focus on evaluating the interaction and experience with the system as a whole, and not just the results you get. Please be aware that these systems use an open test dataset (Jamendo), a collection of open-source music, so the results may not include the popular music that many of us are familiar with.

After signing up as an evaluator for GC14UX, you will be assigned a number of MIR systems to evaluate (by clicking the "Get Assignment" button). The five questions we ask you to consider are:

Overall satisfaction:
How would you rate your overall satisfaction with the system?

Learnability: 
How easy was it to figure out how to use the system?

Robustness:
How good is the system's ability to warn you when you're about to make a mistake, allow you to recover, or retrace your steps?

Affordances: 
How well does the system allow you to perform what you want to do?

Feedback: 
How well does the system communicate what's going on?

There will also be the opportunity to enter general comments in a free-form text box.

Once you have completed all the questions for a given MIR system, you will have the opportunity to return to it and alter any of the ratings you have given. In our analysis of the results, we will use the final set of ratings that were saved.

In addition to simply rating the systems, we highly encourage evaluators to provide feedback in free text. Your comments will not only help us understand the important factors affecting user experience, but also help the participants improve the design of their systems.

********************************************************** 
   "Research funding makes the world a better place"
********************************************************** 
J. Stephen Downie, PhD     
Associate Dean for Research     
Professor     
Graduate School of Library and Information Science    
University of Illinois at Urbana-Champaign    
[Vox/Voicemail] (217) 649-3839    



