[Asis-l] MIREX Grand Challenge '16 User Experience (GC16UX): Volunteer graders sought
Downie, J Stephen
jdownie at illinois.edu
Wed Feb 3 21:43:45 EST 2016
Dear Colleagues:
The Music Information Retrieval Evaluation eXchange (MIREX) Grand Challenge '16 User Experience (GC16UX) task requires volunteer graders to evaluate system outputs.
The goals of this grand challenge are to inspire the development of complete MIR systems and to promote the notion of user experience as a first-class research objective in the MIR community. Your assistance will help future researchers and developers build better and more useful music retrieval systems that better meet the needs of real users.
We expect each grader to spend approximately 20-30 minutes exploring each submitted system. Three systems are being evaluated this year. Please note that "Moody", one of the submissions, works only with the Chrome web browser.
Your participation as an evaluator is completely voluntary.
For more information about MIREX, please see:
http://www.music-ir.org/mirex/wiki/MIREX_HOME
For more information about GC16UX and its task instructions:
http://www.music-ir.org/mirex/gc16ux/
We are opening evaluations today, 3 February 2016, and will close them on Wednesday, 24 February 2016. So, if you are kind enough to sign up as a grader, PLEASE understand that we REALLY need you to complete your assigned grading by 24 February. If you are a GC16UX participant, we ask that you do what you can to encourage adults (over 18 years of age) to become graders.
To register as a grader, please go to:
http://www.music-ir.org/mirex/gc16ux/register.php
If you have any questions, please contact <mirex at imirsel.org> or <jdownie at illinois.edu>.
Thank you very much, everybody.
Cheers,
Stephen
**********EVALUATION INSTRUCTIONS****************************
To evaluate the assigned MIR system, we ask you to imagine you need to put together a playlist for a particular event (e.g., a dinner party at your house or a workout session). Please try to use the assigned system to make playlists for at least a couple of different events.
In performing this task, please focus on evaluating the interaction and experience with the system as a whole, not just the results you get. Please be aware that these systems use an open test dataset (Jamendo), a collection of Creative Commons-licensed music, so the results may not include the popular music many of us are familiar with.
After signing up as an evaluator for GC16UX you will be assigned a number of MIR systems to evaluate (by clicking the "Get Assignment" button). The six questions we ask you to consider are:
Overall satisfaction:
How would you rate your overall satisfaction with the system?
Aesthetics:
How would you rate the visual attractiveness of the system?
Ease of use:
How easy was it to figure out how to use the system?
Clarity:
How well does the system communicate what is going on?
Affordances:
How well does the system allow you to perform what you want to do?
Performance:
Does the system work efficiently and without bugs/glitches?
There will also be the opportunity to enter general comments in a free-form text box.
Once you have completed all the questions for a given MIR system, you may return to it and alter any of the ratings you have given it. In our analysis of the results, we will use the final set of ratings that were saved.
In addition to rating the systems, we strongly encourage evaluators to provide feedback in free text. Your comments will not only help us understand the important factors affecting user experience, but will also help the participants improve the design of their systems.
**********************************************************
"Research funding makes the world a better place"
**********************************************************
J. Stephen Downie, PhD
Associate Dean for Research
Professor
Graduate School of Library and Information Science
University of Illinois at Urbana-Champaign
[Vox/Voicemail] (217) 649-3839