[Sigia-l] User Test Cost - Does this sound reasonable?
Jared M. Spool
jspool at uie.com
Mon May 24 07:31:54 EDT 2004
At 04:23 PM 5/23/2004, Kevin Cheng wrote with regard to my comments about
avoiding reports and having the team attend tests or meet as an alternative:
>I agree that this is a nice ideal but practically speaking, many development
>teams are quite large or spread out, especially given today's trend towards
>telecommuters and outsourcing. For products like Microsoft Word, for
>example, you probably have a great many developers, each siloed into
>sub-teams of their own. On a project I was on last year, I was involved with
>developers in London, Bangalore, and Austin.
>
>Having them attend usability tests, or the second best option of meeting
>with them soon after, while undeniably useful, isn't always pragmatic.
In my mind, the purpose of usability testing is to inform the design team,
so that the decisions they make are as enlightened as possible. The less
informed the design team is, the more likely they'll make a poor decision.
The problem with traditional usability reports is that they don't really do
a very good job of informing the team. Even when written clearly and
concisely (which they often aren't), there is a large burden on the author
to fully understand the needs and concerns of the readers, to predict what
questions the reader will have, and to clearly communicate every
alternative interpretation of the observations. Few reports meet this
standard and, therefore, fail the team miserably.
(The original post was about having an outside firm, paid a fixed
price, somehow write a report on an application they've never thought about
before for a team they've never met before. What are the chances this
effort will produce the quality the design team requires?)
When teams are distributed, there are still better ways (in my opinion) of
getting information to the team promptly and effectively. Several of our
clients are experimenting with remote testing techniques that allow the
team to observe regardless of their location. While we haven't tried this
ourselves, it's certainly an interesting approach. As bandwidth issues are
reduced, this kind of testing has potential.
We've been using post-test email discussions to relay the key observations
to the team, generating discussions about the various inferences that we
could draw. Since we do this after each session of testing (or, at a
minimum, at the end of a day of tests), we get questions and
interpretations from the team members that encourage us to revise our
protocol. Testing becomes more responsive to the curiosity of the design
team, more of a two-way discussion than the traditional one-way broadcast
of the report.
The discussions often continue during the report writing process, again
with our goal of never putting anything in the report that the team hasn't
already heard and discussed. From the team's questions and thoughts, we can
tailor our emphasis to those areas that we think will best support the
team's short-term and long-term efforts.
I'd be interested to find out if other people have tried alternatives to
writing reports, especially for distributed teams. What success have you
had and what obstacles have you encountered?
Jared
Jared M. Spool User Interface Engineering
http://www.uie.com jspool at uie.com
Join us for UIE's extremely popular Roadshow event
UIE Advanced Techniques: http://www.uie.com/events/roadshow