[Sigia-l] Evaluating the evaluators

Peter Jones peter at redesignresearch.com
Mon Sep 4 13:55:26 EDT 2006




There are plenty of interesting responses to the question of who evaluates
the usability testers. This happens organizationally in many ways, such as:

In large, multi-consultant projects, independent usability evaluators
sometimes conduct parallel tests to those done by the lead contracting firm.
I have been hired to be the evaluator when a large, expensive design firm
(fox) also proposed to do their own testing (henhouse).

In cross-functional project teams, there are multi-functional reviews of
results. Severities and priorities assumed by usability research can be
renegotiated - sometimes a "critical severity" issue may be found on a
feature that the client didn't tell you was to be redesigned or demoted
anyway. It's wise not to push too hard - as independent consultants, and not
insiders, we don't always know what the plans are for what we're testing.

In product development, the product managers have the final say for a given
release. If they ignore usability, they will find out soon after the
release. If you're not a jerk about being ignored the first time, people
remember your demeanor and ask you back nicely to fix things.

Final validation tests can be run on the redesigned prototype based on your
usability research. This can also be done quickly; the idea is to honestly
evaluate your own revision before release.

I was going for 10 points, but could not make them funny in the time that I
have.

Cheers, Peter

Peter H. Jones, Ph.D.
REDESIGN RESEARCH       innovation insight
http://redesignresearch.com
Design / Redesign blog:
http://redesignresearch.com/reblog.htm







