[Sigia-l] Usability Testing comments from Giga
Lyle_Kantrovich at cargill.com
Thu Mar 27 18:37:43 EST 2003
We happen to have a Giga subscription and a co-worker asked me what I
thought of this Giga 'ideabyte' titled "Web Site Usability Testing vs.
Best Practices."
I didn't feel they were talking just about guidelines and expert
reviews. They mentioned 'best practices' a lot which is a much more
slippery term.
I generally agreed with the main points, although I fear the article
might be misread as "usability testing is expensive and follow 'best
practice' to avoid that expense."
What I think the Giga analyst is saying is that people should follow
standards and 'best practices' (and the definitive source for those
would be where?) so your tests don't have to tell you things you should
have already known. Based on reading other Giga pieces, the author
generally is an advocate of UCD and U-tests, but the title and some of
the wording make that hard to distinguish. It shouldn't be titled "Web
Site Usability Testing vs. Best Practices", but rather "Use Best
Practices to Minimize Problems Found in Usability Testing."
He also seems to assume U-tests are done after development, which is
the most expensive time to do testing. Low-fi prototype U-tests cost
much less. He also ignores the fact that 'best practices' are not
always that easy to identify, and misapplication of a 'norm' can be,
and often is, the cause of poor usability.
Tests are valuable for more reasons than just testing the product.
Observing usability tests is one of the few ways designers and
developers can actually get experience with their target audiences.
Better yet, that experience is framed around the customer's experience
using a product. It's the best training you could ever give your
designers. Continuous training is also a 'best practice.' Where do
'good IAs' or 'good designers' come from? How do they build their
expertise? Testing can help you seriously speed up the learning
process.
Do I need to conduct a test to learn for the 100th time that some users
have problems with popups? No. If I used popups in a new way, might I
need to test with MY users and THEIR tasks to see if the popups are
effective? Yes. Are there other ways of getting that feedback? Sure,
but most of them are higher cost and much slower (e.g. help desk calls).
Someone put up the red herring that testing slows down time to market.
Time to market doesn't mean jack if you're the first to market with a
crappy product no one wants to use. For every example of successful
'first movers', there are plenty of examples of 'faster, smarter
followers' who killed their competition. Did we learn nothing from the
dot-com insanity? In 'those days', I saw lots of 'first movers' moving
in the wrong direction. A few weeks of early testing might have saved
some of them their existence. Testing helps you get to better products and
more revenue faster. One of my clients just expressed that concept to
me by telling me a new application we had designed (over a few months)
would be far ahead of another similar application that they have been
'iterating' for two and a half years. We showed them a faster way to
iterate: User-Centered Design.
Regards,
Lyle
----
Lyle Kantrovich
User Experience Architect
Cargill
http://www.cargill.com
Croc O' Lyle: a personal web log on usability, Information Architecture,
and web design
http://crocolyle.blogspot.com
"Simplicity is the ultimate sophistication."
- Leonardo da Vinci
-----Original Message-----
From: joe at sokohl.com [mailto:joe at sokohl.com]
Sent: Tuesday, March 25, 2003 7:38 AM
To: sigia-l at asis.org
Subject: [Sigia-l] Usability Testing comments from Giga
Hi all,
Welcome back to those stiffs lucky enough to have been in Portland for
the IA Summit. You suck! (just kidding, of course. Quite jealous,
actually). I know some of you engage, or have engaged, in usability
testing...either formative or summative or a combination. You might be
interested in a recent article from Giga Research. They recently opined
on the issue. In short, they feel heuristic/expert review and adherence
to guidelines should take precedence over the collection of observed
data--here's the summary:
==========
Results from usability testing done by Giga's Web Site Effectiveness
Team, as well as usability tests from other sources, show that the
majority of Web site issues uncovered also would have been identified
using a comprehensive normative screening process. In these cases, the
usability testing was a more expensive way to confirm what was already
known, and covered a much more limited portion of the Web site.
Usability testing is most useful after normative testing has uncovered
the deviations; it can:
1. Provide direct customer data to support the process of gaining
internal agreement for making specific changes to the site
2. Check whether specific norms apply to a specific audience
3. Test key visitor goal scenarios for unanticipated (unknown)
barriers
By placing usability testing in the context of understanding or
augmenting normative results, the scope (and cost) can be limited by
using the normative knowledge to target the tests and interpret the
results.
The situation to avoid is an expensive usability test to discover what
is already known. This is why it is important to understand where
normative deviations exist first and to fix them (at least where they
will be encountered in the usability testing scenarios) before
conducting usability tests.
========
Thoughts?
joe
------------
When replying, please *trim your post* as much as possible.
*Plain text, please; NO Attachments
ASIST IA 03 Summit: Making Connections
http://www.asist-events.org/IASummit2003/
Searchable list archive: http://www.info-arch.org/lists/sigia-l/
________________________________________
Sigia-l mailing list -- post to: Sigia-l at asis.org
Changes to subscription: http://mail.asis.org/mailman/listinfo/sigia-l