Actionable Information: Tools and Techniques to Help Design Effective Intranets
Frank Cervone, Assistant University Librarian for Information Technology, Northwestern University, Evanston, IL, USA
Darlene Fichter, Data Librarian, University of Saskatchewan, Saskatoon, SK, Canada
Overview
• Why heuristic testing?
• What is heuristic testing?
• Heuristics applied to the web
• Using heuristic testing for your intranet
Why?
• Expert evaluators will find 81%-90% of usability problems¹
  – Evaluators are experts both in software ergonomics and in the field in which the software is applied
• Novice evaluators find only 22%-29% of usability problems¹
  – Evaluators know nothing about usability
• A single evaluator found only 35% of problems²

1) Jakob Nielsen, "Finding usability problems through heuristic evaluation." Proceedings of ACM CHI '92 (May 3-7, 1992), pp. 373-380.
2) Jakob Nielsen, http://www.useit.com/papers/heuristic/heuristic_evaluation.html
Heuristic evaluation?
• What
  – A usability inspection method
  – One or more expert evaluators systematically inspect a user interface design
  – Judge its compliance with recognized usability principles
• When
  – At any point in the design process
• Who
  – More is better
  – Best results with at least 3-5 evaluators
  – 1 is better than none!
Yes, more is better
Courtesy of useit.com: http://www.useit.com/papers/heuristic/heuristic_evaluation.html
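The curve on this slide reflects Nielsen and Landauer's model: with i independent evaluators who each find a given problem with probability λ (about 0.31 in Nielsen's aggregate data), the expected proportion of problems found is 1 - (1 - λ)^i. A minimal sketch, assuming that published λ value (it is not stated in this deck):

```python
# Nielsen & Landauer model of heuristic evaluation coverage:
# proportion of usability problems found by i independent
# evaluators, each finding a given problem with probability lam.
# lam ~= 0.31 comes from Nielsen's aggregate data, not this deck.
def problems_found(i, lam=0.31):
    return 1 - (1 - lam) ** i

for i in (1, 3, 5, 10):
    print(f"{i} evaluators: {problems_found(i):.0%}")
```

The diminishing returns this produces (roughly two-thirds of problems at 3 evaluators, mid-80s percent at 5) is why the slide recommends 3-5 evaluators.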
How?
• Evaluators review the interface individually
  – Report problems to the coordinator
  – Assign severity ratings
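Once the individual reports come in, the coordinator typically merges duplicate findings and ranks them by average severity. A minimal sketch of that bookkeeping, assuming Nielsen's 0-4 severity scale (the example problems and any tooling are illustrative, not from the deck):

```python
from collections import defaultdict

# Each evaluator reports (problem description, severity rating),
# using Nielsen's 0-4 scale: 0 = not a problem ... 4 = catastrophe.
# These example reports are hypothetical.
reports = [
    ("vague heading: 'Services'", 3),   # evaluator A
    ("vague heading: 'Services'", 2),   # evaluator B
    ("no side navigation menu", 4),     # evaluator B
]

def prioritize(reports):
    """Merge duplicate findings and rank them by mean severity."""
    by_problem = defaultdict(list)
    for problem, severity in reports:
        by_problem[problem].append(severity)
    ranked = [(sum(s) / len(s), p) for p, s in by_problem.items()]
    return sorted(ranked, reverse=True)

for mean_severity, problem in prioritize(reports):
    print(f"{mean_severity:.1f}  {problem}")
```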
• ‘Borrower Services’ not found as a main page heading
• Menu options are the search links for the Song Index
• No side navigation menu offered
Labels, Language, and Ambiguity
• Overlap
• Mismatch between headings and items
• Vague headings
Audience-specific areas are scattered
Structured evaluation
• Develop a list of specific questions related to the issues at hand
  – Tie back to heuristic principles
• Provides greater direction of problem-solving energy
• More appropriate when relying on “subject” experts
Sample questions at Northwestern
1. Did you feel that you were able to tell what was going on with the system while you were working?
2. Did the language on the site make sense to you? Were you able to understand what the pages were trying to communicate?
3. Did you notice inconsistencies in the way things were referred to?
4. Were you able to navigate and use the site without having to refer back to other pages for needed information?
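Each of the four Northwestern questions above ties back to one of Nielsen's ten heuristics. A minimal sketch of that mapping as a lookup table (the pairing is our reading of the questions, not stated in the deck; question texts are paraphrased):

```python
# Structured-evaluation questions (paraphrased from the slides)
# mapped to the Nielsen heuristic each one probes.
# The pairing is interpretive, not given in the presentation.
question_to_heuristic = {
    "Could you tell what was going on with the system?":
        "Visibility of system status",
    "Did the language on the site make sense to you?":
        "Match between system and the real world",
    "Did you notice inconsistencies in how things were referred to?":
        "Consistency and standards",
    "Could you navigate without referring back to other pages?":
        "Recognition rather than recall",
}

for question, heuristic in question_to_heuristic.items():
    print(f"{heuristic}: {question}")
```

Tying each question to a named heuristic makes it easier to aggregate feedback by principle rather than by page.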
Feedback - what people said
• Question: Did the language on the site make sense to you? Were you able to understand what the pages were trying to communicate?
  – No, sentences are too long. Use numbers to mark each choice
  – Some of the language seemed a bit like "library-ese," i.e. terms like "descriptor," etc.
  – Most of the language makes sense, in the sense that it is not jargon (except for "NUcat"), but as I said the contents of the categories are not always clear
Long sentences
Jargon
Interesting observations
• There are too many choices that are hard to distinguish
• It seems like the info organization probably reflects the internal structures of the library more than the user's point of view
• I generally felt lost on the site. It was unclear where I needed to go to actually find anything I needed
• Too much information on one page
How is heuristic evaluation relevant to usability testing?
• Allows us to fix big problems before user testing
• Provides a clue to problem areas
  – Can be the basis for determining usability questions
How is this different from usability testing?
• Analyzing the user interface is the responsibility of the evaluator
• Observer can answer questions from the evaluators during the session
• Evaluators can be provided with hints on using the interface