
Actionable Information: Tools and Techniques to Help Design Effective Intranets

Frank Cervone, Assistant University Librarian for Information Technology, Northwestern University, Evanston, IL, USA

Darlene Fichter, Data Librarian, University of Saskatchewan, Saskatoon, SK, Canada

Overview

• Why heuristic testing?
• What is heuristic testing?
• Heuristics applied to the web
• Using heuristic testing for your intranet

Why?

• Will find 81%–90% of usability problems [1]
  – When evaluators are experts both in usability (software ergonomics) and in the field in which the software is applied
• 22%–29% of usability problems [1]
  – When evaluators know nothing about usability
• Single evaluators found only 35 percent [2]

1) Jakob Nielsen, "Finding usability problems through heuristic evaluation." In: Proceedings of ACM CHI '92 (May 3–7, 1992), pp. 373–380.
2) Jakob Nielsen, http://www.useit.com/papers/heuristic/heuristic_evaluation.html

Heuristic evaluation?

• What
  – A usability inspection method
  – One or more expert evaluators systematically inspect a user interface design
  – Judge its compliance with recognized usability principles
• When
  – At any point in the design process
• Who
  – More is better
  – Best results with at least 3–5 evaluators
  – 1 is better than none!

Yes, more is better

[Chart courtesy of useit.com: http://www.useit.com/papers/heuristic/heuristic_evaluation.html]

How?

• Evaluators review the interface individually
  – Report problems to a coordinator
  – Assign severity ratings
• Coordinator combines the problems
  – Removes duplicates
• Evaluators review the combined list
  – Optional: assign severity ratings as a group
• Coordinator averages the ratings
  – Ranks problems by severity
• Web team looks for patterns and finds solutions (see the sketch below)
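To make the coordinator's bookkeeping concrete, here is a minimal Python sketch of the combine, deduplicate, average, and rank steps above. The data shapes and names are illustrative assumptions, not part of the method as presented: each evaluator's report is taken to be a list of (problem, severity) pairs on the 0–4 scale introduced later in this deck.

    # Minimal sketch of the coordinator's aggregation step. Data shapes
    # are assumptions: each report is a list of (problem, severity) pairs.
    from collections import defaultdict

    def aggregate_reports(reports):
        """Combine reports, collapse duplicates, average ratings, rank by severity."""
        ratings = defaultdict(list)
        for report in reports:
            for problem, severity in report:
                # Identical problem descriptions collapse onto one key,
                # which is how duplicates are removed in this sketch
                ratings[problem].append(severity)
        averaged = {p: sum(s) / len(s) for p, s in ratings.items()}
        # Rank from most to least severe so the worst problems surface first
        return sorted(averaged.items(), key=lambda kv: kv[1], reverse=True)

    reports = [
        [("Inconsistent link colors", 3), ("No 'Home' button", 4)],
        [("Inconsistent link colors", 2), ("Jargon in labels", 3)],
    ]
    for problem, avg in aggregate_reports(reports):
        print(f"{avg:.1f}  {problem}")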

Why?

• Good method for finding both major and minor problems in a user interface
  – Finds major problems quickly
  – Results will tend to be dominated numerically by the minor problems
  – So it is important to rate errors and rank them
• Complements user testing
  – Not a replacement for it
  – Used to find different types of errors
    • Things an "expert" user would notice

Rating errors

• Frequency
  – Is it common or rare?
• Impact
  – Easy or difficult for the users to overcome?
• Persistence
  – A one-time problem?
    • Users can overcome it once they know about it
  – Or will users repeatedly be bothered by the problem?
• Market impact
  – Certain usability problems can have a devastating effect, even if they are quite easy to overcome

Rating scale

0 = Not a problem: I don't agree that this is a usability problem at all
1 = Cosmetic problem only: need not be fixed unless extra time is available
2 = Minor usability problem: fix should be given low priority
3 = Major usability problem: important, so should be given high priority
4 = Usability catastrophe: imperative to fix this before release
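As a follow-on to the aggregation sketch above, the scale can be captured as a small lookup table so averaged ratings can be reported with their labels. Rounding an average to the nearest whole rating is an illustrative assumption, not part of the scale as defined.

    # Hypothetical helper mapping the 0-4 severity scale to its labels
    SEVERITY_LABELS = {
        0: "Not a problem",
        1: "Cosmetic problem only",
        2: "Minor usability problem",
        3: "Major usability problem",
        4: "Usability catastrophe",
    }

    def label_for(avg_severity: float) -> str:
        # Rounding to the nearest whole rating is an illustrative choice;
        # the clamp guards against out-of-range averages
        return SEVERITY_LABELS[min(4, max(0, round(avg_severity)))]

    print(label_for(3.3))  # -> Major usability problem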

Heuristics (1-5)

1) Visibility of system status
2) Match between system and the real world
3) User control and freedom
4) Consistency and standards
5) Error prevention

Heuristics (6-10)

6) Recognition rather than recall
7) Flexibility and efficiency of use
8) Aesthetic and minimalist design
9) Error recovery
10) Help and documentation

That’s great, but…

• So, how do you apply this in the real world?

• Several possibilities:
  – Unstructured evaluation
  – Structured evaluation

Unstructured evaluation

• Let the experts find the problems as they occur

• Provides greater “free-form” discovery of problems

• More appropriate when working with usability experts

Edmonton Public Library site

• 3 evaluators reviewed the site
• 2 passes through the site
• 1½ to 2 hours

Report summary

• More than 100 unique violations
• Over 60 violations of "consistency and standards"
• Another frequently violated heuristic was "match between the system and the real world"
  – Due to poor labels, jargon, and ordering of items

Frequency by heuristic

[Bar chart: "Frequency of Reports by Heuristic," plotting the number of reported violations (0–70) for each heuristic: Visibility of System Status, Match Between System and Real World, User Control and Freedom, Consistency and Standards, Error Prevention, Recognition rather than Recall, Flexibility and Efficiency of Use, Aesthetic and Minimalist Design, Recover from Errors, and Help and Documentation.]

Problems by area

Severity   Microcontent  Hierarchy  Visual Design  Side Menu  Navigation  Language  Search  Total
1          2             0          11             0          5           3         0       21
2          13            11         20             4          14          3         4       69
3          11            9          7              5          12          9         2       55
4          0             0          0              3          1           0         0       4
Total      26            20         38             12         32          15        6       149
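A table like this can be tallied directly from the combined problem list. The sketch below shows one way to build the severity-by-area counts, assuming each deduplicated problem has been tagged with an area; the tagging itself is a manual judgment and the field names are illustrative.

    # Illustrative tally of problems by area and severity, assuming each
    # combined problem record carries a manually assigned area tag
    from collections import Counter

    problems = [
        {"area": "Visual Design", "severity": 1},
        {"area": "Side Menu", "severity": 4},
        {"area": "Navigation", "severity": 2},
        {"area": "Visual Design", "severity": 2},
    ]

    counts = Counter((p["severity"], p["area"]) for p in problems)
    for (severity, area), n in sorted(counts.items()):
        print(f"severity {severity}  {area}: {n}")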

Link Colors

• Main links in purple
• Smaller links in blue
• Other areas had different link colors altogether

Different Menus

• No ‘Home’ button
• ‘Borrower Services’ not found as a main page heading
• Menu options are the search links for the Song Index
• No side navigation menu offered

Labels, Language and Ambiguity

• Overlap
• Mismatch between heading and items
• Vague headings
• Audience-specific areas are scattered

Structured evaluation

• Develop a list of specific questions related to the issues at hand
  – Tie back to heuristic principles

• Provides greater direction of problem-solving energy

• More appropriate when relying on “subject” experts

Sample questions at Northwestern

1. Did you feel that you were able to tell what was going on with the system while you were working?

2. Did the language on the site make sense to you? Were you able to understand what the pages were trying to communicate?

3. Did you notice inconsistencies in the way things were referred to?

4. Were you able to navigate and use the site without having to refer back to other pages for needed information?

Feedback: what people said

• Question: Did the language on the site make sense to you? Were you able to understand what the pages were trying to communicate?
  – "No, sentences are too long. Use numbers to mark each choice."
  – "Some of the language seemed a bit like 'library-ese,' i.e., terms like 'descriptor,' etc."
  – "Most of the language makes sense, in the sense that it is not jargon (except for 'NUcat'), but as I said the contents of the categories are not always clear."

Long sentences

Jargon

Interesting observations

• There are too many choices which are hard to distinguish between

• It seems like the info organization probably reflects the internal structures of the library more than the user's point of view

• I generally felt lost on the site. It was unclear where I needed to go to actually find anything I needed

• Too much information on one page


How is heuristic evaluation relevant to usability testing?

• Allows us to fix big problems before user testing

• Provides a clue to problem areas
  – Can be the basis for determining usability questions

How is this different from usability testing?

• Analyzing the user interface is the responsibility of the evaluator

• Observer can answer questions from the evaluators during the session

• Evaluators can be provided with hints on using the interface

Other types of evaluation techniques

• Heuristic evaluation
• Heuristic estimation
• Cognitive walkthrough
• Pluralistic walkthrough
• Feature inspection
• Consistency inspection
• Standards inspection
• Formal usability inspection

Questions?

Frank Cervone, Assistant University Librarian for Information Technology, Northwestern University, Evanston, IL, USA
f-cervone@northwestern.edu

Darlene Fichter, Data Librarian, University of Saskatchewan, Saskatoon, SK, Canada
darlene.fichter@usask.ca
