
Effectively capturing the user experience

Jenny Craven, Research Associate, CERLIM

j.craven@mmu.ac.uk

“…you sighted people just go click, click, click, and there’s the answer… While I’m still looking for the first ‘!@**!!’ link. It’s very frustrating”

(Quote from 2003)

Are websites becoming more accessible?

• 81% of websites audited failed to meet minimum requirements (WCAG A) (DRC, 2004)

• Automated testing revealed that only a small number of websites (3%) met the WCAG accessibility level AA (City University, 2004).

• 3% of the 436 online websites assessed achieved the most basic level of WCAG (Cabinet Office, 2005)

• 75% of FTSE 100 companies failed to meet the minimum requirements for website accessibility (Nomensa, 2006)

What’s the solution?

Different approaches to implementing and understanding web accessibility

• Standards

• Guidelines

• User testing

• User profiles

• User models


User Testing: Key points to consider

• Objectives of the user testing

• Number and type of participants

• Time for recruiting participants

• Pilot testing

• Ethical issues

User Testing Methods

• Card sorting exercises

• Focus groups

• Online questionnaires

• Observation

• Semi-structured interviews

User Testing Methods

• Expert evaluation
– Cognitive walkthrough
– Heuristic evaluation

• Free searching/browsing

• Task-based evaluation
– Observation
– Think aloud (simultaneous and retrospective)
– On-screen data capture
– Pre- and post-task interviews


Task-based User Testing

• Face-to-face
– Pros: very rich data; avoids misunderstanding and misinterpretation; explains the why as well as the what and how
– Cons: time consuming; recruitment difficulties; sample size is often small; the testing environment can have an impact

• Remote
– Pros: enables a larger sample size; often easier to recruit; participants can undertake testing using their own technology, at a time and place convenient to them
– Cons: lacks the richness of face-to-face; responses may be very brief; responses can be misinterpreted and may require follow-up interviews

Case Studies

• Case Study One: Non-visual Access to the Digital Library (NoVA)

• Case Study Two: European Internet Accessibility Observatory (EIAO)

Case Study One: To compare information seeking of visually impaired and sighted users

• 20 sighted and 20 visually impaired users
• Four web-based resources
• Face-to-face, task-based approach
• Search process logged
– time, keystrokes, mouse clicks, etc.
– think-aloud protocol
– pre- and post-task questions

• Aim: to inform the design of accessible websites and widen access to web-based resources

Analysing the data

• Observation data and on-screen data capture: keystroke and mouse-click comparisons, mapping the search and browsing process (a minimal aggregation sketch follows this list)

• Think aloud: comments and feelings expressed while undertaking the task

• Pre- and post-task questions: further insight into perceptions of the site and the user experience whilst undertaking the task
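To make the comparisons concrete, here is a minimal sketch (in Python) of how logged search-process data could be aggregated per participant group. The record layout, field names and figures are illustrative assumptions and are not taken from the NoVA logs.

```python
# Minimal sketch of aggregating logged search-process data per participant group.
# The record layout and numbers below are illustrative, not the NoVA logging format.
from statistics import mean

log = [
    {"group": "sighted", "task_seconds": 95, "keystrokes": 12, "mouse_clicks": 9},
    {"group": "visually_impaired", "task_seconds": 240, "keystrokes": 58, "mouse_clicks": 0},
    {"group": "sighted", "task_seconds": 110, "keystrokes": 15, "mouse_clicks": 7},
    {"group": "visually_impaired", "task_seconds": 310, "keystrokes": 73, "mouse_clicks": 2},
]

def summarise(records, group):
    """Average time, keystrokes and mouse clicks for one participant group."""
    rows = [r for r in records if r["group"] == group]
    return {
        "mean_task_seconds": round(mean(r["task_seconds"] for r in rows), 1),
        "mean_keystrokes": round(mean(r["keystrokes"] for r in rows), 1),
        "mean_mouse_clicks": round(mean(r["mouse_clicks"] for r in rows), 1),
    }

for group in ("sighted", "visually_impaired"):
    print(group, summarise(log, group))
```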

Case Study Two: To identify and rank web accessibility barriers

• 25 users: visual, mobility, hearing, and cognitive disabilities

• 16 web-based resources; two iterations
• Remote, task-based approach
• Pre- and post-task questions
• Ranking of accessibility; comments
• Aim: to provide a richer picture of the user experience when accessing and interacting with websites

Task-based approach
Provision of a title for each frame

• Task purpose: to test the accessibility of frames

• Web page selected: WCAG recommends providing a title for each frame to facilitate frame identification and navigation. The web page tested did not conform to this recommendation (an illustrative checker sketch follows this slide)

• Task: participants were asked to complete two tasks using a web page with two frames: first, to find information displayed in the right-hand frame; then, to find a link to contents displayed in the left-hand frame

• Evaluation: following the task, participants were asked to complete an online evaluation form
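As an illustration of the checkpoint under test, the sketch below uses only the Python standard library to flag frame or iframe elements that lack a title attribute. It is an assumed, simplified check, not the evaluation tooling used in the EIAO study.

```python
# Sketch of an automated check for the frame-title checkpoint described above:
# flag any <frame> or <iframe> element that lacks a title attribute.
# Illustrative only; not the tooling used in the EIAO study.
from html.parser import HTMLParser

class FrameTitleChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.untitled_frames = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("frame", "iframe") and "title" not in dict(attrs):
            self.untitled_frames += 1

page = """
<frameset cols="25%,75%">
  <frame src="contents.html">
  <frame src="main.html" title="Main content">
</frameset>
"""

checker = FrameTitleChecker()
checker.feed(page)
print("Frames without a title attribute:", checker.untitled_frames)  # prints 1
```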

Analysing the data

• Ranked responses relating to the evaluation of the website tested (a brief summary sketch follows this list)
– user friendly
– ease of use
– problems experienced

• Open comments field to expand on the ranked responses given
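A short sketch of how the ranked responses and open comments might be summarised together. The 1 to 5 scale, field names and example responses are assumptions made for illustration; they are not the actual EIAO evaluation form.

```python
# Illustrative summary of ranked evaluation-form responses for one tested page.
# The 1-5 scale, field names and example responses are assumed, not the EIAO form.
from collections import Counter
from statistics import mean

responses = [
    {"user_friendly": 2, "ease_of_use": 1, "comment": "Kept losing my place between the two frames."},
    {"user_friendly": 3, "ease_of_use": 2, "comment": "Hard to tell which frame held the link."},
    {"user_friendly": 4, "ease_of_use": 4, "comment": ""},
]

for field in ("user_friendly", "ease_of_use"):
    scores = [r[field] for r in responses]
    print(field, "mean:", round(mean(scores), 2), "distribution:", dict(Counter(scores)))

# Open comments are kept alongside the rankings to explain the scores given.
comments = [r["comment"] for r in responses if r["comment"]]
print(len(comments), "open comments collected")
```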

The Results

Results from both studies provided recommendations for:

• Web page design
• Assistive technology
• Staff training / user training
• Universal design
• Digital approaches
• Further research

Reporting the Results

• Graphs

• Quotes

• Illustrations, e.g. video recordings

• Scenarios/Vignettes

• User models


User Models

• Dervin’s ‘sense making approach’ (Dillon and Watson, 1996)

• Kuhlthau’s model of the information search process (Kuhlthau, 1993)

• Ellis’ Model of Information Seeking (Wilson, 2000)

• Search Process Model (SPM) developed by Logan and Driscoll-Eagan (1998)

• Barrier Walkthrough Method (Brajnik, 2006)


Barrier Walkthrough Method: example

• Barrier: users cannot perceive or understand information conveyed by an image
• Defect: an image that does not have accompanying text
• Users affected: blind users of screen readers; users of small devices
• Consequences: users try to look around for more explanations, spending time and effort; satisfaction is affected
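The defect named above, an image with no accompanying text, can also be detected automatically. Below is a minimal, assumed sketch of such a check in Python; the barrier walkthrough itself remains a manual, expert-led method.

```python
# Sketch of detecting the defect above: an <img> element with no alt attribute.
# Illustrative only; the barrier walkthrough is applied manually by evaluators.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.images_without_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.images_without_alt.append(dict(attrs).get("src", "unknown"))

page = '<p><img src="chart.png"> <img src="logo.png" alt="CERLIM logo"></p>'
checker = MissingAltChecker()
checker.feed(page)
print("Images missing alt text:", checker.images_without_alt)  # ['chart.png']
```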

Barrier Walkthrough: visually impaired users

• Barrier: users had difficulty identifying where the information required for the task was located
• Defect: the page used two frames but had not applied a title to each frame to facilitate frame identification and navigation
• Users affected: people who are blind, using screen readers
• Consequences: users had to keep swapping back and forth between frames to try to decipher where the information they were looking for was located

Conclusions

• User testing helps identify the accessibility and usability issues users actually experience, beyond technical guidelines and checkpoints.

• User models provide clear illustrations of user behaviour and accessibility issues.

• There needs to be greater awareness and understanding of the need for a more flexible, pragmatic and holistic approach to the design of websites.

Thank you!

Any questions?

j.craven@mmu.ac.uk
