Evaluation of eLearning

Jan 27, 2015


Michael M. Grant, PhD

Moving beyond Level 1 and Level 2 evaluations: using usability methods to improve eLearning.
Transcript
Page 1: Evaluation of eLearning

Evaluation of eLearning

Michael M. Grant, PhD

Michael M. Grant 2010

Page 2: Evaluation of eLearning
Page 3: Evaluation of eLearning

Kirkpatrick’s Levels

Level 5: ROI

The investment in the training compared to its relative benefits to the organization and/or its productivity and revenue.

Percentage of organizations evaluating at each level:

Level 1 (Reaction): 91.3%

Level 2 (Learning): 53.9%

Level 3 (Behavior): 22.9%

Level 4 (Results): 7.6%

Level 5 (ROI): 2.1%

(ASTD, 2005)
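The deck never spells out the Level 5 arithmetic. As a hedged note, the usual Phillips-style calculation is a simple ratio:

ROI (%) = (net program benefits ÷ program costs) × 100 = ((benefits − costs) ÷ costs) × 100

As a purely illustrative example, a course costing $20,000 that produces $50,000 in measured benefits yields ((50,000 − 20,000) ÷ 20,000) × 100 = 150% ROI.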

Page 4: Evaluation of eLearning

Kirkpatrick (& Phillips) Model

[Chart omitted: 92% and 17.9% (ASTD, 2009)]

Page 5: Evaluation of eLearning

FORMATIVE EVALUATION
What's the purpose?

Page 6: Evaluation of eLearning

A focus on improvement during development.

Page 7: Evaluation of eLearning

Level 2 Evaluations

Appeal

Effectiveness

Efficiency

Page 8: Evaluation of eLearning

Data Collection Matrix

Methods

1. What are the logistical requirements?

2. What are user reactions?

3. What are trainer reactions?

4. What are expert reactions?

5. What corrections must be made?

6. What enhancements can be made?

Anecdotal records X X X X X

User questionnaires X X X X

User interviews X X X X

User focus groups X X X

Usability observations X X X X

Online data collection X X

Expert reviews X X X

Page 9: Evaluation of eLearning

“Vote early and often.”

The sooner formative evaluation is conducted during development, the more likely that substantive improvements will be made and costly errors avoided. (Reeves & Hedberg, 2003, p. 142)

Page 10: Evaluation of eLearning
Page 11: Evaluation of eLearning

“Experts are anyone with specialized knowledge that is relevant to the design of your ILE.”

(Reeves & Hedberg, 2003, p. 145)

Page 12: Evaluation of eLearning

Expert Review

Page 13: Evaluation of eLearning

Interface Review Guidelines

from http://it.coe.uga.edu/~treeves/edit8350/UIRF.html

Page 14: Evaluation of eLearning

USER REVIEW
Observations from one-on-ones and small groups

Page 15: Evaluation of eLearning

What Is Usability?

Page 16: Evaluation of eLearning

“The most common user action on a Web site is to flee.”

— Edward Tufte

Page 17: Evaluation of eLearning

“at least 90% of all commercial Web sites are overly difficult to use….the average outcome of Web usability studies is that test users fail when they try to perform a test task on the Web. Thus, when you try something new on the Web, the expected outcome is failure.”

— Jakob Nielsen

Page 18: Evaluation of eLearning

Nielsen’s Web Usability Rules

1. Visibility of system status

2. Match between system and real world

3. User control and freedom

4. Consistency and standards

5. Error prevention

6. Recognition rather than recall

7. Flexibility and efficiency of use

8. Help users recognize, diagnose, and recover from errors

9. Help and documentation

10. Aesthetic and minimalist design

Page 19: Evaluation of eLearning

Ease of learning - How fast can a user who has never seen the user interface before learn it sufficiently well to accomplish basic tasks?

Efficiency of use - Once an experienced user has learned to use the system, how fast can he or she accomplish tasks?

Memorability - If a user has used the system before, can he or she remember enough to use it effectively the next time or does the user have to start over again learning everything?

Error frequency and severity - How often do users make errors while using the system, how serious are these errors, and how do users recover from these errors?

Subjective satisfaction - How much does the user like using the system?
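To make these attributes measurable, here is a hedged Python sketch; the session fields and demo numbers are illustrative assumptions rather than anything prescribed in the deck, and memorability is omitted because it requires repeat sessions over time.

```python
# Hedged sketch: quantifying the usability attributes above from test sessions.
# The Session fields and the demo data are illustrative assumptions, not from the deck.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Session:
    participant: str
    first_time: bool     # True if the user had never seen the interface before
    task_time_s: float   # time to finish the task, in seconds
    errors: int          # number of errors observed
    completed: bool      # did the user finish the task?
    satisfaction: int    # post-task rating, 1 (low) to 5 (high)

def usability_summary(sessions: list[Session]) -> dict:
    novices = [s for s in sessions if s.first_time]
    experienced = [s for s in sessions if not s.first_time]
    return {
        # Ease of learning: how well first-time users accomplish basic tasks
        "novice_completion_rate": mean(s.completed for s in novices),
        "novice_mean_time_s": mean(s.task_time_s for s in novices),
        # Efficiency of use: task speed once the system has been learned
        "experienced_mean_time_s": mean(s.task_time_s for s in experienced),
        # Error frequency: average errors per task attempt
        "errors_per_task": mean(s.errors for s in sessions),
        # Subjective satisfaction: average post-task rating
        "mean_satisfaction": mean(s.satisfaction for s in sessions),
    }

if __name__ == "__main__":
    demo = [
        Session("P1", True, 240, 3, True, 3),
        Session("P2", True, 300, 5, False, 2),
        Session("P3", False, 90, 1, True, 4),
        Session("P4", False, 75, 0, True, 5),
    ]
    for name, value in usability_summary(demo).items():
        print(f"{name}: {value:.2f}")
```

Error severity and recovery are not captured here; they are usually recorded qualitatively during observation.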

Page 20: Evaluation of eLearning

Two Major Methods to Evaluate Usability

Page 21: Evaluation of eLearning

Heuristic Evaluation Process

1. Several experts individually compare a product to a set of usability heuristics.

2. Violations of the heuristics are evaluated for their severity and extent, and solutions are suggested.

3. At a group meeting, violation reports are categorized and assigned.

4. The combined report includes average severity ratings, extents, heuristics violated, and a description of each opportunity for improvement (a small aggregation sketch follows this list).
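Steps 3 and 4 boil down to aggregating the individual experts' reports. A minimal sketch, assuming each record carries an expert name, the heuristic violated, a 0-4 severity rating, and a problem description; all field names and example records are assumptions, not from the deck.

```python
# Hedged sketch: merging experts' heuristic-violation reports into
# average severity per heuristic, as in step 4. Fields are assumed.
from collections import defaultdict
from statistics import mean

# Each record: (expert, heuristic violated, severity 0-4, description of the problem)
reports = [
    ("Expert A", "Visibility of system status", 3, "No progress indicator on upload"),
    ("Expert B", "Visibility of system status", 2, "Save action gives no confirmation"),
    ("Expert A", "Error prevention", 4, "Delete has no confirmation dialog"),
    ("Expert C", "Consistency and standards", 1, "Two different icons used for Help"),
]

def summarize(reports):
    by_heuristic = defaultdict(list)
    for expert, heuristic, severity, description in reports:
        by_heuristic[heuristic].append((expert, severity, description))
    summary = []
    for heuristic, items in by_heuristic.items():
        summary.append({
            "heuristic": heuristic,
            "violations": len(items),
            "experts": sorted({expert for expert, _, _ in items}),
            "avg_severity": mean(sev for _, sev, _ in items),
            "opportunities": [desc for _, _, desc in items],
        })
    # Highest average severity first, so the meeting starts with the worst problems
    return sorted(summary, key=lambda row: row["avg_severity"], reverse=True)

for row in summarize(reports):
    print(f'{row["heuristic"]}: {row["violations"]} violation(s), '
          f'avg severity {row["avg_severity"]:.1f}')
```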

Page 22: Evaluation of eLearning

Heuristic Evaluation Comparisons

Advantages

Quick: do not need to find or schedule users

Easy to review problem areas many times

Inexpensive: no fancy equipment

Disadvantages

Validity: no users involved

Finds fewer problems (perhaps 40-60% fewer)

Getting good experts

Building consensus with experts

Page 23: Evaluation of eLearning

Heuristic Evaluation Report

Page 24: Evaluation of eLearning

Heuristic Evaluation Report

Page 25: Evaluation of eLearning

USER TESTING

Page 26: Evaluation of eLearning

User Testing

People whose characteristics (or profiles) match those of the Web site’s target audience perform a sequence of typical tasks using the site.

Examines:

– Ease of learning

– Speed of task performance

– Error rates

– User satisfaction

– User retention over time

Page 27: Evaluation of eLearning

Image from (nz)dave at http://www.flickr.com/photos/nzdave/491411546/

Page 28: Evaluation of eLearning

Elements of User Testing

Define target users

Have users perform representative tasks

Observe users

Report results
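As a hedged illustration only (none of these names, profile attributes, or tasks come from the deck), the four elements could be captured in a small test-plan structure:

```python
# Hedged sketch of a user-testing plan covering the four elements above.
# Profile attributes, tasks, and the observation record are illustrative only.
from dataclasses import dataclass, field

@dataclass
class TargetProfile:                      # 1. Define target users
    role: str
    prior_elearning_experience: str       # e.g., "none", "some", "frequent"
    devices: list[str]

@dataclass
class Task:                               # 2. Have users perform representative tasks
    description: str
    success_criterion: str

@dataclass
class Observation:                        # 3. Observe users
    participant: str
    task: str
    completed: bool
    time_s: float
    notes: str = ""

@dataclass
class TestPlan:
    profile: TargetProfile
    tasks: list[Task]
    observations: list[Observation] = field(default_factory=list)

    def report(self) -> str:              # 4. Report results
        done = sum(o.completed for o in self.observations)
        total = len(self.observations) or 1
        return (f"{done}/{len(self.observations)} task attempts completed "
                f"({100 * done / total:.0f}%)")

plan = TestPlan(
    profile=TargetProfile("new hire", "some", ["laptop"]),
    tasks=[Task("Enroll in the safety module", "Reaches the module home page")],
)
plan.observations.append(Observation("P1", "Enroll in the safety module", True, 95.0))
print(plan.report())
```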

Page 29: Evaluation of eLearning

Why Multiple Evaluators?

Single evaluator achieves poor results

– Only finds about 35% of usability problems

– 5 evaluators find more than 75%

Page 30: Evaluation of eLearning

Why only 5 Users?

(Nielsen, 2000)
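The answer rests on the problem-discovery curve Nielsen (2000) describes: the share of problems found by n users is roughly 1 − (1 − λ)^n, with λ ≈ 0.31 in his data. A small sketch of that curve; λ is an assumption you would re-estimate for your own product.

```python
# Problem-discovery curve behind the "5 users is enough" argument (Nielsen, 2000):
# the share of usability problems found by n users is 1 - (1 - lam)^n, where lam
# is the probability that a single user exposes a given problem (~0.31 in
# Nielsen's data; treat it as an assumption to re-estimate for your own study).
def share_found(n_users: int, lam: float = 0.31) -> float:
    """Expected share of usability problems found by n_users test users."""
    return 1 - (1 - lam) ** n_users

for n in (1, 3, 5, 10, 15):
    print(f"{n:>2} users: about {share_found(n):.0%} of problems found")

# With lam = 0.31, one user finds roughly a third of the problems and five users
# find on the order of 85%, which is why several small rounds of testing beat
# one large round.
```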

Page 31: Evaluation of eLearning

Reporting User Testing

Overall goals/objectives

Methodology

Target profile

Testing outline with test script

Specific task list to perform

Data analysis & results

Recommendations

Page 32: Evaluation of eLearning

RECENT METHODS FOR USER TESTING

Page 33: Evaluation of eLearning
Page 34: Evaluation of eLearning
Page 35: Evaluation of eLearning
Page 36: Evaluation of eLearning
Page 37: Evaluation of eLearning
Page 39: Evaluation of eLearning
Page 41: Evaluation of eLearning

10 Second Usability Test

1. Disable stylesheets

2. Check for the following (a rough automated approximation is sketched below):

– Semantic markup

– Logical organization

– Only images related to content appear
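The check itself is meant to be done by eye with stylesheets turned off, but an automated approximation is sketched here; the requests/BeautifulSoup approach, the tag checklist, and the alt-text test for "images related to content" are my assumptions, not part of the original test.

```python
# Rough, hedged approximation of the "10 second" check: fetch a page, ignore its
# styling, and look for semantic structure and content-only images.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

SEMANTIC_TAGS = ["h1", "h2", "nav", "main", "header", "footer", "ul", "ol"]  # assumed checklist

def ten_second_check(url: str) -> None:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # 1. Semantic markup: count structural tags instead of styled <div>/<span> soup
    for tag in SEMANTIC_TAGS:
        print(f"<{tag}>: {len(soup.find_all(tag))}")

    # 2. Logical organization: print the heading outline in document order
    for heading in soup.find_all(["h1", "h2", "h3"]):
        print(f"{heading.name}: {heading.get_text(strip=True)}")

    # 3. Content images: images without alt text are likely decoration that
    #    belongs in CSS rather than in the content
    missing_alt = [img.get("src") for img in soup.find_all("img") if not img.get("alt")]
    print(f"Images missing alt text: {missing_alt}")

ten_second_check("https://example.com")  # placeholder URL
```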

Page 42: Evaluation of eLearning

ALPHA, BETA & FIELD TESTING

Akin to prototyping

Page 43: Evaluation of eLearning

References & Acknowledgements

American Society for Training & Development. (2009). The value of evaluation: Making training evaluations more effective. Author.

Follett, A. (2009, October 9). 10 qualitative tools to improve your web site. Instant Shift. Retrieved March 18, 2010 from http://www.instantshift.com/2009/10/08/10-qualitative-tools-to-improve-your-website/

Nielsen, J. (2000, March 19). Why you only need to test with 5 users. Jakob Nielsen’s Alertbox. Retrieved from http://www.useit.com/alertbox/20000319.html

Reeves, T.C. (2004, December 9). Design research for advancing the integration of digital technologies into teaching and learning: Developing and evaluating educational interventions. Paper presented to the Columbia Center for New Media Teaching and Learning, New York, NY. Available at http://ccnmtl.columbia.edu/seminars/reeves/CCNMTLFormative.ppt

Reeves, T.C. & Hedberg, J.C. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.