
Learning-Based Evaluation of Visual Analytics Systems

Jan 13, 2015



Presenter: Remco Chang
BELIV 2010 Workshop
http://www.beliv.org/beliv2010/
Transcript
Page 1: Learning-Based Evaluation of Visual Analytic Systems.

Learning-Based Evaluation of Visual Analytics Systems

Remco Chang, Caroline Ziemkiewicz, Roman Pyzh, Joseph Kielman*, William Ribarsky

UNC Charlotte, Charlotte Visualization Center

*Department of Homeland Security

Page 2: Learning-Based Evaluation of Visual Analytic Systems.

Why Another Evaluation Method?

• Based on a discussion with Joe Kielman (DHS)
  – Why is it difficult for agencies like the DHS to adopt and use visual analytics systems?

• Most existing metrics are not indicative of successful adoption:
  – Task completion time
  – Errors
  – Subjective preferences
  – Etc.

Page 3: Learning-Based Evaluation of Visual Analytic Systems.

Current Methods

• Several methods for evaluating visual analytics systems have been proposed, each with its own perspective and goal. For example:
  – Insight-based Evaluation (North et al.)
  – Productivity-based Evaluation (Scholtz)
  – MILC: Multi-dimensional In-depth Long-term Case studies (Shneiderman, Plaisant)
  – Grounded Evaluation (Isenberg et al.)

Page 4: Learning-Based Evaluation of Visual Analytic Systems.

Our Goal for Evaluation

• What Joe wants is:
  – Proof that the user of the visual analytics system can gain proficiency in solving a problem using the system
  – A demonstration that, by using the VA system, a user can gradually change from being a “novice” to becoming an “expert”

• In other words, Joe wants proof that by using the VA system, the user is gaining knowledge…
  – The goal of visualization is to gain insight and knowledge (ViSC report, 1987; Illuminating the Path)

Page 5: Learning-Based Evaluation of Visual Analytic Systems.

Learning-Based Evaluation

• In light of this goal, we propose a “learning-based evaluation” that attempts to directly test the amount of knowledge gained by the user.

• The idea is to determine how much the user has learned after spending time using a VA system by:
  – Giving the user a similar but different task
  – Directly testing whether the user has gained proficiency in the subject matter
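The slide's protocol lends itself to a simple quantitative summary. The sketch below is a hypothetical scoring harness, not something the talk specifies: the field names, scores, and the raw-difference gain measure are all illustrative assumptions. It assumes each participant is scored once on the original task before using the VA system and once on a similar-but-different transfer task afterward, and reports the improvement.

```python
from dataclasses import dataclass

@dataclass
class SessionResult:
    """One participant's scores in a hypothetical learning-based study."""
    participant: str
    baseline_score: float   # proficiency on the original task, before using the VA system
    transfer_score: float   # proficiency on a similar-but-different task, after using it

def learning_gain(result: SessionResult) -> float:
    """Knowledge gained, measured here as raw improvement from baseline to transfer task."""
    return result.transfer_score - result.baseline_score

# Illustrative (fabricated) session data for two participants
results = [
    SessionResult("p1", baseline_score=0.40, transfer_score=0.75),
    SessionResult("p2", baseline_score=0.55, transfer_score=0.60),
]
gains = [learning_gain(r) for r in results]
mean_gain = sum(gains) / len(gains)
```

A raw difference is the simplest choice; a real study would likely prefer a normalized gain or a statistical test across participants.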

Page 6: Learning-Based Evaluation of Visual Analytic Systems.

Current Method

Page 7: Learning-Based Evaluation of Visual Analytic Systems.

Our Proposed Method

Page 8: Learning-Based Evaluation of Visual Analytic Systems.

Types of Learning

• In designing either a new task or the questionnaire, it is important to differentiate and isolate what is being tested:
  – Knowledge gained about the interface
  – Knowledge gained about the data
  – Knowledge gained about the task (domain)

Page 9: Learning-Based Evaluation of Visual Analytic Systems.

iPCA Example

• iPCA stands for “interactive Principal Component Analysis”. By using it, the user can learn about:
  – The interface
  – The dataset
    • relationships within the data
  – The task
    • What is principal component analysis, and
    • How can I use principal component analysis to solve other problems?
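For context, the computation at the heart of any PCA tool fits in a few lines. This is a generic PCA via SVD, a minimal sketch rather than the iPCA system's own code; the function name and the toy dataset are illustrative.

```python
import numpy as np

def pca(data, n_components=2):
    """Project data onto its top principal components via SVD.

    data: (n_samples, n_features) array.
    Returns (projected points, component directions, explained-variance ratios).
    """
    centered = data - data.mean(axis=0)                 # PCA operates on mean-centered data
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    var = (s ** 2) / (len(data) - 1)                    # variance captured by each component
    ratio = var / var.sum()
    return centered @ vt[:n_components].T, vt[:n_components], ratio[:n_components]

# Toy dataset: points scattered mostly along one direction, with small noise
rng = np.random.default_rng(0)
points = rng.normal(size=(100, 1)) @ np.array([[3.0, 1.0]]) \
         + rng.normal(scale=0.1, size=(100, 2))
proj, comps, ratio = pca(points, n_components=2)
```

Because the toy data varies almost entirely along a single direction, the first component's explained-variance ratio dominates; this is the kind of structure a user could discover interactively in a tool like iPCA.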

Page 10: Learning-Based Evaluation of Visual Analytic Systems.

Application to the VAST Challenge

• Current method:
  – Give participants a dataset and a problem
  – Ask participants to develop VA systems to solve the problem
  – Ask participants to describe their systems and analytical methods
  – Judges score each submission based on the developed systems and their applicability to the problem

Page 11: Learning-Based Evaluation of Visual Analytic Systems.

Application to the VAST Challenge

• Proposed method:
  – Give participants a dataset and a problem
  – Ask participants to develop VA systems to solve the problem
  – Ask participants to bring their systems to VisWeek
  – Give participants a similar, but different, dataset and problem
  – Ask participants to solve the new problem using their VA systems
  – Judges score each participant based on the effectiveness of each system in solving the new task

Page 12: Learning-Based Evaluation of Visual Analytic Systems.

Types of Learning

• In designing either a new task or the questionnaire, it is important to differentiate and isolate what is being tested:
  – Knowledge gained about the interface
  – Knowledge gained about the data
  – Knowledge gained about the task (domain)

Page 13: Learning-Based Evaluation of Visual Analytic Systems.

Discussion/Conclusion

• This learning-based method seems simple and obvious because it really is: teachers have been doing this for ages.

• The method is not unique. Many aspects of this proposed method are similar to existing methods; in spirit, we are all looking to address the same problem.

• The difference is the perspective. If we think about the problem from the perspective of a client (e.g., Joe at DHS), what they currently look for in evaluation results is not the same as what we as researchers give them.

Page 14: Learning-Based Evaluation of Visual Analytic Systems.

Future Work

• Integrate the proposed learning-based method with:
  – Grounded Evaluation
  – Long-term effects (MILC)

Page 15: Learning-Based Evaluation of Visual Analytic Systems.

Thank you!

[email protected]
http://www.viscenter.uncc.edu/~rchang

Page 16: Learning-Based Evaluation of Visual Analytic Systems.

The Classroom Analogy

• Say you’re a middle-school math teacher trying to decide which textbook to use, the blue one or the red one. You can:
  – Ask your friends which book is better
    • Analogous to an “expert-based evaluation”. The problem is that the sample size is typically small, and the results are difficult to replicate.
  – Ask your students which book they like
    • Analogous to subjective preferences. The issue here is that the students may prefer the blue textbook simply because it’s blue.
  – Test which textbook is more effective by giving the students tests.