Does evaluation really add value? Janet M. Clinton Centre for Program Evaluation Canberra Evaluation Forum May 2012
Changes the world
What do you feel like when you have to engage in evaluation?
The plan
The evidence is confusing.
Do we add value?
• Understand the merit & worth of evaluation
• Review the research
• Implications for practice
• Innovative measurement
• Look to a way forward
The current evidence
• Changes organisations - Williams
• Develops a learning environment - Preskill
• Evaluation willingness and capacity lead to increased program success - Clinton
• Influences the individual, the interpersonal & the collective - Mark & Henry
• Empowers - Fetterman
• Evaluation attributes impact on organisations - Appleton
• Systematic review of ECB demonstrates outcomes - Labin, Meyer, & Wandersman
Our impact
KNOW THY IMPACT!
This study
The programs & the evaluation
5-year evaluation of a district-wide plan in NZ, aimed at mobilizing a community to prevent the onset of Type II diabetes.
* 10 action areas: over 150 different initiatives over a 4-year period
Nelson Marlborough Nutrition & Physical Activity:
Collaborate as a community to reverse the tide of obesity.
* 5 action areas: over 100 different initiatives over a 3-year period
EVALUATION METHODS: An adapted form of the CDC Evaluation Framework (standard dimensions)
Multiple forms of data collection: reports, surveys, case studies, interviews, observations, and internal data
Key information from 292 initiatives coded using scoring rubrics
A research lens & the evaluation
THIS STUDY:
Analysed year by year and over 4 years.
Determined relationships: factor analysis, path models.
• Cleaned the data
• Determined the strength & validity of the dimensions
• Identified outcomes
• Looked for causal relationships
MEASURES
Program
• KPI
• Adaptation
• Degree of implementation
Outcome
• Progress
• Sustainability
Process
• Organisational development
• Collaboration
• Evaluation engagement
STANDARD SETTING
KPI SCORE   EXEMPLAR
1-3         Only 30% of KPIs achieved; only 'easy' indicators achieved
4-7         40% achieved; mix of 'hard' and 'easy' indicators
8-10        100% achieved; mixture of levels of difficulty
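The rubric can be sketched as a simple scoring function. A minimal illustration only, assuming hypothetical band cut-offs and a made-up `score_kpi` helper; the study's actual rubric descriptors are richer than this:

```python
def score_kpi(pct_achieved, includes_hard_indicators):
    """Map an initiative's KPI achievement onto the 1-10 rubric bands.

    Hypothetical sketch: low band when only 'easy' KPIs are met,
    mid band for a partial hard/easy mix, top band when all KPIs
    are met across difficulty levels.
    """
    if pct_achieved >= 1.0 and includes_hard_indicators:
        return 10   # 8-10 band: 100% achieved, mixed difficulty
    if pct_achieved >= 0.4 and includes_hard_indicators:
        return 6    # 4-7 band: ~40% achieved, hard and easy mix
    return 2        # 1-3 band: only ~30% achieved, 'easy' KPIs only

print(score_kpi(0.3, False))   # 2
print(score_kpi(0.45, True))   # 6
print(score_kpi(1.0, True))    # 10
```

A rubric like this is what lets qualitative evidence from 292 initiatives be coded onto a common numeric scale for the path models.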
THE MODEL
So what does this mean?
(X = strength of path; -ive = negative effect)
SUSTAINABILITY:
KPI XXX
ADAPTATION (-ive) X
DOI X
OD X
COLLABORATION XXX
EVALUATION XXX
PROGRESS GOAL:
KPI XXX
ADAPTATION X
DOI XXX
OD XXX
COLLABORATION (-ive) XXX
EVALUATION XXX
The outcome variables
SUSTAINABILITY - weighting varies according to:
TIME: DOI, COLLABORATION, KPIs
PROGRAM: OD, COLLABORATION, EVALUATION
PROGRESS - weighting varies according to:
TIME: ADAPTATION, OD, DOI, KPIs
PROGRAM: OD, COLLABORATION, EVALUATION
What do you think?
• Chat to your partner about the model.
• How does this fit with reality?
Implications
Measure
• Degree of implementation
• Evaluation engagement
Take into account
• The nature of KPIs
• Process components
Important constructs
• Adaptation is healthy
• Collaborative approaches
EVALUATION IS AN EQUATION
Program Outcome = f(Implementation, Process, Dosage, Intervention, Adaptation, Evaluation influence), all within CONTEXT
An evaluation equation - work it out.
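The "equation" framing can be sketched as a weighted combination of the measured dimensions. A minimal illustration with made-up weights; in the study the actual weightings came from the path models and varied by time and by program:

```python
# Hypothetical weights for the outcome 'equation'. The real weights were
# derived from the path models and vary by year and by program.
WEIGHTS = {
    "kpi": 0.25, "adaptation": 0.10, "doi": 0.15,           # program side
    "od": 0.15, "collaboration": 0.15, "evaluation": 0.20,  # process side
}

def outcome_score(dimensions):
    """Combine rubric scores (0-10) for each dimension into one outcome score."""
    return sum(WEIGHTS[name] * score for name, score in dimensions.items())

scores = {"kpi": 8, "adaptation": 5, "doi": 7,
          "od": 6, "collaboration": 9, "evaluation": 7}
print(round(outcome_score(scores), 2))  # 7.2
```

The point of the sketch is only that evaluation engagement sits inside the equation as a weighted term, not outside it as a passive observer.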
Evaluation Practice
• Evaluation does have an impact on your practice
• Evaluation as a discipline is evolving into a vehicle of social change
• Whether process or product evaluation, you have an impact
• It changes the way you practice
• Judgment is still critical
• Systematically measure influence & impact
• Question your role
• Different types of evaluation have differing effects
Final word
EVALUATION IS AN INTERVENTION
EVALUATION ADDS VALUE
EVALUATORS - KNOW THY IMPACT
Thank you