Integrating Evaluation into the Design of Your Innovative Program
Evaluation Support Division, National Center for Environmental Innovation
Office of Policy, Economics and Innovation, US Environmental Protection Agency
Innovation Symposium, Chapel Hill, NC
Thursday, January 10, 2008
Workshop Outline
1. Introductions
2. Activity – Evaluation in Our Lives
3. Evaluation and its Evolution at EPA
4. Case Study – Product Stewardship in MN
5. Exercise – Integrating Evaluation in MN
6. Opportunities to Integrate Evaluation
Introductions
This will be an interactive workshop… so let’s interact!
• Get to know someone at your table
• Tell us:
• Who they are,
• Who they work with, and
• Their New Year’s resolution
Purpose of the Workshop
Through discussion and a practical, real-world example, provide participants with the structure and conceptual understanding necessary to integrate evaluation and performance management into the design of environmental programs.
Evaluation In Our Lives
Activity
• Name something in your life that you or someone else decided was worth measuring and evaluating.
• What was the context?
• Was there a target or goal…what was it?
• Who was the audience?
• How did you measure progress or success?
• How did you use what you learned?
Evaluation In Our Programs
What can we take from evaluation in our lives and apply to addressing environmental challenges?
• Measure what matters
• Evaluate for others and for ourselves
Integrating evaluation into program design
• Equal parts art and skill
• Performance management and quality evaluation are inseparable
Evaluation In The EPA
Evaluation Support Division
ESD’s Mission
• Evaluate innovations
• Build EPA’s capacity to evaluate
Performance Management
• An approach to accomplishing EPA goals and ESD’s mission
Performance Management
Performance management includes activities to ensure that goals are consistently being met in an effective and efficient manner. Performance management tools include logic models, performance measurement, and program evaluation.
Logic Model
• A tool/framework that helps identify the program/project resources, activities, outputs, customers, and outcomes.
Performance Measurement
• Helps you understand what level of performance is achieved by the program/project.
[Figure: example logic model, the “Snodgrass Juggling Regimen”: Me → Commitment → Training → Victory]
Performance Measurement
Definition
• The ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures
Measures are designed to check the assumptions illustrated in the logic model
Measures Across the Logic Model Spectrum

Element: Resources/Inputs
Definition: Measure of resources consumed by the organization.
Example Measures: Amount of funds, # of FTE, materials, equipment, supplies, etc.

Element: Activities
Definition: Measure of work performed that directly produces the core products and services.
Example Measures: # of training classes offered as designed; hours of technical assistance training for staff.

Element: Outputs
Definition: Measure of products and services provided as a direct result of program activities.
Example Measures: # of technical assistance requests responded to; # of compliance workbooks developed/delivered.

Element: Customers Reached
Definition: Measure of target population receiving outputs.
Example Measures: % of target population trained; # of target population receiving technical assistance.

Element: Customer Satisfaction
Definition: Measure of satisfaction with outputs.
Example Measures: % of customers dissatisfied with training; % of customers “very satisfied” with assistance received.

Element: Outcomes
Definition: Accomplishment of program goals and objectives (short-term and intermediate outcomes, and long-term outcomes/impacts).
Example Measures: % increase in industry’s understanding of the regulatory recycling exclusion; # of sectors that adopt the regulatory recycling exclusion; % increase in materials recycled.
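The elements above can be sketched as a small data structure. A minimal illustration in Python; the class name and the example program line are hypothetical, with measures drawn from the table:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModelElement:
    """One element of a logic model: its name, definition, and example measures."""
    name: str
    definition: str
    example_measures: list = field(default_factory=list)

# A hypothetical line of logic for a technical-assistance program.
model = [
    LogicModelElement("Resources/Inputs", "Resources consumed by the organization",
                      ["Amount of funds", "# of FTE"]),
    LogicModelElement("Activities", "Work that directly produces core products and services",
                      ["# of training classes offered as designed"]),
    LogicModelElement("Outputs", "Products and services produced by program activities",
                      ["# of technical assistance requests responded to"]),
    LogicModelElement("Customers Reached", "Target population receiving outputs",
                      ["% of target population trained"]),
    LogicModelElement("Outcomes", "Accomplishment of program goals and objectives",
                      ["% increase in materials recycled"]),
]

for element in model:
    print(f"{element.name}: {', '.join(element.example_measures)}")
```

Writing the model down this way makes the assumption-checking role of measures concrete: each element carries the measures that test it.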
Program Evaluation
Definition
• A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.
Orientation/Approaches to Evaluation
• Accountability: external audience
• Learning & program improvement: internal/external audiences
Types of Evaluation
[Figure: types of evaluation (design, process, outcome, and impact) mapped across the logic model, from resources/inputs, activities, outputs, and customers (“how”) to short-term, intermediate, and longer-term outcomes (the strategic aim, “why”)]
Questions, Comments and Clarifications
Are there any questions or comments about what we have covered so far?
Environmental Evaluation: Evolving Theory and Practice
ESD is witnessing the shift from awareness to action
We are adapting to the increasing sophistication of our clients and demands from stakeholders
• Capacity Building
• Evaluations
Managing performance requires integrating evaluation into program design
Our Case Study
Our case study is representative of a trend toward more sophisticated evaluations of environmental programs
ESD is applying learning and adding to it as we take on more sophisticated projects
From here on, you are receiving information necessary to complete the exercises
• You are responsible for integrating evaluation into the program
• Ask questions and take notes!
Case Study: Paint Product Stewardship Initiative
Background on…
Your table is the team that will build evaluation into the MN program.
Describing the MN program
Mission
Goals and objectives
Logic model: we are going to make one!
Select and Describe the Program
Program: 1. Team  2. Mission  3. Goals & Objectives  4. Logic Model
Integrating Evaluation into Program Design
Describe the Program: Logic Model
[Figure: example logic model, the “Snodgrass Juggling Regimen”: Me → Commitment → Training → Victory]
Instructions: Each table will craft a line of logic based on one goal (long-term outcome) of the MN project. For each component of the model (e.g. activity, output, outcome), brainstorm with your group to decide on 2-3 items to complete your line of logic.
Resources → Activities → Outputs → Customers → Short-Term, Intermediate, and Long-Term Outcomes
Evaluation Questions
• What are the critical questions to understanding the success of the MN program?
• Use an outcome from your logic model to create your evaluation question.
Evaluation Questions
• What contextual factors may influence the answers to each question?
• Who are the audiences for each question?
• What’s the best way to communicate with each audience?
• How might each audience use the answer to each question?
Questions: 1. Context  2. Audience  3. Communication  4. Use
Performance Measures
• What can we measure to answer each question?
• Where can we find the information for each measure?
• How can we collect the information?
• Given our questions and the information to be collected, what will be an effective collection strategy?
• What analytical tools will give us the most useful information?
• How will we implement the collection strategy?
• How will we manage the data?
Measures: 1. Data Sources  2. Collection Methods & Strategy  3. Analysis Tools  4. Data Collection  5. Data Management
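As a concrete illustration of “what can we measure” and “how will we manage the data,” here is a minimal sketch that computes one customer-satisfaction measure from collected survey records. All field names and response values are invented for illustration:

```python
# Hypothetical survey records collected for one performance measure:
# "% of customers 'very satisfied' with assistance received".
survey_responses = [
    {"respondent": "A", "satisfaction": "very satisfied"},
    {"respondent": "B", "satisfaction": "satisfied"},
    {"respondent": "C", "satisfaction": "very satisfied"},
    {"respondent": "D", "satisfaction": "dissatisfied"},
]

def percent_with_rating(responses, rating):
    """Share of responses carrying the given rating, as a percentage."""
    if not responses:
        return 0.0  # avoid dividing by zero before any data is collected
    hits = sum(1 for r in responses if r["satisfaction"] == rating)
    return 100.0 * hits / len(responses)

very_satisfied = percent_with_rating(survey_responses, "very satisfied")
print(f"% very satisfied: {very_satisfied:.0f}%")  # prints "% very satisfied: 50%"
```

Deciding the record layout and the computation up front, before collection begins, is one small example of integrating measurement into program design rather than bolting it on afterward.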
Performance Measures
• What can we measure to answer each question?
• What methods are best suited for each measure?
• What analytical tools will give us the most useful information?
• Given our questions and the information to be collected, what will be our collection strategy?
• How will we implement the collection strategy?
• How will we manage the data?
Documentation: Methodology & Policy
Evaluation Methodology
The process of integrating evaluation generates a framework for a methodology and an evaluability assessment.