1 Introduction to Evaluating the Minnesota Demonstration Program Paint Product Stewardship Initiative September 19, 2007 Seattle, WA Matt Keene, Evaluation Support Division National Center for Environmental Innovation Office of Policy, Economics and Innovation U.S. Environmental Protection Agency
1
Introduction to Evaluating the Minnesota Demonstration Program
Paint Product Stewardship Initiative
September 19, 2007 Seattle, WA
Matt Keene, Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
U.S. Environmental Protection Agency
2
Presentation Objective
Introduce the Paint Product Stewardship Initiative participants to the key steps in designing the demonstration program evaluation.
3
Session Agenda
Program Evaluation: Definition, Uses, Types
- What is Program Evaluation?
- Why Should We Evaluate?
Steps in the Evaluation Process
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan
4
What is Program Evaluation?
Program Evaluation:
A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.
Performance Measurement:
The ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures.
5
Why Evaluate?
Good Program Management:
Ensure program goals and objectives are being met.
Help prioritize resources by identifying the program services yielding the greatest environmental benefit.
Learn what works well, what does not, and why.
Learn how the program could be improved.
Provide information for accountability purposes:
Government Performance and Results Act of 1993: Requires EPA to report schedules for and summaries of evaluations that have been or will be conducted and identify those that influence development of the Agency’s Strategic Plan.
Environmental Results Order 5700.7: Requires EPA grant officers and grant recipients to identify outputs and outcomes from grants and connect them to EPA’s strategic plan.
6
Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan
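The eight design steps above form an ordered sequence, and later slides return to this list repeatedly. A minimal sketch of that ordering (a hypothetical illustration, not part of the presentation):

```python
from enum import IntEnum

# Hypothetical encoding of the eight evaluation design steps above.
# The integer values reflect the order in which the steps are taken.
class EvaluationDesignStep(IntEnum):
    SELECT_PROGRAM = 1
    IDENTIFY_TEAM = 2
    DESCRIBE_PROGRAM = 3
    DEVELOP_QUESTIONS = 4
    IDENTIFY_DATA = 5
    SELECT_COLLECTION_METHODS = 6
    SELECT_DESIGN = 7
    DEVELOP_PLAN = 8

# Iterating yields the steps in order, e.g. for tracking progress:
remaining = [s for s in EvaluationDesignStep if s > EvaluationDesignStep.DESCRIBE_PROGRAM]
```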
7
Assessing Whether to Evaluate Your Program (Evaluability Assessment)
1. Is the program significant enough to merit evaluation? Consider: program size, number of people served, transferability of pilot, undergoing PART.
2. Is there sufficient consensus among stakeholders on program’s goals and objectives?
3. Are staff & managers willing to make decisions about or change the program based on evaluation results?
4. Are there sufficient resources (time, money) to conduct an evaluation?
5. Is relevant information on program performance available or can it be obtained?
6. Is an evaluation likely to provide dependable information?
7. Is there a legal requirement to evaluate?
(Adapted from Worthen et al. 1997.)
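The seven questions above amount to a simple evaluability screen. A minimal sketch of how it could be scored (a hypothetical helper, assuming each question is answered yes/no; the presentation itself prescribes no scoring rule):

```python
# Hypothetical evaluability screen based on the seven questions above.
# A program is a stronger evaluation candidate when most answers are "yes".

EVALUABILITY_QUESTIONS = [
    "Is the program significant enough to merit evaluation?",
    "Is there sufficient consensus among stakeholders on goals and objectives?",
    "Are staff and managers willing to act on evaluation results?",
    "Are there sufficient resources (time, money) to conduct an evaluation?",
    "Is relevant performance information available or obtainable?",
    "Is an evaluation likely to provide dependable information?",
    "Is there a legal requirement to evaluate?",
]

def evaluability_score(answers):
    """Return the fraction of 'yes' answers, one boolean per question."""
    if len(answers) != len(EVALUABILITY_QUESTIONS):
        raise ValueError("expected one answer per question")
    return sum(answers) / len(answers)

# Example: five of the seven questions answered "yes".
score = evaluability_score([True, True, True, False, True, True, False])
```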
8
Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan
9
Identify Evaluation Team Members
Select diverse team members:
• Individuals responsible for designing, collecting, and reporting information used in the evaluation
• Individuals with knowledge of the program
• Individuals with a vested interest in the conduct/impact of the program
• Individuals with knowledge of evaluation
• Identify a Skeptic!
10
Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan
11
Describe the Program
Describe the program using a logic model
Use the logic model to:
• Check assumptions about how the program is supposed to work
• Brainstorm evaluation questions
12
13
Elements of the Logic Model
Resources/Inputs: Programmatic investments available to support the program.
Activities: Things you do: the activities you plan to conduct in your program.
Outputs: Product or service delivery/implementation targets you aim to produce.
Customers: Users of the products/services; the target audience the program is designed to reach.
Outcomes (results from the program):
• Short-term (Attitudes): Changes in learning, knowledge, attitudes, skills, understanding.
• Intermediate (Behavior): Changes in behavior, practice, or decisions.
• Long-term (Condition): Change in condition.
External Influences: Factors outside of your control (positive or negative) that may influence the outcome and impact of your program/project.
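The logic model elements above can be sketched as a simple data structure. This is a hypothetical illustration only; the field names mirror the slide, and the example values are invented for a generic paint stewardship scenario, not taken from the PPSI program:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a program logic model, one field per element
# described on the slide above.
@dataclass
class LogicModel:
    resources: list      # inputs: programmatic investments
    activities: list     # things the program does
    outputs: list        # delivery/implementation targets
    customers: list      # target audience the program is designed to reach
    short_term: list     # outcomes: changes in knowledge/attitudes
    intermediate: list   # outcomes: changes in behavior/practice
    long_term: list      # outcomes: changes in condition
    external: list = field(default_factory=list)  # factors outside program control

# Illustrative (invented) example values:
model = LogicModel(
    resources=["staff", "funding"],
    activities=["education/outreach campaign"],
    outputs=["program database"],
    customers=["paint consumers"],
    short_term=["awareness of recycled paint improves"],
    intermediate=["more leftover paint returned"],
    long_term=["less paint in the waste stream"],
)
```

As the slide suggests, walking the fields left to right (resources through outputs) describes how the program works, while the outcome fields describe the results it aims for.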
14
PPSI Demonstration Program
Program Goal: Design, implement, and evaluate a fully funded statewide paint product stewardship program that is cost-effective and environmentally beneficial.
Logic model excerpt (Project Implementation Stage, September 13, 2007):
Activities:
• OUTREACH/EDUCATION: Establish relationships/partnerships; implement education/outreach and social marketing projects/campaign
• MEASUREMENT: Collect baseline data; ongoing data collection; interim analysis
Outputs:
• Baseline information
• Program database
Outcomes:
• Shorter-term (awareness): Awareness of recycled paint and waste hierarchy improves
• Intermediate (behavior)
• Longer-term (condition)