Page 1:

Impact Evaluation in Education

Introduction to Monitoring and Evaluation

Andrew Jenkins

23/03/14

Page 2:

"I am cutting rocks"

"I am building a temple"

http://img359.imageshack.us/img359/7104/picture420bt2.jpg

The essence of theory of change – linking activities to intended outcomes

Page 3:

Theory of change

“the process through which it is expected that inputs will be converted to expected outputs, outcome and impact”

DFID, Further Business Case Guidance, "Theory of Change"

Page 4:

Theory of change

Start with a RESULTS CHAIN

Page 5:

The results chain: tips

Activities                Outputs                      Outcomes
We produce                Influence                    Contribute to
We control                We control                   Clients control
We are accountable for    We expect                    Should occur
100% attribution          Some attribution             Partial attribution
Readily changed           Less flexibility to change   Long term
Delivered annually        By end of program            Long term

Page 6:

Monitoring – activities and outputs

Page 7:

Personal Monitoring Tools

Page 8:

No monitoring - blind and deaf

Page 9:

Monitoring and Evaluation

Monitoring measures efficiency: how productively inputs (money, time, personnel, equipment) are being used in the creation of outputs (products, results). An efficient organisation is one that achieves its objectives with the least expenditure of resources.

Evaluation measures effectiveness: the degree to which results and objectives have been achieved. An effective organisation is one that achieves its results and objectives.

Page 10:

[Diagram: the results chain – inputs, outputs, outcomes]

Inputs: resources – staff, funds, facilities, supplies, training
Outputs: project deliverables achieved; "count" (quantify) what has been done
Outcomes: short and intermediate effects; long-term effects and changes

MONITORING: focused on project process (per individual project)
EVALUATION: focused on effectiveness of project process (for many projects)

Page 11:

Resist temptation, there must be a better way!

• Clear objectives
• Few key indicators
• Quick simple methods
• Existing data sources
• Participatory methods
• Short feedback loops
• Action results!

Page 12:

Monitoring/Evaluation objectives must be SMART

• Specific
• Measurable
• Achievable
• Realistic
• Timed

(see 10 Easy Mistakes, page 5)

Page 13:

Evaluation: who evaluates whom?

The value of a joint approach

Page 14:

The Logical Chain

1. Define Objectives (and Methodology)
2. Supply Inputs
3. Achieve Outputs
4. Generate Outcomes
5. Identify and Measure Indicators
6. Evaluate by comparing Objectives with Indicators
7. Redefine Objectives (and Methodology)

Page 15:

Impact Evaluation

• An assessment of the causal effect of a project, program or policy on beneficiaries. Uses a counterfactual…

Impacts = Outcomes - what would have happened anyway
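As a minimal illustration of the formula above, with made-up numbers (a hypothetical average test score of 68 in programme schools and an estimated counterfactual of 62), the impact is simply the difference:

# Hypothetical numbers for illustration only.
outcome_with_program = 68.0   # average score observed in programme schools
counterfactual = 62.0         # estimated score had the programme not run

impact = outcome_with_program - counterfactual
print(f"Estimated impact: {impact} points")  # 6.0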

Page 16:

When to use Impact Evaluation?

Evaluate impact when the project is:

• Innovative
• Replicable / scalable
• Strategically relevant for reducing poverty
• Evaluation will fill a knowledge gap
• Substantial policy impact is expected

Use evaluation within a program to test alternatives and improve programs.

Page 17:

Impact Evaluation Answers

What was the effect of the program on outcomes?

How much better off are beneficiaries because of the program?

How would outcomes change if the program design changed?

Is the program cost-effective?

Page 18:

Different Methods to Measure Impact

Randomised assignment – experimental

Non-experimental:
• Matching
• Difference-in-differences
• Regression discontinuity design
• Instrumental variable / random promotion

Page 19:

Randomization

• The "gold standard" in evaluating the effects of interventions
• It allows us to form "treatment" and "control" groups:
  – identical characteristics
  – differ only by the intervention

Counterfactual: the randomized-out group
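A minimal sketch of the difference-in-means estimate under random assignment (all data below is synthetic, for illustration only; a real study would also report a standard error or confidence interval):

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical outcomes (e.g. test scores) for randomly assigned groups.
treatment = rng.normal(65, 10, size=500)  # schools that received the program
control = rng.normal(60, 10, size=500)    # randomized-out schools (the counterfactual)

# With random assignment the groups are comparable on average,
# so the difference in mean outcomes estimates the program's impact.
impact = treatment.mean() - control.mean()
print(f"Estimated impact: {impact:.2f}")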

Page 20:

Matching

• Matching uses large data sets and heavy statistical techniques to construct the best possible artificial comparison group for a given treatment group
• Selected on the basis of similarities in observed characteristics
• Assumes no selection bias based on unobservable characteristics

Counterfactual: the matched comparison group
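One common way to build such a comparison group is nearest-neighbour matching on an estimated propensity score. A minimal sketch on synthetic data (everything here is illustrative, not a specific procedure from the slides):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)

# Hypothetical data: X holds observed characteristics (e.g. school size, poverty rate).
n = 1000
X = rng.normal(size=(n, 2))
treated = rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))   # selection depends on observables
outcome = 2.0 * treated + X[:, 0] + rng.normal(size=n)  # true impact is 2.0

# 1. Estimate the propensity score P(treated | X).
pscore = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. For each treated unit, find the untreated unit with the closest score.
nn = NearestNeighbors(n_neighbors=1).fit(pscore[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(pscore[treated].reshape(-1, 1))

# 3. Impact estimate: mean outcome gap between treated units and their matches.
matched_controls = outcome[~treated][idx.ravel()]
impact = (outcome[treated] - matched_controls).mean()
print(f"Estimated impact: {impact:.2f}")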

Page 21:

Difference-in-differences

• Compares the change in outcomes over time between the treatment group and the comparison group
• Controls for factors that are constant over time in both groups
• Assumes "parallel trends" in both groups in the absence of the program

Counterfactual: changes over time for the non-participants
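A minimal sketch of the difference-in-differences calculation with made-up before/after group means:

# Hypothetical mean outcomes (illustrative numbers only).
treat_before, treat_after = 50.0, 62.0   # treatment group
comp_before, comp_after = 48.0, 54.0     # comparison group

# The comparison group's change (+6) estimates what would have happened
# to the treatment group anyway, given parallel trends.
did = (treat_after - treat_before) - (comp_after - comp_before)
print(f"Difference-in-differences estimate: {did}")  # 12 - 6 = 6.0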

Page 22:

Uses of Different Designs

Design                     When to use
Randomization              ► Whenever possible
                           ► When an intervention will not be universally implemented
Random promotion           ► When an intervention is universally implemented
Regression discontinuity   ► If an intervention is assigned based on rank
Diff-in-diff               ► If two groups are growing at similar rates
Matching                   ► When other methods are not possible
                           ► Matching at baseline can be very useful

Page 23:


Qualitative and Quantitative Methods

Qualitative methods focus on how results were achieved (or not).

They can be very helpful for process evaluation.

It is often very useful to conduct a quick qualitative study before planning an experimental (RCT) study.

Page 24:

Thank you!