AfrEA/NONIE/3ie Conference Perspectives on Impact Evaluation March-April 2009 Use of Impact Evaluation for Organizational Learning and Policy Influence: The Case of International Agricultural Research

Transcript
Page 1: AfrEA/NONIE/3ie Conference Perspectives on Impact Evaluation, March-April 2009

Use of Impact Evaluation for Organizational Learning and Policy Influence: The Case of International Agricultural Research

Page 2: Overview/Introduction

• Use and non-use of impact evaluation: the CGIAR case (Douglas Horton & Ronald Mackay, independent evaluation consultants)

• Towards a broader range of impact evaluation methods for collaborative research: report on a work in progress (Patricia Rogers, Royal Melbourne Institute of Technology, & Jamie Watts, CGIAR Institutional Learning and Change Initiative)

• Role of impact evaluation in moving from research into use (Sheelagh O’Reilly, Team Leader, Impact Evaluation, Research Into Use Programme)

Page 3: Programme

• Combined presentation

• Reaction from Robert Chambers, Discussant

• Q&A and Discussion

Page 4: Use and Non-Use of Impact Evaluation: the CGIAR Case

Douglas Horton & Ronald Mackay

Page 5: Overview

• CGIAR has a long history of producing high-quality impact evaluations

• However, there has been limited use of findings:

– To influence donor / investor decisions & resource allocations

– To promote learning & program improvement

• Use may be enhanced somewhat through better planning and communication, but there remain some inherent problems with all disciplinary-oriented evaluation approaches

• Other ways of evaluating and fostering learning are needed for social / institutional learning and for policy and program improvement

Page 6: History of IE in the CGIAR

• High estimated returns to investment in ag. research were key to establishing the CGIAR

• Hundreds of economic impact assessments report high rates of return (see the background note below)

• CGIAR economists have contributed significantly to improving IA theory & methods
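As background not given on the slide, the rates of return reported in these studies are typically internal rates of return computed from estimated streams of research benefits and costs: the IRR is the discount rate $r^*$ that satisfies

$$\sum_{t=0}^{T} \frac{B_t - C_t}{(1 + r^*)^t} = 0$$

where $B_t$ is the estimated benefit of the research in year $t$ (for example, the economic surplus generated by adoption of improved varieties), $C_t$ is the research cost in year $t$, and $T$ is the evaluation horizon. A benefit-cost ratio discounts the same two streams at a fixed rate and takes their ratio.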

Page 7: From the Studies …

“CGI [crop genetic improvement] programmes have been outstanding investments. Few investments can come close to achieving the poverty reduction per dollar expended that the CGI programmes evaluated in this volume have realized… Any reduction in support to agricultural projects, in particular to projects designed to improve productivity, will seriously limit and hamper efforts to reduce mass poverty.”

(Evenson & Rosegrant, 2003: 496)

Page 8: The Emerging Paradox

“Concern is growing within the donor community relating to the effectiveness of existing impact assessment research in guiding international agricultural research... donor support for agricultural research is declining, despite the credible assessments showing that investment in this area indeed has had high return.”

(Gregersen & Morris, 2003: vii)

“There is little apparent relationship between impact assessment findings and the subsequent allocation patterns of donors… those areas of research with the highest levels of assessed benefits often suffer from declining funding, while unproven areas of research and non-research investment receive rising funding shares”

(Raitzer & Winkel, 2005: ix)

Page 9: Funding to International Agricultural Research (Source: ASTI Initiative)

[Chart: "Total" and "Unrestricted" funding to international agricultural research, 1961-2001, in millions of 2005 US dollars]

Page 10: What is Going On Here?

• Good (impact evaluation) research does not necessarily lead to policy / programme support.

• Many factors may affect policy & management decisions more than (evaluation) information.

• For any kind of evaluation to have an impact, use needs to be cultivated from the beginning.

• One type of IE may not meet all needs

Page 11: Some factors influencing use

1. Engagement of intended users

2. The 4 “I’s”

3. Types and levels of use

4. Attention to use

Page 12: Engagement of Potential/Intended Users

Donors & development agencies

Policymakers

Center / program managers

Researchers

Peers

Constituents / intended beneficiaries

Page 13: Why engage users?

[Diagram: Engagement → use of findings and “process use” → influence on decision making]

Page 14: Four “Is”

• Interests

• Ideologies

• Institutions

• Information

(Weiss, 1998)

Page 15: Types and Levels of Use

• Type of use
  – Direct/instrumental
  – Indirect/conceptual
  – Symbolic

• Decision level
  – Strategic
  – Structural
  – Operational

Page 16: Attention to Communication

Multiple forms of communication

Match format to audience

Long-term involvement

Integrate evaluation into program

Guard against standardization

Involve stakeholders

Create context for dialogue

Page 17: Suggestions

View and manage IE as “evaluation,” not as “research.”

1. Plan and manage evaluations to foster specific uses.

2. Target specific policies and program related issues.

3. Explain how programmes or projects attain results in their context.

4. Use mixed methods from various disciplines as needed to respond to evaluation questions.

5. Judge them by their usefulness, practicality, respect for propriety, and the accuracy of data and results.

Page 18: Towards a broader range of impact evaluation methods… Why?

• Agricultural research has expanded into a broader range of areas
  – From crop improvement to higher-level development goals

• The role of the researcher in the agricultural innovation system is changing
  – From center of excellence to a collaborative and capacity-building approach
  – From transfer of technology to demand-driven, locally relevant solutions

• Traditional evaluation designs may not always be feasible or appropriate

Page 19: Increasingly diverse portfolio

• Impact assessment of genetic improvement of major crops well represented

• Biological control of pests somewhat represented

• Under-represented in the IA portfolio:
  – crop and integrated pest management
  – livestock
  – natural resources management
  – post-harvest technologies
  – policy and gender research

Page 20: Increasingly collaborative research

[Network diagrams of the collaborating organisations in (i) 1996 and (ii) 2003, showing a large increase in the number and diversity of partners (e.g. IPRA-CIAT, IPCA, Zamorano, IDRC, Kellogg, World Accord, and many others). Source: Douthwaite 2004.]

Page 21: Increasing demand to engage intended end-users:

• Increase researchers’ understanding of local issues to improve the relevance of research to local conditions

• Increase uptake and appropriate adaptation

• Incorporate local knowledge into research

• Co-production of knowledge by researchers and community members

• Develop end-users’ capacity to build and use knowledge for adaptive management

Page 22: Spectrum of participation

• Conventional research: scientists make the decisions alone without organized participation by end-users

• Contractual: scientists contract with end-users to participate.

• Consultative: scientists make decisions but with organized communication with end-users

• Collaborative: decision-making authority is shared between end-users and scientists. Neither party can revoke or override a joint decision.

• Collegial: end-users make decisions collectively either in a group process or through individual end-users who are in organized communication with scientists.

• End-user experimentation: end-users make the decisions without organized communication with scientists. (adapted from Lilja and Ashby)

[Scope of this work]

Page 23: Nature, June 2008, special issue on translational research

Page 24: Conceptualising translational research

[Nobel laureate Sydney] Brenner is one of many scientists challenging the idea that translational research is just about carrying results from bench to bedside, arguing that the importance of reversing that polarity has been overlooked. “I’m advocating it go the other way,” Brenner said.

Page 25: Following a Recipe, A Rocket to the Moon, Raising a Child

Simple – Following a Recipe
• The recipe is essential
• Recipes are tested to assure replicability of later efforts
• No particular expertise required; knowing how to cook increases success
• Recipes produce standard products
• Certainty of same results every time

Complicated – A Rocket to the Moon
• Formulae are critical and necessary
• Sending one rocket increases assurance that the next will be OK
• High level of expertise in many specialized fields, plus coordination
• Rockets are similar in critical ways
• High degree of certainty of outcome

Complex – Raising a Child
• Formulae have only a limited application
• Raising one child gives no assurance of success with the next
• Expertise can help but is not sufficient; relationships are key
• Every child is unique
• Uncertainty of outcome remains

(Diagram from Zimmerman 2003)

Page 26: Simple, Complicated, Complex

• Deciding impacts – Simple: likely to be agreed; Complicated: likely to differ, reflecting different agendas; Complex: may be emergent

• Describing impacts – Simple: more likely to have standardised measures developed; Complicated: evidence needed about multiple components; Complex: harder to plan for, given emergence

• Analysing cause – Simple: likely to be a clear counterfactual (see the background note below); Complicated: causal packages and non-linearity; Complex: unique, highly contingent causality

• Reporting – Simple: clear messages; Complicated: complicated message; Complex: uptake requires further adaptation
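As background not spelled out on the slide, the “clear counterfactual” in the Simple column is the standard impact-evaluation comparison of outcomes with and without the intervention, often written as

$$\text{Impact} = E[Y_1 \mid D = 1] - E[Y_0 \mid D = 1]$$

where $Y_1$ is the outcome with the programme, $Y_0$ the outcome without it, and $D = 1$ indicates participation. The second term is never observed directly and must be estimated from a comparison group; the assumptions needed to do so become harder to defend as interventions move from simple to complicated to complex, which is the point the table is making.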

Page 27: The need for a broader range of methods

• Complement existing methods for impact evaluation (raising issues of multidisciplinarity and mixed methods)

• Identify, describe, measure and value impacts

• Assess causal inference in collaborative and/or participatory projects

• Support the use of impact evaluation for learning and adaptive management

Page 28: Entry points for learning and change

• Knowledge, skills & attitudes
  – People need to want to learn and know how to engage partners in co-creation of knowledge

• Management systems & practices
  – Leaders learn, value learning, and promote learning in concrete ways
  – Communication channels facilitate easy access to information and knowledge sharing
  – Systems and structures facilitate learning

• Organizational culture
  – Supports and rewards reflection & learning and the application of lessons

• External environment
  – Is conducive to reflection and learning from experience

Page 29: Visualising the connection between laboratory research and practice research

(Source: Tabak, 2005, National Institute of Dental and Craniofacial Research, National Institutes of Health)

Page 30: Capacity for organizational learning

• Systematically gathering information
• Making sense of information
• Sharing knowledge and learning
• Drawing conclusions and developing guidelines for action
• Implementing action plans
• Institutionalizing lessons learned and applying them to new and on-going work

Page 31: Research Into Use Programme

• How can innovation-system approaches promote and facilitate greater use of research-based knowledge?
  – Maximise the poverty-reducing impact of previous research on natural resources
  – Develop understanding of how innovation-system approaches contribute to reducing poverty whilst ensuring effective and efficient management of natural resources

• Challenges to impact evaluation
  – Need to identify critical success factors
  – Coherent approaches for spotting ‘potential winners’ among research outputs, in the move from research into innovation
  – Mainstream use of new technologies that contribute to poverty reduction and economic growth