
WORKSHOP REPORT

MONITORING AND EVALUATION TRAINING

WORKSHOP FOR CIVIL SOCIETY ORGANISATIONS IN

SIERRA LEONE

DATE: 24 – 26 MAY 2011

VENUE: KONA LODGE, FREETOWN, SIERRA LEONE

WEST AFRICA CIVIL SOCIETY INSTITUTE


TABLE OF CONTENTS

Introduction
Session 1: Introduction to Monitoring and Evaluation
Session 2: Planning, Monitoring and Evaluation (RBM)
Session 3: What You Need to Know about Your Programme
Session 4: Indicators
Group Exercise 1: LogFRAME
Session 5: Data Sources and Tools
Session 6: Stakeholder Analysis
Group Exercise 2: Preparing a Stakeholder Matrix
Session 7: Developing Your Monitoring and Evaluation Action Plan
Session 8: Sharing Your Monitoring and Evaluation Findings
Closing Remarks
Appendix 1: Programme Agenda
Appendix 2: List of Participants


1. INTRODUCTION

The West Africa Civil Society Institute (WACSI), in collaboration with the OSIWA Sierra Leone Country Programme, organised a Specialised Workshop on Monitoring and Evaluation for Civil Society Organisations in Sierra Leone. The workshop was targeted at strengthening the capacity of OSIWA partners and grantees to professionally assess the efficiency, effectiveness and impact of their projects, interventions and organisational performance. This report documents the proceedings and outcomes of the workshop.

Training Objectives
The specific objectives of the workshop were:
To help trainees understand the fundamentals of designing and implementing a monitoring and evaluation system for a project or an organisation; and
To provide a comprehensive overview of a sample of monitoring and evaluation tools, methods and approaches, including their purpose and use, advantages and disadvantages, and key references.

Training Methodology
The training workshop was delivered using interactive, learner-centred methods, audio-visual tools, experiential learning and practical exercises. Participants shared "real life" organisational experiences.

Training Areas
The three-day workshop covered 10 sessions. The specific training areas were:
Session 1: Introduction to Monitoring and Evaluation
Session 2: Planning, Monitoring and Evaluation (RBM)
Session 3: What You Need to Know about Your Programme
Session 4: Indicators
Session 5: Data Sources and Tools
Session 6: Stakeholder Analysis
Session 7: Developing Your Monitoring and Evaluation Action Plan
Session 8: Sharing Your Monitoring and Evaluation Findings

Expected Training Outcomes
At the end of this specialised training, participants will be able to:
Use monitoring and evaluation as a key tool to help make management decisions;
Use data to manage programmes and resources more effectively;
Develop a draft monitoring and evaluation action plan for their programmes;
Build on what they already know and do;
Enhance their understanding of monitoring and evaluation;
Develop the knowledge and skills to monitor and evaluate their programmes;
Expand their understanding of how monitoring and evaluation can be used to help make management decisions; and
Develop the skills and confidence to communicate programme results and lessons learned.

Opening Remarks

Capacity Building Officer, WACSI
The workshop commenced with a welcome message from the Capacity Building Officer of WACSI, Charles Kojo Vandyck. He welcomed the participants and began with a brief presentation on WACSI's mandate, emphasising that WACSI was created to reinforce the capacities of civil society and to respond to operational deficiencies within civil society. He added that WACSI conceptualised this specialised monitoring and evaluation course to strengthen the capacity of civil society practitioners to understand, design and undertake impact assessments of their projects and organisations. Mr Vandyck acknowledged the resourcefulness of the facilitator and urged participants to make the best of the workshop. He concluded that by the end of the three-day training, the course would provide the trainees with the requisite and relevant monitoring and evaluation competencies.

Country Manager, OSIWA Sierra Leone
The Country Manager of OSIWA Sierra Leone, Hannatu Kabbah, welcomed the participants and thanked them for responding to the invitation to attend the workshop. She also welcomed WACSI, informing the participants that this was the first time the Institute was organising a training initiative specifically for civil society organisations in Sierra Leone. She highlighted the importance of the monitoring and evaluation workshop for the Country Office and its partners. Madam Kabbah concluded by wishing the participants a fruitful learning experience and a successful workshop.

Participant Expectations of the Workshop
The workshop began with participants articulating their expectations. The expectations were:
To fully understand the monitoring and evaluation process;
To disseminate the knowledge acquired about monitoring and evaluation to other staff members;
To receive valuable knowledge about monitoring and evaluation from the experiences of other participants;
To attain basic monitoring and evaluation skills;
To understand the conceptual definitions and explanations of monitoring and evaluation;
To be equipped with monitoring and evaluation methodologies and tools;
To acquire knowledge about how to evaluate expected outcomes against actual outcomes;
To strengthen relationships with other civil society practitioners through networking and experience sharing;
To acquire the skills to set up monitoring and evaluation systems for projects and institutions;
To reinforce the ability to utilise the log frame for monitoring and evaluation purposes; and
To strengthen the ability to articulate and communicate results from impact assessment data gathering exercises.

2. SESSION ONE: INTRODUCTION TO MONITORING AND EVALUATION

After going through this session, participants will be able to:
Define monitoring and evaluation;
Understand the monitoring and evaluation logical framework; and
Explain some terminologies in monitoring and evaluation.

Monitoring is a continuous process of collecting and analysing information to compare how well a project, programme or policy is performing against expected results. To monitor is to look at what is being done. Monitoring deals mostly with two levels: inputs and outputs. Inputs are the people, training, equipment and resources that we put into a project in order to achieve outputs. Outputs are the activities or services we deliver.

Evaluation is an assessment of a planned, ongoing or completed intervention to determine its relevance, efficiency, effectiveness, impact and sustainability. The intent is to incorporate lessons learned into the decision-making process. To evaluate is to assess the value of something periodically; to evaluate is to examine what has been achieved. Evaluation deals with outcomes and impact. Outcomes are changes in behaviour and/or skills. Impacts are outcomes intended to lead to improvement in a specific indicator.

Monitoring | Evaluation
Continuous: day-to-day | Periodic: important milestones
Documents progress | In-depth analysis of achievements
Focuses on inputs and outputs | Focuses on outcomes and impacts
Alerts managers to problems | Provides managers with strategy and policy options
Self-assessment | External analysis


The Logical Framework
The logical framework (LogFRAME) is a planning and management tool used for systematic and logical thinking when:
Planning projects;
Monitoring projects; and
Evaluating projects.
It connects a project's means with its end.

The LogFRAME has the following functions:
It communicates a project's objectives clearly and simply;
It incorporates the views of all the stakeholders of a project;
It summarises the key features of a project design; and
It is an upfront planner that provides essential information.

The LogFRAME should be:
Concise: normally not longer than two sides of paper;
Easy to understand for those seeing it for the first time, with no acronyms;
Designed with the involvement of beneficiaries; and
A basis for monitoring and evaluation: it must be reviewed and amended regularly.

The benefits of a LogFRAME include:
It brings together in one place a statement of the key components of a project;
It presents them in a concise and coherent way, clarifying and showing the logic of how the project is expected to work;
It separates the various levels in the hierarchy of objectives, helping to avoid confusion of inputs and outputs;
It clarifies the relationships which underlie judgments about the likely efficiency and effectiveness of projects;
It identifies the main factors related to the success of the project;
It provides a basis for monitoring and evaluation by identifying indicators of success and means of quantification or assessment; and
It encourages a multidisciplinary approach to project preparation and supervision.

3. SESSION TWO: PLANNING, MONITORING AND EVALUATION (RBM)

After going through this session, participants will be able to:
Understand results-based management and results-based management evaluation;
Understand the steps involved in project planning;
Link projects with monitoring and evaluation;
Undertake planning for their projects; and
Explain the ten steps for building a monitoring and evaluation system.

Results-Based Management (RBM) focuses on the tangible results to be delivered. It also clarifies the clients and the mandate of an organisation, promotes benchmarking and performance analysis, and emphasises value for money.


Results-based monitoring and evaluation is an exercise to assess the performance of an institution, programme or project on the basis of the impacts and benefits that it is expected to produce.

Difference Between Results-Based Monitoring and Evaluation (RBME) and Traditional Monitoring and Evaluation
Traditional M&E measures and reports the status of results (a reactive tool);
RBME measures and reports results in order to produce results (a proactive tool); and
RBME is often seen as a dynamic tool for planning and budgeting, for improving substantive performance and achieving results.

Results-Based Management Principles
Ownership is key in formulating and implementing programmes and projects: who benefits from your programme or project?
Stakeholder engagement: it is vital to engage stakeholders and promote buy-in. Who are your stakeholders, and have they bought into your project?
Focus on results: planning, monitoring and evaluation should ensure the achievement of results.

Planning
Planning involves identifying the vision, goals or objectives to be achieved; formulating the strategies needed to achieve them; determining and allocating the resources (financial and other) required to achieve them; and outlining implementation arrangements for monitoring and evaluating progress towards achieving them.

Initial Design Stage
Assess the feasibility, scope and rationale of the project;
Determine the goal and objectives;
Outline the main project outputs and key activities;
Outline the project implementation process and structures;
Outline the M&E system; and
Develop the budget and specify staffing levels.

Start-up Phase
Develop an understanding of project goals and objectives with key stakeholders;
Review and revise the initial design;
Design and plan work in sufficient detail to allow for implementation; and
Develop a detailed operational M&E system.


Annual Review of the Work Plan and Budget
Check whether the outputs, objectives and goal remain relevant; if not, adjust them; and
Decide what activities and tasks are necessary to deliver the outputs.

Supervision (recurrent)
Discuss the overall progress of the project;
Decide on changes that should be made in the annual work plan; and
Assess any potential changes to the overall design that require loan agreement negotiations.

End of the Early Implementation Phase
Review the overall project strategy in light of early implementation experience;
Develop recommendations for the work plan in the next phase; and
Negotiate any significant changes to the project design for the next phase.

Mid-Term Review
Review the achievement of outputs and progress towards the purpose(s) and goal;
Assess the appropriateness of the overall strategy; and
Redesign the project as necessary.

Beginning of the Phase-Out Period
Identify the priorities of final activities in order to maximise impact; and
Review and adjust strategies with a view to sustained impact.

4. SESSION THREE: WHAT YOU NEED TO KNOW ABOUT YOUR PROGRAMME

After going through this session, participants will be able to:
Explain the importance of data use;
Describe how data are used for programme decisions;
Discuss various stakeholders' uses of monitoring and evaluation data; and
Develop monitoring and evaluation questions.

Using information involves the examination and use of routinely collected programme information by programme managers and stakeholders to make decisions about a programme for the purposes of judgment, improvement and knowledge development.

Stakeholders have different information needs. They view activities from different perspectives, have different degrees of understanding of the programme, need or want different information at different levels of complexity, and have different intensities of interest.


Information needs can also be expressed in the form of monitoring and evaluation questions, for example:
Did we deliver services?
Did we do it well?
Were sufficient numbers of staff recruited?
Did we implement the services as planned?
Are our staff members capable of providing effective services?

Monitoring and evaluation questions should relate to programme objectives and be specific and measurable.

5. SESSION FOUR: INDICATORS

After going through this session, participants will be able to:

Define an indicator and describe how to use it;
Describe how to select indicators;
Describe the types of indicators; and
Develop programme indicators.

An indicator is a quantitative or qualitative factor or variable that provides a simple and reliable means to measure achievement or to reflect the changes connected with a project or an intervention. An indicator is a specific measure of programme performance that is tracked over time by the monitoring system; indicators are compared over time in order to assess change. An indicator is something that helps you understand where you are, which way you are going and how far you are from where you want to be. A good indicator alerts you to a problem before it gets too bad and helps you recognise what needs to be done to fix it. Indicators help determine the success or failure of an intended goal and are normally measurable, often expressed as percentages or proportions. Indicators are useful for measuring changes or trends over a period of time.

Characteristics of Good Indicators
Indicators will vary from one project to another, according to the work and its context, but in general they are expected to be:
SMART (specific, measurable, attainable, relevant and time-bound); and
SPICED (subjective, participatory, interpreted, cross-checked, empowering and diverse).

Key Elements of a Good Indicator
Specific: an indicator must be related to the conditions that the programme or project wishes to change;
Measurable: an indicator must be quantifiable and allow for statistical analysis of the data;
Appropriate: an indicator must be necessary and relevant to the programme or project;
Realistic: an indicator must be attainable at a reasonable cost using appropriate collection methods; and
Time-based: an indicator must have a clearly stated time period for collection.

Indicators provide a reference point for programme or project planning, management and reporting. They assess trends, identify problems and can act as early warning signals for corrective action.

Types of Indicators
Input indicators: for example, the number of personnel required before establishing a programme, the skills each staff member is expected to have, or the number of pieces of equipment.
Process indicators: these track the activities in which the inputs are utilised, for example training, the percentage of people to be trained, the planning of service delivery, or how many items are needed.
Output indicators: these point to the direct and immediate results of inputs and processes, for example prevention measures or care and support services.
Outcome indicators: at this level, indicators take a relatively longer time to be realised than the first two levels.
Impact indicators: these are measured at the end of the programme or project and normally take longer to measure than the previous ones. One can establish whether or not the established objectives have been reached by using a set of indicators; to reach this conclusion one needs impact indicators.

Indicators are linked to the goals, objectives or achievements of a project. They are therefore constructed from statements which show what should happen at a given time, and can be quantitative or qualitative. To construct an indicator is thus to formulate a sentence or statement that shows the accomplishment.

How Indicators are Constructed
Goal: e.g. to have a society in which children can grow up free from the threat of disease
Objective: to reduce the rate of malaria infection among children
Target: to control the breeding of mosquitoes by 15% in 5 years
Indicator: percentage of children reporting to clinics with malaria
Objectives normally indicate the purpose, desire, intention or ultimate goal.


GROUP EXERCISE 1: LOGFRAME
Participants were divided into three groups and tasked to formulate a logical framework.

LogFRAME Exercise: To establish a Pilot Healthy Living Centre for Older People (65+ years)
The Centre is to be located in an established housing area, where 25% of the population is older people. There are few activities or social facilities available. A survey of the residents indicates an urgent felt need for services and advice; a high incidence of loneliness was also indicated.

Funding available:
$1,750,000 for capital and equipment
$550,000 per year for 3 years for running costs

The Centre will provide:
Health screening
Information and advice on: healthy eating, exercise and fitness, financial planning, traditional remedies
Volunteering and engagement opportunities
Clubs and classes

Group Results

Group 1
Intervention Logic | Verifiable Indicators | Means of Verification | Assumptions | Risks
Goal: A society in which older people (65+ years) are catered for and free from the threat of loneliness; to establish a Pilot Healthy Living Centre for older people (65+ years) | Reduction in the number of uncared-for older people in the community | National statistics on the aged; data from the social safety net (Ministry of Labour) | Older people will not be hindered by their children and/or wards from being part of the centre; Government willingness to support the initiative | Continuity of the scheme
Objectives: 1. To reduce the incidence of absence of services and advice for older people; 2. To establish a social safety thrift among the aged | Percentage increase of older people using the centre; percentage increase of older people registering for the thrift | Records kept by the centre | Older people buying into the idea of the centre | 
Targets: 1. To reduce the incidence of absence of services and advice for older people by 10% in 3 years; 2. To reduce dependency on families by 15% in 3 years | Percentage reduction of loneliness among the aged (65+ years); percentage increase in the use of services and advice from the centre | Perception survey; records from the centre | Governmental support | 

Group 2
Interventions | Verifiable Indicators | Means of Verification | Assumptions | Risks
Goal: To establish a pilot healthy living centre for older people | Reduction in the incidence of loneliness and health hazards by 15% | Data from the health centres | | 
Purpose/Objective: To increase opportunities for access to social and health facilities for the aged | Number of social and health facilities for the aged | Established and furnished structure for the aged | Contractors able to finish the project within the agreed time | Changes in the social and policy context
Expected Result/Target: To increase visitation and utilisation of the centre by 15% in 2 years | Number of aged persons accessing the centre's facilities | Centre register; project quarterly reports | Health and environmental conditions of the aged are stable | Getting the desired staffing capacity whose attitude and behaviour are acceptable to the aged


Group 3
Intervention Logic | Verifiable Indicators | Means of Verification | Assumptions | Risks
Goal: To improve the livelihood of the aged | Increase in the life expectancy of the aged from 65 to 80 years | Birth and death records | The aged will use the available facility | Youth will be disgruntled and target the centre; political instability
Objectives: To establish a well-equipped healthy living centre for the aged (65+) by May 2014 | Number of aged persons accessing the centre | Centre records | The services provided are free of cost | Sustainability of the funding
Expected Results: Increased healthy life span of the aged; improved social interaction among the aged | Reduced number of aged persons visiting the hospital | Hospital and centre records | Health workers will be willing to provide hospital records | 
Activities: 1. Construction of the centre; 2. Recruitment | | | | 

6. SESSION FIVE: DATA SOURCES AND TOOLS

Programme data may be collected through quantitative methods (numerical data), which include surveys, questionnaires and checklists; through qualitative methods (descriptive information), which include key informant interviews, focus groups, record reviews and observations; and through mixed methods (a mix of quantitative and qualitative methods). A wide variety of tools is available for collecting data, including formal surveys, structured or semi-structured interviews, group discussions, direct observation and case studies. Each method brings its own advantages and drawbacks. The choice of method depends on the nature and scale of the project, the type of information required, and the frequency, ease and cost of collection. Data that can be collected or measured easily by field workers (e.g. levels of beneficiary participation in meetings, or the number of rainwater harvesting structures completed) can be put into monthly or quarterly reports. Data requiring more systematic or time-consuming collection are gathered less frequently, perhaps annually.


7. SESSION SIX: STAKEHOLDER ANALYSIS

After going through this session, participants will be able to:
Construct a stakeholder matrix;
Meet the information requirements of their stakeholders;
Understand the criteria for stakeholder participation in their programme; and
Prioritise their stakeholders.

Stakeholder analysis, in the context of monitoring and evaluation, helps you define whom to try to involve in designing the monitoring and evaluation system and in which way, and it allows you to find out whose information needs must be considered. It can also be used to develop an appropriate sample for data collection. It can help you identify which stakeholders to involve in (re-)designing a project and its monitoring and evaluation system, and to assess their interests and how these relate to the project and to monitoring and evaluation. It can assist you in making an appropriate selection of the stakeholders most central to the task or issue at hand, and it can help provide a foundation and strategy for participation throughout the project, thereby making it easier for stakeholders to learn from each other.

The main purpose of the stakeholder analysis is to agree on the criteria for assessing the stakeholders. In stakeholder analysis for monitoring and evaluation, your main purpose is "to make sure we are including all key players in developing the project's monitoring and evaluation system". Then list which criteria you will use to prioritise whom to involve. The types of criteria for selecting stakeholders could be: "supposed to be benefiting from the project", "critical role in ensuring success", "legally required to participate", or "have specific knowledge of monitoring and evaluation processes", among others.

GROUP EXERCISE 2: PREPARING A STAKEHOLDER MATRIX
Participants were tasked to formulate a stakeholder matrix.

Group 1

Stakeholder | Stake in Project | Potential Impact on Project | What the Project Expects the Stakeholder to Provide | Perceived Attitude and/or Risks | Stakeholder Management Strategy | Responsibility
Government (MOH, Social Welfare) | Provides policy direction | Government can either stall or lend credence to the project | Support | Change in policy; withdrawal of support for the project | Consultation and meetings | MOH / OACODA
Old Age Community Development Organisation (OACODA) | Created the basis for the current intervention | Are the primary beneficiaries of the project | Initial beneficiaries | Refusing to access the services provided | Meetings and follow-up visits | Executive members of OACODA
Funder | Implementation of mandate | Enhance project implementation | Funds | Late disbursement of funds; termination of funding | Meetings/consultation | Steering committee of OACODA
Health Practitioners | Provision of services | Provide quality health care services to beneficiaries | Services | Strike action; postings/transfers; absence of medical supplies | Meetings and consultations | Sierra Leone Medical and Dental Health Association
Youth | Reduction of the burden of caring for old people | Enhance project implementation | Provision of labour | | | 

Group 2
Stakeholder | Stake in the Project | Potential Impact on the Project | What the Project Expects the Stakeholder to Provide | Perceived Attitudes and/or Risks | Stakeholder Management Strategy | Responsibility
Ministry of Social Welfare and Gender, and Ministry of Health and Sanitation | Policy makers and state machinery | Provide policy guidelines and determine the policy framework | Direct collaboration and technical support | Lack of political will | Involvement in planning | Focal persons in the Ministry
Inmates of King George the 2nd Home | Direct beneficiaries | Determine the relevance of the Centre | Availability and acceptance of the project | Traditions and customs preventing the use of such facilities | Early sensitisation | Chair for the Aged and Matron
OSIWA | Funder / financial contributor | | Timely disbursement and monitoring of funds and activities | Delay in disbursement | Constant communication | Country Coordinator
Leonard Cheshire | Implementing partner | | Technical support; keeping to deadlines; documenting information and communicating progress so far | Lack of capacity to implement programmes in a timely manner | Timely reporting | 
Contractor | Construction of the state-of-the-art Centre | Timely completion | Keeping to deadlines and communicating information | Use of sub-standard materials | Regular monitoring | 

Group 3
Stakeholder | Stake in the Project | Potential Impact on Project | What the Project Expects the Stakeholder to Provide | Perceived Attitude/Risks | Stakeholder Management Strategy | Responsibility
Government | High | Contributes to the sustainability of the project | Policy guidelines and technical expertise | Government hijacking the project's role | Lobbying | Focal point at government ministry
Aged | High | Their involvement leads to the success of the project | First-hand information on their status | Buy-in affected by customs | Awareness raising | Project coordinator
Funder | High | Successful implementation of the project | Capital and equipment | Delay in remitting funds | Constant engagement | Country representative
Community | High | Providing necessary support | Cooperation | Willingness to partake | Awareness raising | Project coordinator and community leaders

8. SESSION SEVEN: DEVELOPING YOUR MONITORING AND EVALUATION ACTION PLAN

After going through this session, participants will be able to:
Assign staff members to work on various monitoring and evaluation tasks;
Draft a budget for their monitoring and evaluation activities; and
Draft a timeline for their monitoring and evaluation activities.

A monitoring and evaluation action plan is a written document that explains how one will implement the different monitoring and evaluation activities within a programme and the resources required to do so. The monitoring and evaluation plan should include:
A description of the programme;
The purpose and objectives of the monitoring and evaluation activities;
The monitoring and evaluation questions that will be used;
A description of what data will be collected;
The methods for collecting, analysing, managing and disseminating data;
The instruments for gathering data;
A description of the data flow;
Descriptions of who will implement various aspects of the plan;
The resources needed to implement the plan; and
A timeline for completing the monitoring and evaluation activities.

Monitoring and evaluation activities may be carried out by programme staff members, organisational administrative staff members, other stakeholders and external experts or consultants.

Recruiting an External Consultant
The following considerations must be made when working with an external consultant:
Their role;
Their monitoring and evaluation training;
Their previous monitoring and evaluation experience and prior experience working with similar programmes;
Their professional approach;
Whether their personal style is a good match for your organisation's needs and culture; and
The terms of agreement.


Select a consultant who knows the topic, is culturally competent, and can communicate clearly with different stakeholders (e.g. project managers, front-line staff, community members). There is a need to clarify the roles and responsibilities of the external consultant, and to negotiate and establish a contract and work plan with deliverables, timeline, fees and publishing rights. In addition, one must participate in developing the evaluation plans, final reports and dissemination plans, and meet regularly to monitor progress. Use the consultant to build internal capacity.

There is also a need to assign responsibilities, including:
Basic monitoring and evaluation tasks;
Developing the logic model;
Developing monitoring and evaluation questions;
Selecting or developing indicators;
Developing data collection instruments;
Collecting data;
Entering data into an electronic format;
Cleaning and checking the quality of data;
Analysing data;
Reviewing and using findings internally; and
Developing communication materials for external audiences.

9. SESSION EIGHT: SHARING YOUR MONITORING AND EVALUATION FINDINGS

After going through this session, participants will be able to:
Select methods for sharing monitoring and evaluation findings;
Identify the stakeholders with whom they will share this information;
Determine when they will share monitoring and evaluation findings; and
Describe various visual aids that are used to display information.

The uses of monitoring and evaluation findings:
Guide service delivery;
Manage and improve your programme;
Make decisions for the future;
Inform capacity building;
Gain resources;
Be accountable to stakeholders;
Identify best practices;
Report to policy makers; and
Communicate successes and challenges.


Communicating information within your programme can:
Help you and your staff members understand how and why your programme is working;
Highlight programme strengths and accomplishments;
Improve programme planning;
Identify gaps in programme implementation;
Identify future programme needs; and
Help you make decisions about the best use of resources.

Communicating information externally can:
Help stakeholders and the community understand what the programme is doing;
Help ensure social, financial and political support;
Help your programme establish or strengthen its network of individuals and organisations with similar goals;
Help raise awareness of your programme among the public, policymakers and donors;
Strengthen funding proposals: regular documentation and dissemination of results and lessons learned from M&E can be impressive to donors and can serve as a basis for increasing or sustaining support for programmes;
Allow others to learn from your experience; and
Contribute to a body of lessons learned and best practices that can strengthen all programmes.

Monitoring and evaluation results should be disseminated and used throughout the year, not just at the end. To the extent possible, dissemination should be linked to donor reporting and budget cycles; appropriate timing can increase the attention that is given to the data.

In conclusion, information can be shared through several channels:
Reports;
Presentations;
Press conferences;
Memoranda;
Success stories;
Theatre productions for non-literate audiences;
Radio;
Posters;
Fact sheets;
Brochures;
Marketing launches;
Programme reviews; and
Workshops.


Closing Remarks

The training workshop was brought to a close at the end of the final presentation. Hannatu Kabbah, Country Coordinator of OSIWA Sierra Leone, acknowledged the importance of the workshop and thanked WACSI for a well-organised, comprehensive and relevant workshop.

The Capacity Building Officer of WACSI, Charles Kojo Vandyck, thanked the facilitators for an insightful and educative workshop. Mr Vandyck expressed his appreciation to the participants for their level of participation and contributions.

Participants were presented with certificates and learning materials at the end of the workshop.


APPENDIX 1

PROGRAMME AGENDA
May 24 – 26, 2011, Freetown, Sierra Leone

Venue: The Kona Lodge, the Maze, Wilberforce

DAY 1
TIME | ACTIVITY | OFFICER RESPONSIBLE
8:30 – 9:30 | Arrival & Introduction of Resource Persons and Participants | OSIWA, WACSI
9:30 – 10:30 | Session 1: Introduction to M&E (Overview and Some Definitions) | GAB
10:30 – 10:45 | TEA BREAK | 
10:45 – 12:30 | Session 2: Planning, Monitoring and Evaluation (RBM) | GAB
12:30 – 1:30 | LUNCH BREAK | 
1:30 – 3:00 | Session 3: What You Need to Know about Your Programme | GAB
3:00 – 3:15 | TEA BREAK | 
3:15 – 4:30 | Session 3: What You Need to Know about Your Programme (Contd.) | GAB
END OF DAY 1


DAY 2
TIME | ACTIVITY | OFFICER RESPONSIBLE
9:00 – 10:30 | Session 4: Indicators | GAB
10:30 – 10:45 | TEA BREAK | 
10:45 – 12:30 | Session 5: Data Sources and Tools | GAB
12:30 – 1:30 | LUNCH BREAK | 
1:30 – 3:00 | Session 6: Stakeholder Analysis | GAB
3:00 – 3:15 | TEA BREAK | 
3:15 – 4:30 | Session 6: Stakeholder Analysis (Contd.) | GAB
END OF DAY 2


DAY 3
TIME | ACTIVITY | OFFICER RESPONSIBLE
9:00 – 10:30 | Session 7: Data Analysis and Preparation of Reports (Report Types) | GAB
10:30 – 10:45 | TEA BREAK | 
10:45 – 12:30 | Session 8: Developing Your M&E Action Plan | GAB
12:30 – 1:30 | LUNCH BREAK | 
1:30 – 3:00 | Session 8: Developing Your M&E Action Plan (Contd.) | GAB
3:00 – 3:15 | TEA BREAK | 
3:15 – 4:30 | Session 9: Sharing Your M&E Findings | GAB
4:30 – 5:00 | Closing Ceremony | WACSI, OSIWA
END OF DAY 3


APPENDIX 2

LIST OF PARTICIPANTS

No. | NAME OF PARTICIPANT | ORGANISATION | TEL No. | EMAIL ADDRESS
1 | MARCELLA MACAULEY | CAMPAIGN FOR GOOD GOVERNANCE (CGG) | +232 33 312117 | [email protected]
2 | SYLVESTER AMARA | CENTRE FOR ACCOUNTABILITY AND RULE OF LAW (CARL) | +232 76 610429 | [email protected]
3 | EDWARD KOROMA | TRANSPARENCY INTERNATIONAL - SIERRA LEONE CHAPTER | +232 33 445884 | [email protected]
4 | SHEKU JAMES | NETWORK MOVEMENT FOR JUSTICE AND DEVELOPMENT (NMJD) | +232 76640301 | [email protected]
5 | JOHN SAHR TAYLOR | MOVEMENT FOR THE RESTORATION OF DEMOCRACY (MRD) | +232 78 914 952 | [email protected]
6 | JOSEPH GAMBAI | PARTNERSHIP ACTION FOR GRASSROOT DEV. (PAGE SL) | +232 76 454550 | [email protected]
7 | EDWARD B. KANU | CENTRE FOR DEMOCRACY AND HUMAN RIGHTS (CDHR) | +232 76449470 | [email protected] / [email protected]
8 | CHRISTIANA DAVIES COLE | LAWYERS | +232 33 399968 | [email protected]
9 | RANDOLPH KATTA | 50/50 | +232 76 694917 | [email protected]
10 | ZAINAB JOAQUE | CORNET | +232 76 515808 / 33579079 | [email protected]
11 | ABU INSTON MOROVIA | COALITION SECRETARIAT - OSIWA | +232 78217233 | [email protected]
12 | ALIE KARGBO | COALITION SECRETARIAT - OSIWA | | [email protected]
13 | EDWARD MASSAQUOI | COALITION SECRETARIAT - OSIWA | +232 77835789 | [email protected]
14 | ABU BAKARR KAMARA | COALITION SECRETARIAT - OSIWA | +232 33354955 | [email protected]
15 | ABDUL SAMURA | CENTRE FOR COORDINATION OF YOUTH ACTIVITIES | +232 76201287 | [email protected]
16 | JONATHAN PEARCE | DISTRICT BUDGET OVERSIGHT COMMITTEE | +232 77662716 | [email protected]
17 | AUGUSTINE KAMBO | EDUCATION FOR ALL SIERRA LEONE (EFA-SL) | +232 786844445 | [email protected]
18 | BERYL SARTIE | TIMAP FOR JUSTICE | | [email protected]
19 | DAVID KOROMA | RACAP | +232 76 575058 | [email protected]
20 | IDRISSA COLE | ENGAGE YOUTH IN EMPOWERMENT (EYE SIERRA LEONE) | +232 77 347885 | [email protected]
21 | HANNATU KABBAH | OSIWA-SIERRA LEONE | +23276317531 | [email protected]
22 | GILBERT ATTA BOAKYE | CICADA (RESOURCE PERSON) | +233264006525 | [email protected]
23 | CHARLES KOJO VANDYCK | WACSI | +233264128605 | [email protected]
24 | BETHEL KWAME BOATENG | WACSI | +233244863674 | [email protected]