

Expected Outputs & Lessons Learned

Léa Hakim & Neda Jafar
Statistics Coordination Unit, ESCWA

Workshop on Building Country Capacity to Maintain a Central Repository of Data

Amman, 13-15 March 2007

Expected Outputs

By the end of today’s session:

1. Preliminary Report

2. Action plan for clean-up


Preliminary Report

Consolidation of working session outputs as per templates shared:

• Status:
  – Database Management
  – DevInfo as a Central Repository of Data (CRD)
  – Data coverage
  – Linking producers and users of data
  – Dissemination

Preliminary Report

• Structure and Content Review Summary:

1. DevInfo summary report (Excel sheets)

2. MDG indicator I-U-S analysis (template provided)

1. DevInfo summary report: Categories (Current – Proposed)

• Indicator

• Unit

• Subgroup

• I-U-S

• Time period

• Area

• Sector

• Goal

• Framework

• Institution

• Theme

• Convention

• Source

Specify the indicator classification(s) employed; a minimal record sketch follows.
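To make the category list concrete, here is a minimal sketch of one data record built from the categories above. The field names are illustrative assumptions, not DevInfo's actual schema.

```python
# A minimal, illustrative record for the categories above.
# Field names are assumptions, not DevInfo's actual schema.
from dataclasses import dataclass

@dataclass(frozen=True)
class IUSRecord:
    indicator: str    # e.g. "Under-five mortality rate"
    unit: str         # e.g. "Per 1,000 live births"
    subgroup: str     # e.g. "Total", "Male", "Female"
    time_period: str  # e.g. "2003"
    area: str         # area name or sub-national area ID
    source: str       # standard naming, e.g. "PAL_MOH_DHS_2003"
    value: float

def ius_key(rec: IUSRecord) -> tuple:
    # The Indicator-Unit-Subgroup (I-U-S) triple identifies the series;
    # time period and area then identify the individual data point.
    return (rec.indicator, rec.unit, rec.subgroup)
```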

1. DevInfo summary report: Review of subgroups

Source: Report on DevInfo 2004 Technical Support for UNDG Project “Building Capacity and Statistical Literacy for MDG Monitoring”, Community Systems Foundation, p. 12

1. DevInfo summary report: Review of sources

Source: Report on DevInfo 2004 Technical Support for UNDG Project “Building Capacity and Statistical Literacy for MDG Monitoring”, Community Systems Foundation, p. 9

2. MDG Monitoring Indicator I-U-S Analysis

Source: Report on DevInfo 2004 Technical Support for UNDG Project “Building Capacity and Statistical Literacy for MDG Monitoring”, Community Systems Foundation, p. 19

2. MDG Monitoring Indicator I-U-S Analysis

• Complete the template when reviewing data content

• Customize area categories

• Include a review of whether changes were implemented for the final report [column “Action Taken (Y, N)” in the template shared]; a sketch of this check follows the note below

Note: Electronic I-U-S Analysis template distributed.
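A hedged sketch of how the “Action Taken (Y, N)” review could be tracked, assuming the template is exported as a CSV with an “Indicator” column. The column names come from the slide above, not from the distributed template itself.

```python
# Sketch: list template rows whose "Action Taken (Y, N)" column is not
# yet "Y", so the final report only claims changes actually implemented.
# File layout and column names are assumptions; match them to the
# distributed I-U-S analysis template.
import csv

def pending_actions(template_path: str) -> list[str]:
    pending = []
    with open(template_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            action = (row.get("Action Taken (Y, N)") or "").strip().upper()
            if action != "Y":
                pending.append(row.get("Indicator") or "<unnamed>")
    return pending
```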

Preliminary Report

• Conclusions for next steps
  – Outline decisions taken for next steps during this training, e.g.:
    • Focus on MDG indicators only for the first launch
    • Establishing area codes at the sub-national level
    • Adding proxies/national development indicators

Final Report Outline

I. Introduction
II. Objectives
III. Status of CRD
IV. Structure and Content Review
V. Conclusions
VI. Challenges in clean-up
VII. Lessons learned
VIII. Evaluation report (M&E UNICEF)
Annexes

Include Action Plan Timetable

Preliminary Report

Note: Electronic report template distributed.

Expected Outputs

1. Preliminary Report

2. Action plan for clean-up

Beyond CRD Training

1. Complete database (DB) Review

2. Complete DevInfo DB cleanup

3. UNICEF M&E officer evaluation

4. Submission of final report

5. Use and dissemination of DB

6. Country Roll-out

2. Action plan

Template columns: Main Task | Task Components | Person(s) responsible | March | April

Main tasks:

I. Full conversion to DevInfo v5.0
II. Completion of DB review
III. Implementation of cleanup
IV. Evaluation
V. Submission of report
VI. Launch of national DevInfo DB

Note: Detailed electronic action plan template distributed.

Lessons Learned

Strategy and Planning

• Exchange of experiences
• Should invest in preparation prior to release:
  – Setup (complete DevInfo team with clear functions & responsibilities)
  – Necessity of a Technical/Steering Committee
  – Establish feedback between the DevInfo team, statistics producers, and UN agencies
  – Clear strategy and plan of action
  – Emphasis on quality
  – Prioritize the MDG DB
  – Raise awareness of decision-makers to make use of available statistics

Database Management

• Start with a planning process
• Start small, think BIG!
• Importance of establishing:
  – DevInfo team
  – Committees: Steering and Technical
  – Formal agreements with line ministries
• Involve users from the beginning
• DB management requires experts for assessing data and metadata

DevInfo as CRD

• DevInfo is user-friendly as a program and interface
• Web-enabling option
• Institutional link between DBs and DevInfo
• Customized national DevInfo
• Use of GIS
• Mapping facility
• Data organized by themes for targeted policy-making
• Store other activities’ data such as surveys & censuses
• Ready-tailored metadata and indicators of the MDG framework
• Availability of international sources of data

Data coverage

• Sex-disaggregated data
• Small-area disaggregated data
• Time series
• Different sources
• Strengths and weaknesses of the DB
• Establish processes for data collection
• Review the list of indicators in the national database relative to the global MDG list (see the sketch below)
• Addition of country-specific development indicators
• Review of metadata on indicators and sources
• Review of geographical areas/maps
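The indicator-list review above amounts to a set comparison. A sketch under assumed inputs; the indicator names in the example are hypothetical.

```python
# Sketch: compare national DB indicators against the global MDG list,
# reporting MDG gaps and country-specific additions.
def coverage_review(national: set[str], global_mdg: set[str]) -> None:
    for name in sorted(global_mdg - national):
        print("Missing MDG indicator:", name)
    for name in sorted(national - global_mdg):
        print("Country-specific indicator:", name)

# Toy example (hypothetical indicator names):
coverage_review(
    national={"Under-five mortality rate", "Poverty gap ratio (national)"},
    global_mdg={"Under-five mortality rate", "Maternal mortality ratio"},
)
```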

Producer – User dialogue

• Establish linkages with line ministries

• Regular forums

• Unification of standards and methods

• Unification of classifications and definitions

• Data-entry by line ministries through web

• Train users to understand statistics, indicators, and analysis of data

Dissemination

• Press releases
• Web-enabling
• Dissemination through workshops
• Addition of DevInfo links to NSO sites (including training material)
• Provision of a DevInfo CD with the national MDGR
• Knowledge transfer
• Roll-out advocacy campaign

Structure & Content

• Spell Check• Thematic databases• Always refer to global DB structures - Import MDG indicators and goals

from global DB – do not type.• Planning of DB content is essential• Continuous check of data quality• Metadata• Follow standard naming of source• Document all changes made• Assessment of data availability has to be done BEFORE creating a

template • Source of data should specify the original producer ex. MOH for health

indicators• Naming of source should follow “PAL_MOH_DHS_2003”• Set up different DBs for big amounts of info ex.census & surveys• Remove indicators with no data values to avoid over reporting
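A sketch of a check for the source-naming convention above, assuming the pattern is COUNTRY_PRODUCER_SURVEY_YEAR as in “PAL_MOH_DHS_2003”; adjust the pattern if the agreed convention differs.

```python
# Sketch: flag source names that do not follow the assumed
# COUNTRY_PRODUCER_SURVEY_YEAR pattern, e.g. "PAL_MOH_DHS_2003".
import re

SOURCE_PATTERN = re.compile(r"^[A-Z]{3}_[A-Z]+_[A-Z]+_\d{4}$")

def nonconforming_sources(sources: list[str]) -> list[str]:
    return [s for s in sources if not SOURCE_PATTERN.match(s)]

print(nonconforming_sources(["PAL_MOH_DHS_2003", "Ministry of Health 2003"]))
# -> ['Ministry of Health 2003']
```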

Mapping

• Continent and country level: apply ISO coding

• Ensure each area ID is connected to area name and vice versa

• Start working towards a target for infrastructure maps

• The area ID is the key to the mapping
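The ID-name check above amounts to verifying a one-to-one mapping. A sketch under an assumed input format (pairs of area ID and area name, e.g. exported from the DB):

```python
# Sketch: verify each area ID maps to exactly one area name and
# each area name maps to exactly one area ID.
from collections import defaultdict

def check_area_mapping(pairs: list[tuple[str, str]]) -> None:
    names_by_id, ids_by_name = defaultdict(set), defaultdict(set)
    for area_id, name in pairs:
        names_by_id[area_id].add(name)
        ids_by_name[name].add(area_id)
    for area_id, names in names_by_id.items():
        if len(names) > 1:
            print(f"Area ID {area_id} has multiple names: {sorted(names)}")
    for name, ids in ids_by_name.items():
        if len(ids) > 1:
            print(f"Area name '{name}' has multiple IDs: {sorted(ids)}")
```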

Data Quality

• During data entry it is preferable to insert values, not formulas

• Review data for consistency – do not rely on automated checks alone

• Do not include indicators with missing values

• Avoid duplication

• Ensure completeness of data values and time series
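Two of these checks, duplication and time-series completeness, can be scripted to support (not replace) the manual review. A sketch, assuming records are tuples keyed by (indicator, unit, subgroup, time period, area) as in the record sketch earlier:

```python
# Sketch: detect duplicate I-U-S/period/area keys and gaps in a
# time series. The record layout is an assumption.
from collections import Counter

def find_duplicates(records: list[tuple]) -> list[tuple]:
    counts = Counter(rec[:5] for rec in records)  # key fields only
    return [key for key, n in counts.items() if n > 1]

def missing_years(years_present: set[int], start: int, end: int) -> list[int]:
    return [y for y in range(start, end + 1) if y not in years_present]

print(missing_years({2000, 2001, 2004}, 2000, 2004))  # -> [2002, 2003]
```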

Thank you
