Evaluating Communication Programmes, Products and Campaigns: 1-day training workshop for communication professionals. Glenn O'Neil, [email protected], www.owlre.com. Workshop originally conducted for Gellis Communications (www.gellis.com) in Brussels on 30 October 2009.
A one-day workshop on evaluating communication programmes, products and campaigns. The main steps and methods are covered, with real-life examples.
Schedule
1. Introduction & definitions
2. Five steps of evaluation
3. Campaign evaluation methodology
4. Programme evaluation methodology
5. Product evaluation methodology
6. Reporting on communications evaluation
Training objective
Communication professionals understand the key concepts of communications evaluation and thus increase their effectiveness in managing the evaluation aspects of their projects.
What is evaluation?
"Evaluation is the systematic assessment of the operation and/or the outcomes of a program or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the program or policy."
Source: Weiss, C. H. (1998). Evaluation.
"A form of research that determines the relative effectiveness of a public relations campaign or program by measuring program outcomes (changes in the levels of awareness, understanding, attitudes, opinions, and/or behaviours of a targeted audience or public) against a predetermined set of objectives that initially established the level or degree of change desired."
Source: Stacks, D. (2006). Dictionary of Public Relations Measurement and Research. Institute for Public Relations.
What is communications?
Programmes, projects, campaigns and activities that are dedicated to the management of communications between an organisation and its publics.
Source: Grunig, J. (ed.) (1992). Excellence in Public Relations and Communication Management.
– A programme is an organised set of communication activities based on target audiences, themes or functions, running continuously or for long periods.
– A campaign is an organised set of communication activities directed at a particular audience, usually within a specified period of time, to achieve specific outcomes.
– A product is an individual object, such as a publication, website or video, created to support a communication activity.
[Chart: calendar of media releases. Legend: 1 square = 1 media release or update issued on that day; 2 squares = 2 media releases or updates issued on that day.]
Media monitoring
– Media monitoring measures the visibility of an issue or organisation in the media.
– Most monitoring counts mentions of keywords in a pre-selected group of media using automated software.
– Media monitoring can be an indication of levels of awareness amongst publics - but it is not a replacement for measuring awareness directly!
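As a rough illustration of the counting step described above, here is a minimal Python sketch that counts whole-word keyword mentions across a small set of invented article texts (the articles and keywords are hypothetical; real monitoring tools work over large media databases):

```python
# Minimal sketch: count keyword mentions across a pre-selected set of
# media articles (hypothetical data, for illustration only).
import re
from collections import Counter

articles = [
    "The climate campaign launched today drew wide media attention.",
    "Experts debated the climate policy; the campaign was mentioned twice.",
    "Sports results dominated coverage on Sunday.",
]
keywords = ["campaign", "climate"]

mentions = Counter()
for text in articles:
    for kw in keywords:
        # Whole-word, case-insensitive matching
        mentions[kw] += len(re.findall(rf"\b{re.escape(kw)}\b", text, re.IGNORECASE))

print(dict(mentions))  # {'campaign': 2, 'climate': 2}
```

Counting mentions this way gives volume only; as the slide notes, it says nothing by itself about actual audience awareness.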
Media monitoring - examples
[Chart: % of coverage by media outlet.]
[Chart: number of articles per month (0-2,500), November through the following January, comparing the 2008/9 campaign with the 2005/6 campaign.]
Web metrics
– Web metrics are data collected by automated software on visits and other actions on websites.
– This can be done for an organisation's own website or for a whole sector.
– Web metrics can measure different variables including interests, preferences, interaction and online behaviours.
Web metrics - example

Language   Articles (%)   Visitors (%)
English    90             69
Chinese    6              15
Russian    3              4
French     0.5            3
German     --             2
Spanish    --             2
Other      --             5

Combination of content analysis (language of content) with web metrics (language of visitors, per computer settings) for an online portal.
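The gap between what a portal publishes and what its visitors want can be made explicit from figures like those in the table. A minimal sketch using those percentages (languages absent from the content column are treated as 0):

```python
# Sketch: compare share of content per language with share of visitors,
# using the figures from the portal example above.
content_pct = {"English": 90, "Chinese": 6, "Russian": 3, "French": 0.5}
visitors_pct = {"English": 69, "Chinese": 15, "Russian": 4, "French": 3,
                "German": 2, "Spanish": 2, "Other": 5}

# Positive gap = more visitor demand than content supplied in that language
gaps = {lang: visitors_pct[lang] - content_pct.get(lang, 0)
        for lang in visitors_pct}

for lang, gap in gaps.items():
    print(f"{lang}: {gap:+.1f} percentage points")
```

Here, for instance, Chinese content is under-supplied relative to visitor demand, while English content is over-supplied.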
Tracking mechanisms
– Tracking mechanisms record actions taken on issues, policies, legislation, etc.
– Actions are usually tracked manually on standard forms in a systematic manner, for example:
  - recording how many partners join a campaign
  - tracking and recording the number of business leaders who speak out on an issue
Tracking mechanisms - example
[Chart: number of partners joining, by campaign year.]
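A tracking mechanism of the kind described above can be as simple as a standard form that is appended to systematically. A hypothetical sketch using Python's csv module (the partner names, actions and dates are invented):

```python
# Sketch: a simple tracking mechanism recording, on a standard form,
# each partner action taken during a campaign (hypothetical entries).
import csv
import io
from datetime import date

FIELDS = ["date", "partner", "action"]
log = io.StringIO()
writer = csv.DictWriter(log, fieldnames=FIELDS)
writer.writeheader()

def record(writer, partner, action, when):
    """Append one tracked action to the form."""
    writer.writerow({"date": when.isoformat(),
                     "partner": partner, "action": action})

record(writer, "City Council", "joined campaign", date(2009, 3, 2))
record(writer, "Local NGO Forum", "spoke out on issue", date(2009, 4, 15))

rows = list(csv.DictReader(io.StringIO(log.getvalue())))
print(len(rows), "actions tracked")
```

In practice the "form" would be a shared file or spreadsheet, but the point is the same: every action is recorded in an identical, countable format.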
Network mapping
– Network mapping measures the relations and flows between people, ideas and organisations.
– Network mapping is useful in measuring the growth of networks and the interconnectivity between publics and issues.
Network mapping - examples
[Diagram: networks of conference participants, before and after the conference.]
[Diagram: network map of the directories of an online portal, combining web metrics (number of visits per directory), content analysis (data up to date or not) and user survey data (visits to which directories). Legend: directories marked as "no content", "mostly out-of-date" or "mostly up-to-date"; the size of a square indicates the number of visits; connecting lines indicate users have visited both directories; the thickness of a connecting line indicates the number of users that have visited both directories.]
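One simple way to quantify the "before/after" growth shown in a network map is network density: the share of possible connections that actually exist. A minimal sketch with invented participants and contacts (real network mapping would draw on survey or interaction data):

```python
# Sketch: measure growth in interconnectivity between conference
# participants before and after an event (hypothetical contact data).
from itertools import combinations

def density(people, contacts):
    """Share of possible pairs that are actually connected (0..1)."""
    possible = len(list(combinations(people, 2)))
    return len(contacts) / possible if possible else 0.0

people = ["Ana", "Ben", "Chen", "Dia", "Eli"]
before = {("Ana", "Ben"), ("Chen", "Dia")}
after = before | {("Ana", "Chen"), ("Ben", "Dia"), ("Dia", "Eli"), ("Ana", "Eli")}

print(f"Density before: {density(people, before):.2f}")  # 2/10 = 0.20
print(f"Density after:  {density(people, after):.2f}")   # 6/10 = 0.60
```

A rise in density after an event is one piece of evidence that the activity strengthened connections between participants.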
Criteria for evaluation
Evaluation questions often include:
• Is the product considered to be of high quality in terms of design, usability and content?
• Is the product targeted to the right audiences?
• Is the product available, accessible and distributed to the intended target audiences?
• Is the product used in the manner for which it was intended - and for what other unintended purposes?
• What has the product contributed to broader communication and organisational goals?
• What lessons can be learnt for improving future editions of the product, and design, distribution and promotion in general?
Evaluation methods
The evaluation methods have to be adapted to the type of product and can include:
- Surveys
- Interviews
- Focus groups
- Case studies
- Observation studies
- Expert reviews
- Content analysis
- Web metrics
- Tracking mechanisms
- Distribution statistics
Distribution statistics - example
[Pie chart: distribution channels for a publication - promotional distribution 40%, national offices 30%, web orders 25%, fax orders 5% - reaching local partners, NGOs, media and students/teachers.]
Mapping use - example
[Map of uses of a product across four categories - training, resource, working tool and policy support - including: used for training of national partners; used for staff training; develop teaching materials; create presentations for clients; charts & tables used in production; used as guidelines for product design; used by NGOs to influence debates on regulations; used by authorities to revise guidelines.]
Mini case study - example
Capacity building for women in Uzbekistan, Central Asia: Nargiz, portal member
In Uzbekistan, Nargiz, a portal member, is part of a group of 50 women who were preparing to run in parliamentary elections. For her, the portal has been a valuable source of support and information.
"In the e-discussions I got important feedback on fundraising strategies and financing of campaigns. This information will be used!"
Nargiz especially mentions an interesting experience from Mauritius, shared on the portal, which she can apply in her daily work at the Women's NGO Forum of her country.
"The material on capacity building is also very useful for us. I hope that in the future, we can share more of our own resources with the network."
Devising precise questions
• In all communications evaluations, if an evaluation framework exists, it should be relatively easy to move from indicators to questions or criteria for collecting data.
• But these questions and criteria must be created, documented and shared with the persons undertaking the evaluation.
• Questions and criteria would normally be documented in
A good evaluation report is: impartial; credible; balanced; clear and easy to understand; information rich; action oriented and crisp; focused on evidence that supports conclusions.
A weak evaluation report is: repetitious; too long; unclear and unreadable; insufficiently action oriented; lacking hard data and relying on opinion; poorly structured and lacking focus on key findings; lacking comprehension of the local context; negative or vague in its findings.
Summary sheet: an example [slide image]
Scorecard: an example [slide image]
Findings table: an example
Summary of the review's key findings

Outputs
- Eight directories of the CR established and accessible to potential users from the disaster management community worldwide. (Largely achieved)
- Eight directories of the CR stocked with relevant, appropriate and up-to-date information on disaster management capacities. (Only partially achieved)

Outcomes
- Potential users from the disaster management community worldwide learned of the CR. (Very limited achievement)
- Potential users from the disaster management community worldwide visited the CR and registered. (Very limited achievement)
- Users obtained information of use to them in one or more of the eight directories of the CR. (Only partially achieved)
- Users contributed information from their organisations to one or more of the eight directories of the CR. (Very limited achievement)
- Information found on the CR facilitated the rapid identification of appropriate disaster management services. (Very limited achievement)
- Information found on the CR contributed to the rapid delivery of humanitarian emergency assistance. (Not achieved)

Impact
- Delivery of humanitarian emergency assistance improved. (Not measured in this review)
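When reporting, the ratings in a findings table can be tallied to give a one-glance summary of overall achievement. A small sketch using the nine ratings listed in the table above:

```python
# Sketch: tally the ratings from the findings table above to summarise
# how the reviewed results were rated overall.
from collections import Counter

ratings = [
    "Largely achieved", "Only partially achieved",
    "Very limited achievement", "Very limited achievement",
    "Only partially achieved", "Very limited achievement",
    "Very limited achievement", "Not achieved",
    "Not measured in this review",
]
summary = Counter(ratings)
for rating, count in summary.most_common():
    print(f"{rating}: {count}")
```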
Video report: an example
http://www.youtube.com/watch?v=q6nKXcUrNXA
Follow-up mechanisms
Evaluations may require follow-up mechanisms to ensure that the findings are disseminated and acted upon, including:
– Workshops with staff and donors to discuss findings
– Steering committees to discuss findings and implementation
– Plans of action based on findings and recommendations of the evaluation
A parting quote
"Scientific quality is not the principal standard; an evaluation should aim to be comprehensible, correct and complete, and credible to partisans on all sides."