WHAT IS PROGRAM EVALUATION? JENNIFER ANN MORROW, Ph.D. UNIVERSITY OF TENNESSEE

What is program evaluation lecture 100207 [compatibility mode]

Nov 02, 2014


A brief overview of the field of program evaluation
Transcript
Page 1: What is program evaluation lecture   100207 [compatibility mode]

WHAT IS PROGRAM EVALUATION?

JENNIFER ANN MORROW, Ph.D., UNIVERSITY OF TENNESSEE

Page 2

Summary of Current Projects

• Program Evaluation Projects
  – Project Weed & Seed
  – Project Fairchild Tropical Gardens

• Student Development Projects
  – Project Social Networking
  – Project Writing

• Teaching Research Methods/Statistics Projects
  – Project RLE

Page 3

Program Evaluation Philosophy

• Utilization-focused evaluation (Patton, 1996)
  – Evaluations are situation specific

• Comprehensive evaluation designs
  – Formative and summative evaluation
  – Qualitative and quantitative data

• Useful and meaningful data
  – Simple versus sophisticated analyses

• Faculty-student evaluation teams
  – Students as evaluation apprentices

Page 4

What is Program Evaluation?

• “Program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs for use by specific people to reduce uncertainties, improve effectiveness, and make decisions with regard to what those programs are doing and affecting” (Patton, 1986).

• “Evaluation research is the systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of social intervention programs” (Rossi & Freeman, 1993).

Page 5

Types of Evaluation

• Formative Evaluation: focuses on identifying the strengths and weaknesses of a program or intervention.
  – Comprised of implementation (process) and progress evaluation
  – Occurs during the entire life of the program/intervention
  – Is performed to monitor and improve the program/intervention

• Summative Evaluation: focuses on determining the overall effectiveness or impact of a program or intervention.
  – Also called impact evaluation
  – Assesses if the project/intervention met its stated goals

Page 6

Preparing to Conduct an Evaluation

• Identify the program and its stakeholders
  – Get a description of the program (i.e., curriculum)
  – Meet with all stakeholders (survey or interview them)

• Become familiar with information needs
  – Who wants the evaluation?
  – What is the focus of the evaluation?
  – What resources do I have available to me?
  – Why is an evaluation wanted?
  – When is the evaluation wanted?

Page 7

Program Provider/Staff Issues

• Expectation of a “slam-bang effect”.
• Fear that evaluation will inhibit creativity/innovation regarding the program.
• Fear that the program will be terminated.
• Fear that information will be misused.
• Fear that evaluation will drain resources.
• Fear of losing control of the program.
• Fear among program staff that they are being monitored.

Page 8

What is Program Theory?

• Program theory identifies key program elements and how they relate to each other.

• Program theory helps us decide what data we should collect and how we should analyze it.

• It is important to develop an evaluation plan that measures the extent and nature of each individual element.

Page 9

Inadequate Program Evaluation Models

• Social Science Research Model: form two random groups, providing one with the service and using the other as a control group.

• Black-Box Evaluation: an evaluation that only looks at the outputs and not the internal operations of the program.

• Naturalistic Model: utilizing only qualitative methods to gather lots of data.

Page 10

Theory Driven Model

• Theory-driven evaluations are more likely than methods-driven evaluations to discover program effects because they identify and examine a larger set of potential program outcomes (Chen & Rossi, 1980).

• Theory-driven evaluations are not limited to one method (i.e., quantitative or qualitative), one data source (e.g., program participants, artifacts, community indexes, program staff), or one type of analysis (e.g., descriptive statistics, correlational analyses, group difference statistics).

• Theory-driven evaluations utilize mixed-methods and derive their data from multiple sources.

Page 11

Improvement-Focused Model

• Program improvement is the focus.
• Utilizing this type of model, evaluators can help program staff to discover discrepancies between program objectives and the needs of the target population, between program implementation and program plans, between expectations of the target population and the services actually delivered, or between outcomes achieved and outcomes projected (Posavac & Carey, 1997).

Page 12

Goals of an Evaluation

• Implementation Goals
  – Equipment needs, staff hiring and training

• Intermediate Goals
  – Program is delivered as planned

• Outcome Goals
  – Is the program effective?

Page 13

Questions to ask in an Evaluation

• (1) Does the program match the values of the stakeholders/needs of the people being served?
• (2) Does the program as implemented fulfill the plans?
• (3) Do the outcomes achieved match the goals?
• (4) Is there support for program theory?
• (5) Is the program accepted?
• (6) Are the resources devoted to the program being expended appropriately?

Page 14

Creating an Evaluation Plan

• Step 1: Creating a Logic Model

• Step 2: Reviewing the Literature

• Step 3: Determining the Methodology

• Step 4: Present a Written Proposal

Page 15

Step 1: Creating a Logic Model

• Review program descriptions
  – Is there a program theory?
  – Who do they serve?
  – What do they do?

• Meet with stakeholders
  – Program personnel
  – Program sponsors
  – Clients of program
  – Other individuals/organizations impacted by the program

Page 16

Step 1: Creating a Logic Model

• Logic models depict assumptions about the resources needed to support program activities and produce outputs, and the activities and outputs needed to realize the intended outcomes of a program (United Way of America, 1996; Wholey, 1994).

• The assumptions depicted in the model are called program theory.

Page 17

Sample Logic Model

INPUTS: Resources dedicated to or consumed by the program
ACTIVITIES: What the program does with the inputs to fulfill its mission
OUTPUTS: The direct products of program activities
OUTCOMES: Benefits for participants during and after program activities
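As a minimal sketch, the four logic-model columns (inputs, activities, outputs, outcomes) can be held in a simple ordered mapping; every program detail below is a hypothetical placeholder, not from the lecture:

```python
# A toy logic model; the entries are illustrative examples only.
logic_model = {
    "inputs": ["staff time", "curriculum materials", "funding"],
    "activities": ["weekly workshops", "one-on-one mentoring"],
    "outputs": ["number of sessions delivered", "participants served"],
    "outcomes": ["improved skills", "sustained behavior change"],
}

# Reading the columns left to right mirrors the model's causal
# assumptions, i.e., the program theory.
for column, entries in logic_model.items():
    print(f"{column.upper()}: {', '.join(entries)}")
```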

Page 18

Step 2: Reviewing the Literature

• Important things to consider:
  – In what ways is your program similar to other programs?
  – What research designs were utilized?
  – How were participants sampled?
  – Can previous measures be adopted?
  – What statistical analyses were performed?
  – What were their conclusions/interpretations?

• Creating Hypotheses/Research Questions

Page 19

Step 3: Determining the Methodology

• Sampling Method
  – Probability vs. Non-probability

• Research Design
  – Experimental, Quasi-experimental, Non-experimental

• Data Collection
  – Ethics, Implementation, Observations, Surveys, Existing Records, Interviews/Focus Groups

• Statistical Analysis
  – Descriptive, Correlational, Group Differences
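A probability sample means every unit in the sampling frame has a known chance of selection. A quick sketch of the simplest case, a simple random sample (the client list here is a made-up sampling frame):

```python
import random

def simple_random_sample(frame, n, seed=None):
    """Draw a simple random sample (without replacement) from a sampling frame."""
    rng = random.Random(seed)  # seeding makes the draw reproducible
    return rng.sample(frame, n)

clients = [f"client_{i}" for i in range(100)]  # hypothetical frame of 100 clients
sample = simple_random_sample(clients, 10, seed=42)
print(sample)
```

Non-probability approaches (convenience, purposive) skip this step, which is why they cannot support the same generalization claims.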

Page 20

Step 4: Present a Written Proposal

• Describe the specific purpose of the evaluation
  – Specific goals, objectives and/or aims of the evaluation

• Describe the evaluation design
  – Include theories/support for design
  – Methodology: participants, measures, procedure

• Describe the evaluation questions
  – Hypotheses and proposed analyses

• Present a detailed work plan and budget

Page 21

Ethics in Program Evaluation

• Sometimes evaluators will have to deal with ethical dilemmas during the evaluation process.

• Some potential dilemmas:
  – Programs that can’t be done well
  – Presenting all findings (negative and positive)
  – Proper ethical considerations (e.g., informed consent)
  – Maintaining confidentiality of clients
  – Competent data collectors

• AEA ethical principles

Page 22

Data Collection: Implementation Checklists

• Implementation checklists are used to ascertain if the program is being delivered as planned.

• You should have questions after each program chapter/section that the program deliverer fills out.

• This can then be used to create a new variable: Level of implementation (none, low, high).
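One way to derive that variable, sketched with illustrative cutoffs (the 75% threshold and field names are assumptions, not from the lecture):

```python
def implementation_level(items_delivered, items_total):
    """Collapse checklist completion into none/low/high.

    The 75% cutoff separating low from high is an illustrative choice;
    an actual evaluation would justify its own thresholds.
    """
    if items_total == 0 or items_delivered == 0:
        return "none"
    fraction = items_delivered / items_total
    return "high" if fraction >= 0.75 else "low"

# Hypothetical checklists from three program deliverers
print(implementation_level(0, 10))   # none
print(implementation_level(5, 10))   # low
print(implementation_level(9, 10))   # high
```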

Page 23

Data Collection: Observations

• What should be observed?
  – Participants, Program staff

• Utilize trained observers.
  – Your observers should be trained on how to observe staff and/or clients.

• Use standardized behavioral checklists.
  – You should have a standard checklist that observers can use while observing.

Page 24

Data Collection: Surveys

• To create or not to create?
  – Finding a valid/reliable instrument

• What can surveys measure?
  – Facts and past behavioral experiences
  – Attitudes and preferences
  – Beliefs and predictions
  – Current/future behaviors

• Types of questions
  – Closed versus open-ended questions

• How to administer?

Page 25

Data Collection: Existing Records

• School records– GPA, absences, disciplinary problems

• Health records– Relevant health information

• National surveys– National and state indices (e.g., census data)

Page 26

Data Collection: Focus Groups/Interviews

• Focus groups or individual interviews can be conducted with program staff and/or clients.

• Can be used to obtain information on program effectiveness and satisfaction.

• Can also show if client needs are not being met.

Page 27

Statistical Analysis: Quantitative

• Descriptive Statistics
  – What are the characteristics of our clients?
  – What % attended our program?

• Correlational Statistics
  – What variables are related to our outcomes?
  – How is implementation related to our outcomes?

• Group Difference Statistics
  – Is our program group different from our comparison group?
  – Are there group differences on outcomes?

Page 28

Statistical Analysis: Qualitative

• Transcribe interviews/focus groups/observations
  – Should be done verbatim in an organized fashion

• Summarize all open-ended questions
  – Summarize and keep a tally of how many participants give each response

• Code and analyze all qualitative data
  – Utilize a theoretical framework for coding (e.g., Grounded Theory)
  – Use a qualitative software package to organize data (e.g., NUD*IST, NVivo)
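The tallying step for open-ended responses is a straightforward frequency count; the response codes below are hypothetical examples:

```python
from collections import Counter

# Hypothetical open-ended responses, already reduced to short summary codes
responses = [
    "wanted more sessions", "staff were helpful", "wanted more sessions",
    "location hard to reach", "staff were helpful", "wanted more sessions",
]

tally = Counter(responses)
for answer, count in tally.most_common():
    print(f"{count} participants: {answer}")
```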

Page 29

Writing the Evaluation Report

• The report should be as detailed as possible. It should include both the formative and summative evaluation findings as well as an action plan for improving the design/program.

• Write it in an easy-to-understand format (don’t be too technical); more technical information can go in an appendix.

• Include plenty of graphical displays of the data.

Page 30

Presenting the Results

• You should present the results of the evaluation to all key stakeholders.

• Professional presentation (PowerPoint, handouts).

• Don’t just present findings (both positive and negative); explain them.

• Present an action plan for possible changes.

Page 31

Working as a Program Evaluator

• Get more experience
  – Take classes
    • EP 533 (basic intro), EP 651/652 (Seminar), EP 670 (internship; can take up to 9 hours), EP 693 (Independent Study), as well as many others
  – Evaluation Certificate (12 hrs)
  – Workshops

• Get Involved

• Working as a Program Evaluator

Page 32

References

• Chen, H.T., & Rossi, P.H. (1980). The multi-goal, theory-driven approach to evaluation: A model linking basic and applied social science. Social Forces, 59, 106-122.

• Julian, D.A. (1997). The utilization of the logic model as a system level planning and evaluation device. Evaluation and Program Planning, 20, 251-257.

• Patton, M.Q. (1986). Utilization-focused evaluation (2nd ed.). Newbury Park, CA: Sage Publications.

• Posavac, E.J., & Carey, R.G. (2003). Program evaluation methods and case studies (6th ed.). New Jersey: Prentice Hall.

• Rossi, P.H., & Freeman, H.E. (1993). Evaluation: A systematic approach (5th ed.). Newbury Park, CA: Sage Publications.

• United Way of America. (1996). Measuring program outcomes: A practical approach (Item No. 0989). Author.

Page 33

Websites

• Evaluators’ Institute
  – http://www.evaluatorsinstitute.com/

• Guide to program evaluation
  – http://www.mapnp.org/library/evaluatn/fnl_eval.htm

• Evaluating community programs
  – http://ctb.lsi.ukans.edu/tools/EN/part_1010.htm

• Evaluation bibliography
  – http://www.ed.gov/about/offices/list/ope/fipse/biblio.html

• Higher Ed center evaluation resources
  – http://www.edc.org/hec/eval/links.html

• The Evaluation Center
  – http://www.wmich.edu/evalctr/

Page 34

Websites

• American Evaluation Association (AEA)
  – http://www.eval.org/

• Southeast Evaluation Association (SEA)
  – http://www.bitbrothers.com/sea/

• Using logic models
  – http://edis.ifas.ufl.edu/WC041

• Resources for evaluators
  – http://www.luc.edu/faculty/eposava/resource.htm

• Various program evaluation publications (all pdf)
  – http://www.uwex.edu/ces/pdande/evaluation/evaldocs.html

• Evaluation toolkit – Kellogg Foundation
  – http://www.wkkf.org/Programming/Overview.aspx?CID=281

Page 35

My Contact Information

Jennifer Ann Morrow, Ph.D.
Assistant Professor of Evaluation, Statistics, and Measurement
Department of Educational Psychology and Counseling
The University of Tennessee
Knoxville, TN 37996

Email: [email protected]
Phone: 865-974-6117