Evaluating Student Success Initiatives
EVALUATING STUDENT SUCCESS INITIATIVES
Making Sure Things Work Before We Scale Them Up
Center for Applied Research at CPCC, 2013
LACCD Student Success & 3CSN Summit
PURPOSE
We want to take some time to discuss common misconceptions and issues experienced by colleges around the subject of evaluation.
We want to understand the differences between evaluation and research.
We want to know how to develop and implement a good evaluation for an intervention or program.
What is evaluation?
Evaluation is a profession composed of persons with varying interests, potentially encompassing but not limited to the evaluation of programs, products, personnel, policy, performance, proposals, technology, research, theory, and even of evaluation itself.
Go to http://www.eval.org. At the bottom of the homepage there is a link to a free training package and facilitator's guide for teaching the Guiding Principles for Evaluators.
PROGRAM EVALUATION
As defined by the American Evaluation Association, evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness. Evaluation is the systematic collection and analysis of the data needed to make decisions, a process in which most well-run programs engage from the outset. Here are just some of the evaluation activities that are already likely to be incorporated into many programs or that can be added easily:
Pinpointing the services needed: for example, finding out what knowledge, skills, attitudes, or behaviors a program should address.
MORE ON EVALUATION
Establishing program objectives and deciding on the particular evidence (such as the specific knowledge, attitudes, or behavior) that will demonstrate the objectives have been met. A key to successful evaluation is a set of clear, measurable, and realistic program objectives. If objectives are unrealistically optimistic or are not measurable, the program may not be able to demonstrate that it has been successful even if it has done a good job.
Developing or selecting from among alternative program approaches: for example, trying different curricula or policies and determining which ones best achieve the goals.
CONTINUED
Tracking program objectives: for example, setting up a system that shows who gets services, how much service is delivered, how participants rate the services they receive, and which approaches are most readily adopted by staff.
Trying out and assessing new program designs: determining the extent to which a particular approach is being implemented faithfully by school or agency personnel.
CONTINUED
Purpose
To establish better products, personnel, programs, organizations, governments, consumers, and the public interest; to contribute to informed decision making and more enlightened change; to precipitate needed change; to empower all stakeholders by collecting data from them and engaging them in the evaluation process; and to experience the excitement of new insights.
Evaluators aspire to construct and provide the best possible information that might bear on the value of whatever is being evaluated.
PROGRAM EVALUATION
Definition of Evaluation
A study designed and conducted to assist some audience to assess an object's merit and worth. (Stufflebeam, 1999)
Identification of defensible criteria to determine an evaluation object's value (worth or merit), quality, utility, effectiveness, or significance in relation to those criteria. (Fitzpatrick, Sanders & Worthen, 2004)
Definition of Evaluation
Goal 1: Determine the merit or worth of an evaluand. (Scriven, 1991)
Goal 2: Provide answers to significant evaluative questions that are posed.
It is a value judgment based on defensible criteria.
Evaluation Questions
Provide the direction and foundation for the evaluation (without them the evaluation will lack focus).
The evaluation's focus will determine the questions asked:
Needs Assessment Questions?
Process Evaluation Questions?
Outcomes Evaluation Questions?
TYPES OF EVALUATION
Process evaluation: determines if the processes are happening according to the plan.
The processes of a program are the "nitty-gritty" details or the "dosage" students, patients, or clients receive: the activities.
It is the who is going to do what and when.
It answers the question, "Is this program being delivered as it was intended?"
TYPES OF EVALUATION
Outcome evaluation (the most critical piece for accreditation): determines how participants do on short-range, mid-range, or long-range outcomes.
Usually involves setting program goals and outcome objectives.
Answers the questions "Is this program working?" and/or "Are participants accomplishing what we intended for them to accomplish?"
TYPES OF EVALUATION
Impact evaluation: how did the results impact the student group, college, community, or family (the larger group, over time)?
Answers the question, "Is this program having the impact it was intended to have?" (so you must start with intentions)
TYPES OF EVALUATION
TWO MAJOR TYPES OF EVALUATION
IR DEPARTMENTS
The good news is: you are all data people.
The bad news is: you are all data people, and you sometimes have difficulty realizing that this is not research and that it demands more than data from your student system.
EVALUATION vs. RESEARCH
Use: Evaluation is intended for use; use is the rationale. Research produces knowledge and lets the natural process determine use.
Questions: In evaluation, the decision-maker, not the evaluator, comes up with the questions to study. In research, the researcher determines the questions.
Judgment: Evaluation compares what is with what should be (does it meet established criteria?). Research studies what is.
Setting: In evaluation's action setting, priority goes to the program, not the evaluation. In research, priority goes to the research, not to what is being studied.
Roles: In evaluation, there is friction between the evaluator's role and the program provider's role because of the judgmental quality of evaluation. Between researcher and funder there is no such friction.
ISSUES WITH EVALUATION
In Community Colleges
INTERVENTIONS HAVE QUESTIONABLE SUCCESS
Evaluations don't take into consideration all factors, including methodology and quality of implementation.
The college needs to have a realistic, courageous conversation about standards of evidence, statistical significance, and expectations.
Most of the time is spent planning the intervention, not planning how to evaluate it.
What success should look like (a reasonable target) is never defined.
INTERVENTIONS ARE OFTEN TOO COMPLICATED
Multiple layers of independent variables.
The college lacks the staff, software, or ability to carry it out.
Groups keep getting smaller and smaller (for sample or comparison groups).
Don't really know what worked.
Expansion happens too quickly.
INTERVENTIONS HAVE QUESTIONABLE ABILITY TO BE ADAPTED ON A LARGE SCALE
Not enough consideration of the costs of scaling.
Colleges don't want to cancel plans involving un-scalable interventions (someone's pet project).
Develop a culture where it is OK to take risks and learn from mistakes.
THE COLLEGE SKEPTIC
The one who wants everything to be statistically significant.
The faculty group who wants to talk about confidence intervals or power.
Fear that things won't work: "We tried that before."
They confuse evaluation with research.
LIMITED ABILITY TO EVALUATE
The whole concept is new to many.
Funders force us to begin the process.
There may be no one at the institution to lead them through it (health faculty are the best place to start).
Don't know what resources are out there.
ANALYSIS PARALYSIS
Let's slice and dice the data more and more and more.
Too much data to analyze.
Don't know what it tells them.
How do we make a decision about priorities and strategies from 200 pages of data tables?
THE SUMMER HIATUS
Faculty leave in June and never give the initiative a thought until August 20th.
No interventions are in place when the fall term begins.
No evaluation tools are in place, so baseline data cannot be collected.
From August 20-31, faculty are mostly concerned with preparing for fall classes (as they should be).
NO WORKABLE EVALUATION TIMELINES
Creating a timeline.
Identifying all the details.
Getting a team to actually follow it.
Deciding who is responsible for each piece.
Where do completed surveys/assessments go? Who scores them? Who analyzes them? Who makes decisions based on them?
What does a logic model look like?
A graphic display of boxes and arrows, vertical or horizontal, showing relationships and linkages.
Any shape is possible: circular, dynamic, cultural adaptations, storyboards.
Any level of detail: simple to complex.
Multiple models are possible.
Source: Adapted from UW-Extension: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
A logic model is your program ROAD MAP
Where are you going?
How will you get there?
What will tell you that you’ve arrived?
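To make the road-map idea concrete, here is a minimal sketch (not from the deck) of a logic model represented as a plain data structure in Python. The field names follow the INPUTS, OUTPUTS, and OUTCOMES columns used in the examples that follow; the vacation details mirror the family-vacation example below, and everything else is illustrative.

```python
# A minimal sketch of a logic model as a plain data structure.
# Field names follow the INPUTS -> OUTPUTS -> OUTCOMES columns used
# in the examples below; the details are illustrative.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    situation: str                               # the problem the program addresses
    inputs: list = field(default_factory=list)   # what we invest
    outputs: list = field(default_factory=list)  # what we do
    outcomes: list = field(default_factory=list) # what results

vacation = LogicModel(
    situation="Family wants time together",  # hypothetical; not stated on the slide
    inputs=["Family members", "Budget", "Car", "Camping equipment"],
    outputs=["Drive to state park", "Set up camp",
             "Cook, play, talk, laugh, hike"],
    outcomes=["Family members learn about each other",
              "Family bonds", "Family has a good time"],
)
```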
Example: Everyday logic model – Family Vacation
INPUTS: Family members; budget; car; camping equipment
OUTPUTS: Drive to state park; set up camp; cook, play, talk, laugh, hike
OUTCOMES: Family members learn about each other; family bonds; family has a good time
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension
Example: Financial management program
Situation: Individuals with limited knowledge and skills in basic financial management are unable to meet their financial goals and manage money to meet their needs.
INPUTS (what we invest): Extension invests time and resources.
OUTPUTS (what we do): We conduct a variety of educational activities targeted to individuals who participate.
OUTCOMES (what results): Participants gain knowledge, change practices, and have improved financial well-being.
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension
Example: One component of a comprehensive parent education and support initiative
Situation: During a county needs assessment, the majority of parents reported that they were having difficulty parenting and felt stressed as a result.
INPUTS: Staff; money; partners; research
OUTPUTS: Develop parent education curriculum; deliver series of interactive sessions; facilitate support groups; targeted parents attend
OUTCOMES: Parents increase knowledge of child development; parents better understand their own parenting style; parents gain skills in effective parenting practices; parents identify appropriate actions to take; parents use effective parenting practices; improved child-parent relations; strong families
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension
Example: Smoke-free worksites
Situation: Secondhand smoke is responsible for lung cancer, respiratory symptoms, and cardiovascular disease, and worsens asthma. Public policy change that creates smoke-free environments is the best known way to reduce and prevent smoking.
INPUTS: Coalition; time; dollars; partners, including youth
OUTPUTS: Assess worksite tobacco policies and practices; organize and implement strategy for targeted worksites; develop community support for smoke-free (SF) worksites. Audiences: worksite owners and managers; workers and union members; unions; the public
OUTCOMES: Increased awareness of the importance of SF worksites; increased knowledge of SF worksite benefits and options; demonstrations of public support for SF worksites; increased commitment, support, and demand for SF worksites; SF worksite policies drafted; SF worksite policies passed; SF worksites; adherence to smoke-free policies
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension
INPUT
Needs Assessment Questions:
What resources are needed for starting this intervention strategy?
How many staff members are needed?
PROCESS
Process Evaluation Questions:
Is the intervention strategy being implemented as intended?
Are participants being reached as intended?
OUTCOMES
Outcomes Evaluation Questions:
To what extent are desired changes occurring? For whom?
Is the intervention strategy making a difference?
What seems to work? Not work?
Source: R. Rincones-Gomez, 2009
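As an illustration (not from the deck), a process-evaluation question such as "Are participants being reached as intended?" can be answered with a simple "dosage" check against the participation log. All names and numbers below are hypothetical:

```python
# A minimal sketch of a process-evaluation "dosage" check.
# The planned session count, attendance figures, and 75% threshold
# are all hypothetical.
planned_sessions = 12
sessions_attended = {"A101": 11, "A102": 6, "A103": 12, "A104": 3}

MIN_DOSAGE = 0.75  # assumed: attend at least 75% of planned sessions

reached = [sid for sid, n in sessions_attended.items()
           if n / planned_sessions >= MIN_DOSAGE]
print(f"{len(reached)} of {len(sessions_attended)} participants "
      f"received the intended dosage")
```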
CHAIN OF OUTCOMES
Each chain runs SHORT → MEDIUM → LONG-TERM:
Seniors increase knowledge of food contamination risks → practice safe cooling of food and food preparation guidelines → lowered incidence of food-borne illness.
Participants increase knowledge and skills in financial management → establish financial goals and use a spending plan → reduced debt and increased savings.
Community increases understanding of childcare needs → residents and employers discuss options and implement a plan → child care needs are met.
Empty inner-city parking lot converted to community garden → youth and adults learn gardening skills, nutrition, food preparation and management → money saved, nutrition improved, residents enjoy greater sense of community.
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension
WHAT ARE THE SUMMATIVE AND FORMATIVE OUTCOME INDICATORS?
Supplemental Instruction
Learning Communities
Required Orientation
Academic Success Course
Minority Male Mentoring
Developmental Math Redesign
Peer Tutoring
Accelerated English
AT YOUR TABLES...
Select an ATD student success initiative at your college that you plan to evaluate before you make the decision to scale it up. (If you can't think of one, use the online learning example in your handouts.)
Use this program for each activity.
1. BRING TOGETHER THE PROGRAM DEVELOPERS
Ask them to answer these questions:
1. Why did you develop this program with these program characteristics?
2. What do you think students (or participants) will get out of this program (what changes)?
3. How do you tie specific program content to specific expected changes or improvements in participants?
2. ORIENT AN EVALUATION TEAM
Who should be on it?
What skills do you need at the table, and which staff members have them?
What should be their charge?
3. GATHER INFORMATION ON POTENTIAL OUTCOMES
What are potential sources for outcomes?
4. WRITE OUTCOME STATEMENTS
Sometimes these are already written (from grants).
Make them clear.
Don't draw a number out of a hat.
Test it out.
Create a logic model.
5. CREATE OUTCOME INDICATORS
Outcome indicator: usually referred to as a key performance indicator, this is the data, or set of statistics, that best verifies the accomplishment of a specific outcome. An outcome indicator for college readiness might be an SAT score of 1100 or above. It is typically the accomplishment of a specific skill or assessment at a certain level that indicates an outcome is met.
What data can you access?
What assessments need to be selected?
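As a concrete illustration (not from the deck), here is a minimal sketch of turning raw data into an outcome indicator. The SAT threshold of 1100 comes from the slide's college-readiness example; the student IDs and scores are hypothetical:

```python
# A minimal sketch: computing an outcome indicator from raw scores.
# The 1100 threshold comes from the slide's college-readiness example;
# the student records are hypothetical.
sat_scores = {"A101": 1180, "A102": 990, "A103": 1250, "A104": 1060}

THRESHOLD = 1100  # indicator: SAT score of 1100 or above

ready = [sid for sid, score in sat_scores.items() if score >= THRESHOLD]
pct_ready = 100 * len(ready) / len(sat_scores)
print(f"{len(ready)} of {len(sat_scores)} students "
      f"({pct_ready:.0f}%) met the indicator")
```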
6. CREATE OUTCOME TARGETS
Outcome target: the benchmark set as a performance level for a given outcome indicator. An example would be that 80% of students score 75% or above on a reading assessment; the outcome target would be "80% of students."
How would you create these targets or benchmarks?
Do you need a comparison group?
What is an acceptable level of improvement or change?
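A minimal sketch of checking whether an outcome target was met; the 75% cut score and the 80%-of-students target come from the slide's reading-assessment example, while the scores themselves are hypothetical:

```python
# A minimal sketch: checking an outcome target against results.
# The 75% cut score and 80%-of-students target come from the slide's
# example; the scores are hypothetical.
reading_scores = [82, 74, 91, 66, 78, 88, 95, 71, 80, 77]

CUT_SCORE = 75    # indicator: score 75% or above
TARGET_PCT = 80   # target: 80% of students meet the indicator

pct_meeting = 100 * sum(s >= CUT_SCORE for s in reading_scores) / len(reading_scores)
print(f"{pct_meeting:.0f}% of students met the cut score; "
      f"target met: {pct_meeting >= TARGET_PCT}")
```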
7. CREATE ALL TOOLS
You will probably need:
Demographic sheets
Attendance or participation logs
Formative evaluation tools
Will they be online or pencil-and-paper tools? (Consider the benefits of each.)
When do they need to be ready?
Who needs copies?
Create an evaluation timeline.
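A participation log, for instance, can be as simple as a flat file; the column names below are illustrative, not prescribed by the deck:

```python
# A minimal sketch of an attendance/participation log as a CSV file.
# The column names are illustrative, not prescribed by the deck.
import csv

FIELDS = ["student_id", "session_date", "activity", "minutes_attended"]

with open("participation_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow({"student_id": "A101", "session_date": "2013-09-05",
                     "activity": "Peer Tutoring", "minutes_attended": 50})
```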
8. PILOT TEST THE PROCESS
Make sure it works.
Give the assessments to a small group of students or faculty/staff to make sure the items are clear.
Work out all the details:
Who distributes it? Who collects it? Who scores it? Who enters it in the spreadsheet? Who keeps up with the post-test dates, etc.?
9. IMPLEMENT THE EVALUATION
Follow your plan.
10. ANALYZE RESULTS
Sometimes just numbers and percents are enough.
Sometimes statistical tests are needed.
If students don't meet the summative evaluation benchmarks, analyze the formative evaluation.
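Where a comparison group exists, a simple test of proportions is often enough. Here is a minimal sketch (hypothetical counts, assuming SciPy is available) using a chi-square test to compare success rates between an intervention group and a comparison group:

```python
# A minimal sketch: comparing success rates for an intervention group
# vs. a comparison group with a chi-square test. Counts are hypothetical.
from scipy.stats import chi2_contingency

#            successful  not successful
observed = [[72,         28],   # intervention group
            [58,         42]]   # comparison group

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
if p < 0.05:
    print("The difference is statistically significant at the .05 level.")
else:
    print("No statistically significant difference was detected.")
```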
11. IMPROVE YOUR PROCESS AND PROGRAM
It takes several years to have good data.
Discuss how the evaluation can be improved.
Discuss how the program can be improved.
CLOSING
Establish your plan.
Follow your plan.
Assign responsibility for it.
Expect big things.
Use results to improve what you do (close the loop).
SUPPORT AND CONTACT INFO:
Terri Manning, Ed.D.
terri.manning@cpcc.edu
(704) 330-6592