A Model For Evaluating Institutional Research Functions
AIR 2000
May 20, 2000
Frank Doherty
Director of Institutional Research
James Madison University
Objectives
Learn how to describe what you do.
Develop a systematic plan to evaluate the IR functions.
Freebie: learn a method to evaluate other administrative offices on your campus.
Schedule
8:00—10:00 Introduction and program design development
10:00 Break
10:15—12:00 Program design development
12:00—1:00 Lunch
1:00—4:00 Evaluation design development
4:00-5:00 Wrap-up and evaluation
JMU OIR Evaluation
1992 SACS Visiting Team Report: “Although OPA (now OIR) has occasionally evaluated the usefulness of some of its products and services, evaluation has not been established as a routine matter. Thus, the Committee recommends that the University establish regular and ongoing evaluation mechanisms for the institutional research function.”
Evaluation Is User-Oriented
Objective is program improvement and accountability
Seek information that will improve the office
User control of evaluation is very important
Elements of the Evaluation
Program design
Evaluation design
Program review team
Data collection and analysis
Reporting and recommendations
Improvement plan
Ongoing evaluation
Program Design Philosophy
You cannot evaluate that which you cannot describe
First step in self-study
Facilitates clarification of program goals and operation (a wonderful communication device)
Aids the planning process
Serves as an implementation guide
Provides a sense of the whole
Documents program operation
Program Design
Discrepancy Evaluation Model
Systems approach: inputs, processes, outputs
Compare performance with a standard (gap analysis)
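In code terms, the gap analysis amounts to comparing observed performance against the standards set out in the program design. A minimal sketch in Python, assuming invented measures and thresholds (none of these values come from the workshop materials):

    # Minimal gap-analysis sketch: compare observed performance with the
    # standard from the program design. All names and values are illustrative.
    standards = {
        "survey_response_rate": 0.60,
        "reports_delivered_on_schedule": 0.95,
    }
    performance = {
        "survey_response_rate": 0.48,
        "reports_delivered_on_schedule": 0.97,
    }

    for measure, standard in standards.items():
        gap = performance[measure] - standard
        status = "meets standard" if gap >= 0 else f"discrepancy of {abs(gap):.2f}"
        print(f"{measure}: {status}")

A discrepancy report is then simply the list of measures where performance falls short of the standard.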
OIR Program Design
Network
Input-Process-Output statements
Evaluation Plan Philosophy
States intentions publicly
Organizes the complexity of the evaluation effort
Facilitates and justifies evaluation resource allocation decisions
Serves as a “standard” for judging an evaluation effort
Evaluation Design
Overall plan for the evaluation
Concerns/Issues: program-specific and common
Questions
Information sources/methodology
OIR Evaluation Design
Program Review Team
Consists of 8-10 staff recommended by the office
Chair not from the office, but appointed by the division head
Collect data
Write report and recommendations
Recommendations discussed with the division head and supervisor
Annual objectives developed to address recommendations
Ongoing Program Review
OIR program review is conducted every three years
Online survey
Accountability and use of results
Annual objectives based on recommendations
Components Of An Effective Program Review at James Madison University
Summary
IR evaluation should be:
Thorough
User-oriented
Ongoing and accountable
OIR evaluation report: http://www.jmu.edu/instresrch/present/air99/oireval.pdf
Program Design Exercise
Program design consists of two parts:
Network
Input-Process-Output statements
Network
Numbering and levels
Functional dependencies
Let’s create a Network (see the sketch below)
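A program-design network is essentially a numbered hierarchy of components plus the functional dependencies among them. As a hypothetical sketch (the components and numbering are invented, not the JMU OIR network):

    # Hypothetical sketch of a program-design network: components numbered
    # by level, with functional dependencies recording which components
    # feed which. Content is invented for illustration.
    components = {
        "1.0": "Collect and verify institutional data",
        "1.1": "Maintain data files",
        "2.0": "Produce standard reports",
        "3.0": "Respond to ad hoc requests",
    }

    # depends_on[x] lists the components whose outputs enable component x
    depends_on = {
        "2.0": ["1.0", "1.1"],
        "3.0": ["1.0"],
    }

    for number, prereqs in depends_on.items():
        feeds = ", ".join(components[p] for p in prereqs)
        print(f"{components[number]} depends on: {feeds}")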
IPO Statements
Inputs
Processes
Outputs
JMU OIR Network
Inputs
Things which set processes into motion and keep them running:
Resources
Receptors
Staff
Independent groups/organizations
Preconditions
Enabling outputs from other components
Processes
Described as event sequences
Process descriptions cover the intended interactions of people, materials, and media, and the context in which they take place
Be specific: indicate who is doing what to whom, how, when, where, and for how long
Linked to outputs
Outputs—Terminal
Two types: terminal objectives and enabling objectives
Terminal objectives are changes or products resulting from program-controlled processes
Intended to be fed into the external environment
Outputs for which the program holds itself accountable (the bottom line)
Outputs—Enabling
Enabling objectives result from program-controlled processes and are used within the program rather than outside it
“Enable” the achievement of terminal objectives
Can be the output of one process and the input to another
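Taken together, an IPO statement can be treated as a structured record with the two kinds of outputs kept separate. A sketch under illustrative assumptions (the field names and the example component are invented for this exercise, not taken from the JMU OIR design):

    from dataclasses import dataclass, field

    # Sketch of an Input-Process-Output statement as a structured record.
    @dataclass
    class IPOStatement:
        component: str
        inputs: list[str]            # resources, staff, preconditions, enabling outputs
        processes: list[str]         # event sequences: who does what, when, how
        terminal_outputs: list[str]  # fed into the external environment
        enabling_outputs: list[str] = field(default_factory=list)  # feed other components

    factbook = IPOStatement(
        component="2.0 Produce annual factbook",
        inputs=["frozen census data files", "two analysts", "publication budget"],
        processes=["extract and verify data", "draft tables", "review and publish"],
        terminal_outputs=["printed and online factbook"],
        enabling_outputs=["verified data tables reused for ad hoc requests"],
    )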
IPO Development Exercise
Let’s develop an IPO for your office.
IPO Exercise
Evaluation Plan
Address primary needs of the area: “What do you need to know?”
May want to address common institutional issues and questions:
Customer satisfaction
Planning
Use of results
Etc.
Stages of Evaluation
Design evaluation
Input evaluation *
Process evaluation *
Output evaluation *
Cost-Benefit Analysis
Design Evaluation
Assessment of the substantive adequacy of a program’s design
Is this likely to be a good program?
Examination of the substance, assumptions, and structure of a program prior to installation
Input Evaluation
Appropriate for new programs and replication efforts
Installation evaluation:
Inputs are present as prescribed by the program design
Planned processes have been set in motion
Design preconditions have been met
Stipulated preconditions are critical
Fiscal monitoring
Process Evaluation
Monitors continued operation and sequential accomplishment of enabling objectives
Formative: Discrepancy reports used to modify and improve program operations
Process Evaluation
Sets the stage for summative evaluation
Documents and defines the “treatment” until the program process is stable
Clarifies the relationship between program activities and accomplishment of interim objectives
Evaluation plans should emphasize process evaluation
Particularly useful during early stages of program operation
Output Evaluation
Refers to terminal objectives only
Have terminal objectives been achieved?
Investigation of causation
Most useful when preceded by formative evaluation
Previous evaluation stages contribute to program stability and improvement
Evaluation Design
Components and descriptions:
Evaluation Concern: 3-7 aspects of the program to be evaluated
Evaluation Questions: 2 or more performance questions for each concern
Design Referent: refers to the program design
Information Needed: reason for the question and kind of information sought
Source of Information: where the information comes from and how it will be analyzed
Date Information Needed: when discrepancy information is needed
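These six components map naturally onto one record per evaluation question. A hedged sketch (only the component names come from the table above; the example content is invented):

    from dataclasses import dataclass

    # Sketch of one row of an evaluation design; example content is invented.
    @dataclass
    class EvaluationDesignRow:
        concern: str
        question: str
        design_referent: str       # component in the program design (input/process/output)
        information_needed: str
        source_of_information: str
        date_needed: str

    row = EvaluationDesignRow(
        concern="Customer satisfaction",
        question="Are report recipients satisfied with timeliness?",
        design_referent="2.0 Produce standard reports (output)",
        information_needed="Satisfaction ratings by report, used formatively (F)",
        source_of_information="Online survey of report recipients",
        date_needed="End of fall semester",
    )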
Selection Criteria
Critical functional importance
Areas that are problematic
Areas of direct concern to external evaluation audiences (e.g., an accrediting agency)
Areas of concern to internal evaluation audiences (e.g., customer satisfaction)
Areas where information is needed soon
Evaluation Concerns Identification
Common models of organization:
By design component
By cross-cutting function
By evaluation stage
Evaluation Questions
Derived from a larger area of concern
Guide the collection of performance information
What kind of performance information is necessary to answer the questions posed?
Determine a standard for each variable identified
Evaluation Questions
Develop one for each evaluation concern
Should direct systematic collection of performance information
An evaluation question directs one to performance information
Design Referents
Relate program design to evaluation design.
Design referent should point to a component in the program design and indicate whether the question is related to input, process, or output.
Information Needs
Provides the rationale for each question
Explains what kind of information is sought
Indicates how collected information will be used, and by whom
Information Needs Justification
Record keeping
Routine monitoring
Verification of preconditions
Management troubleshooting
Functional criticality
Accountability
Bargain information
Information Needs Continued
The information need should tell the reader the purpose of the information:
(F) for formative
(S) for summative
Sometimes can be both F and S
Sources of Information
Task #1: Brainstorm information possibilities for each question
Task #2: Pick and choose from possibilities
Factors to consider:
Reliability and validity
Cost (time and resources)
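The pick-and-choose step can be made explicit by scoring each brainstormed source on those factors. A toy sketch (candidates, scores, and weights are all invented for illustration):

    # Toy sketch: rank brainstormed information sources by weighing
    # reliability against cost. All values are invented for illustration.
    candidates = [
        # (source, reliability 0-1, cost 0-1 where higher = more expensive)
        ("Existing office records", 0.8, 0.1),
        ("Online survey of users", 0.6, 0.3),
        ("Focus groups", 0.7, 0.7),
    ]

    RELIABILITY_WEIGHT, COST_WEIGHT = 0.7, 0.3

    ranked = sorted(
        candidates,
        key=lambda c: RELIABILITY_WEIGHT * c[1] - COST_WEIGHT * c[2],
        reverse=True,
    )
    for source, reliability, cost in ranked:
        print(f"{source}: reliability={reliability}, cost={cost}")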
Report Dates
Establish a ballpark estimate of when discrepancy reports should be available
May differ from audience to audience
Evaluation Design Exercise
Let’s create an evaluation design:
Evaluation Concerns
Evaluation Questions
Design Referent
Information Needed
Source of Information
Date Information Needed
Data Analysis
Questions determine methods
Multiple methods used:
Statistical analysis of data
Document review
Surveys
Interviews
Focus groups
Reporting and Recommendations
Reports are organized by evaluation issue/concern
Self-Study team develops recommendations