PURDUE UNIVERSITY GRADUATE SCHOOL
Thesis Acceptance

This is to certify that the thesis prepared

By: Margaret L. Dalrymple

Entitled: The Value of Evaluation: A Case Study of Evaluating Strategic Plan Initiatives

Complies with University regulations and meets the standards of the Graduate School for originality and quality

For the degree of: Doctor of Philosophy

Final examining committee members:
Judith M. Gappa, Chair
Deborah E. Bennett, Co-Chair
Rabindra N. Mukerjea
Sidney M. Moon
Andrea M. Trice

Approved by Major Professor(s): Judith M. Gappa
Approved by Head of Graduate Program: Kevin R. Kelly
Date of Graduate Program Head's Approval: July 23, 2007

Graduate School ETD Form 9 (01/07)
THE VALUE OF EVALUATION:
A CASE STUDY OF EVALUATING STRATEGIC PLAN INITIATIVES
A Dissertation
Submitted to the Faculty
of
Purdue University
by
Margaret L. Dalrymple
In Partial Fulfillment of the
Requirements for the Degree
of
Doctor of Philosophy
August 2007
Purdue University
West Lafayette, Indiana
UMI Number: 3304589

Copyright 2008 by ProQuest Information and Learning Company. All rights reserved. This microform edition is protected against unauthorized copying under Title 17, United States Code.

UMI Microform 3304589
ProQuest Information and Learning Company
300 North Zeeb Road
P.O. Box 1346
Ann Arbor, MI 48106-1346
For my parents,
Ogden and Janet Dalrymple
ACKNOWLEDGMENTS
I owe a great number of people my sincere gratitude for their support,
encouragement, and assistance. I owe the greatest of professional debts to my advisor,
Dr. Judith Gappa, whose sound advice, erudite guidance, and solid encouragement kept
me on my path to completing this work. I was very lucky to have such a superb
dissertation committee, each member of which served as an excellent mentor, each giving
exceptional guidance. Special thanks go to Professor Rabindra Mukerjea, Dr. Deborah
Bennett, Dr. Andrea Trice, and Dr. Sidney Moon for their contributions to this work.
I would also like to extend my appreciation to all of the participants at the
universities whose strategic plans this study considers. Although they will not be named
in order to preserve confidentiality, their efforts in assisting me with this research are by
no means forgotten. I am grateful for their time and wisdom. I could not have completed
this research without the expert assistance of colleagues whose guidance lent polish to the
final product: Dr. Carol Kominski, Mr. Darrin Harris, Dr. Victor Borden, Ms. Julie
Carpenter-Hubin, Dr. Michael Dooris, and Dr. Louise Sandmeyer.
I thank my doctoral cohorts, Dr. Andrew Robison, Dr. Sara Stein Koch, Dr. Steve
Wanger, and others for their support, collaboration, and friendship over the course of
these past few years.
Dr. Jacquelyn Frost deserves special thanks for her professional guidance,
mentorship, and friendship throughout this entire journey. Her steadfast support has kept
me motivated, and I will forever be grateful for her faith and trust in me.
I am also deeply appreciative to Esther Weiner for her support at a critical point
in my life, and to my sister Georgina Dalrymple for her timely childcare assistance and
constant encouragement.
Last, but by no means least, I thank my family; they are my greatest supporters.
Thank you to my children for their incredible patience and ability to keep me grounded.
They may be small in size, but their unceasing love has been a source of overwhelming
strength. To my husband, Steven, I owe the greatest debt. This accomplishment would
never have come to pass without his love, guidance, and understanding. Thank you.
ABSTRACT

Dalrymple, Margaret L. Ph.D., Purdue University, August 2007. The Value of Evaluation: A Case Study of Evaluating Strategic Plan Initiatives. Major Professor: Judith M. Gappa.
The current environment of oversight and accountability, along with declining state support for higher education, has prompted many universities to utilize strategic planning in order to achieve their objectives. Yet, strangely, relatively little research has
been done on evaluation models for assessing strategic planning in higher education
institutions. This study endeavors to shed some much-needed light on how higher
education institutions have assessed their university-wide strategic plans by focusing on
their evaluation methods.
This study focuses on the methodology of various approaches, and does not
determine whether those strategic plans have been effective or not. Rather, the
dissertation examines the methods used in collecting data and evidence during the
evaluation phase, the final stage of strategic planning.
The dissertation follows the design of a multiple case study, the cases selected
being higher education institutions located within the United States, affiliated with the
Association of American Universities (AAU), and in the initial stage of the evaluation of
their university-wide strategic plans. Within the multiple cases, the study’s data (coded
using a logic model) are derived from detailed document analysis, a survey, and
interviews. Results of the study reveal several key factors in a systematic evaluation of
strategic planning initiatives: communication, leadership, culture, and integration of
budgeting and planning.
Exemplar characteristics drawn from the study’s findings result in a conceptual
evaluation model. This model has five elements: a summarization and exploration of the
relational issues of influences and resources; a review of activities and processes; an
analysis of data; an assessment of the consequences, outcomes, and/or effectiveness of
the plan; and stakeholder participation in the evaluation. These elements provide the basis
for an evaluation methodology that a higher education institution can apply to its strategic
plan initiative. The conceptual evaluation model may thus provide academic practitioners
with a comprehensive tool with which to evaluate their own institutions’ strategic plans.
The findings will potentially benefit academic leaders, strategic planners, and
institutional researchers by identifying the critical components of evaluation of a
university-wide strategic plan.
CHAPTER I: INTRODUCTION
“People commonly use statistics like a drunk uses a lamp post: for support rather
than for illumination.” Though aimed squarely at the politicians of his time, Andrew
Lang’s witty observation continues to resonate. Indeed, one might even apply it to
today’s higher education institutions when it comes to evaluating their strategic plans.
For even though the purpose of strategic planning is to assist an institution in achieving
its vision, it is nevertheless true that some institutions tend to use data to support their
initiatives rather than to evaluate the actual merit of their plans.
In this day and age of assessment and accreditation, much has been written about
the need to measure the effectiveness of a higher education institution. What better way
to do this than to look at the level of success of an institution’s strategic plan? Ironically,
though, evaluations of a strategic plan’s success (or failure) are not often undertaken.
Perhaps it is because this type of analysis seems overwhelming and difficult to
conceptualize, or because no standard techniques of evaluation have yet been devised. Or
perhaps it is because strategic plans are so complex since institutions employ them to take
into consideration their unique environmental, fiscal, governance, geographic, and
demographic issues. However, assessment of a strategic plan is absolutely necessary in
order to address the question of how effective it has been.
The desire for oversight and accountability combined with declining government
support for higher education has created an environment that demands that institutions
employ effective strategic plans to achieve their mandated objectives. But therein lies the
greatest challenge: in such an environment of accountability, institutions must be able to
showcase (or own up to) the actual level of effectiveness or success they have achieved in
implementing their strategic plans. While developing a strategic plan is relatively
commonplace now, evaluating whether it is truly effective is far less common, and this
failure has even derailed several institutions' overall plans (Mintzberg, 1994;
Schmidtlein, 1990). The challenge thus lies in moving from the implementation of a
strategic plan to an evaluation of it that is detailed, inclusive, and responsive, and that can
also inform institutional budgeting and marketing. This apparent lack of evaluation
demonstrates the need for assessment of strategic plans in higher education institutions.
This chapter begins by describing the background and rationale for this study. It
then presents the research study and defines key terms. Next an overview of the
methodology is presented. The chapter concludes with the purpose and significance of
the study as well as its limitations.
Background of the Study: Overview of Strategic Planning in Higher Education
It is no coincidence that higher education institutions have embraced strategic
planning at the same time that a number of changes have occurred in funding, staffing
and enrollment. The numbers of both students and faculty rose dramatically in the last half
of the 20th century (Keller, 1983; Nunez, 2003). Following suit, physical facilities have
expanded as well. More recently, changes in traditional funding sources and poor
economic conditions have required institutions to look both internally and externally for
new sources of funding in order to address these changing conditions (Nunez, 2003, p. 1).
The most widely used approach to deal with such new circumstances is strategic planning
(Glaister & Falshaw, 1999; Nunez, 2003).
Brief History of Strategic Planning. The etymology of strategy is from the
Greek words of stratos (army) and agein (to lead). Strategic planning originated in the
military, which defined it “as the science of planning and directing large-scale military
operations, of maneuvering forces into the most advantageous position prior to actual
engagement with the enemy” (Torres, 2001, p. 15). However, it is in the private sector
that the concept has come into its own in the last several decades. It evolved there as a
planning model for defining the purpose and policies of a company and its business.
Eventually it was modified into management of risk, industry growth, and market share
(Torres, 2001). Early studies on strategic planning in the private sector focused on
manufacturing, banking, and transportation industries (Anderson, 2000). In 1973, Rue
conducted a study to empirically evaluate the planning practices and attitudes of
practitioners in private sector organizations. The results revealed that the majority of
firms engaged in moderate to extensive strategic planning activities and few firms had no
planning processes. Rue (1973) emphasized that strategic planning had grown in
popularity and size in the early 1970s, as had the need to involve staff specialists,
planning departments, and outside consultants. More recent research regarding the
current status of strategic planning is limited. In one recent study, Glaister and Falshaw
(1999) examined the extent to which private firms used strategic planning and their views
and attitudes toward it. Glaister and Falshaw asserted that, while strategic planning
suffered a downturn in popularity in the late 1970s due in large part “to the inability of
strategic planning tools to deliver what was expected of them,” strategic planning had
regained its reputation by the 1990s (1999, p. 108). This was due to its emphasis on
resource management, which bolstered strong performance in many firms through
resource acquisition.
Strategic Planning in Higher Education. Strategic planning started to be
accepted in higher educational institutions in the 1980s. The 1983 publication of George
Keller’s book Academic Strategy: The Management Revolution in American Higher
Education provides a reference point in time as to when institutions began to take a closer
look at strategic planning. Keller (1983) suggested that the following four internal and
external factors encouraged institutions to adopt planning:
1. Changing external conditions, including changing demography.
2. Demands for accountability from outside agencies.
3. Decreasing financial strength and increasing operational costs, coupled with
diminishing financial support from the government.
4. Progress in information technology.
The trends that Keller outlined continue today. The drive to measure institutional
efficiency and performance has gained dramatic momentum in response to national
economic needs and new governmental demands (Alexander, 2000; Epper & Russell,
1996; Ewell, 2004; McLeod & Cotton, 1998; Nunez, 2003). Epper and Russell (1996)
reviewed 20 years' worth of data from state coordinating and governing boards of higher education. In their review of state appropriations to higher education, they found the emerging issues of institutional effectiveness and accountability (Epper & Russell, 1996).
Strategic planning did not become commonplace in higher education institutions until the late 1990s. Most of those institutions, however, failed to follow through with the whole cycle, and therefore missed many of its potential benefits. Instead, they ended up with a large report gathering dust on the shelf, viewed as yet another management fad that created a lot of busy work. Those familiar with that type of strategic planning experience (as well as some others) have been understandably skeptical of the effectiveness of strategic planning. Nevertheless, it is still seen as necessary and desirable, especially in the context of today's financial climate (Kotler & Murphy, 1981; Rowley, Lujan, & Dolence, 1997). For the purpose of this study, strategic planning is defined in four basic phases:
Phase I – planning (environmental assessment, clarification of the institutional mission, vision, goals, strategies/objectives, and measurements)
Phase II – resource development (internal assessment)
Phase III – implementation (coordinated unit plans and data analysis)
Phase IV – assessment (outcomes, priorities, and identifying future strategic directions)
Phase I: Planning. The strategic planning model is best known for its planning
phase. Much has been written on this phase of the model (Howell, 2000; Keller, 1983;
Kotler & Murphy, 1981; Rowley, Lujan, & Dolence, 1997). The planning phase can be
difficult for some to understand due to the inherent differences between traditional
planning models and strategic planning. Rowley, Lujan, and Dolence (1997) have
outlined key distinctions between traditional planning and strategic planning, drawing
special attention to the important difference of focus. Strategic planning is externally
focused whereas the traditional approach has an internal emphasis. Other differences
include:
1. Alignment - traditional planning sets goals and objectives to achieve these
goals, whereas strategic planning aligns the organization with its environment;
2. Specificity versus direction - strategic planning is less specific and puts
emphasis on broad directions rather than exact targets;
3. Focus - traditional planning is parochial and gives attention to units within an
organization, whereas strategic planning keeps its focus on the organization as
a whole; and
4. Time-relatedness - strategic planning is an ongoing process, whereas
traditional approaches are episodic (Rowley, Lujan, & Dolence, 1997).
One difficulty in adopting strategic planning is that institutions often interpret
strategic planning as one uniformly defined planning process. Schmidtlein (1990)
believed that comprehensive planning approaches frequently resulted in frustration and
disappointment when formulaic plans were followed without taking into consideration
the distinctive cultures of various institutions. Institutions with successful strategic plans
have modified the processes to best fit their needs. “Instead of uncritically adopting a
particular planning approach, they often engage in a broad range of activities – both in
their attempts to address the potential impact of changing conditions on their mission and
operations and in developing strategies to address these impacts” (Schmidtlein, 1990, p.
1). Institutions with successful plans thus based their planning approach on their unique
institutional characteristics.
The traditional corporate version of strategic planning can be modified to fit the
distinctive power structures of academe. While academic issues usually lie within the
purview of the faculty, administrators have responsibility for the budget, for complying
with legal and managerial requirements, and for dealing with external demands.
Consequently, implementing academic decisions, especially within the strategic planning
process, requires consultation and consensus among faculty and administrators. Faculty
involvement and leadership in planning that deals with academic issues is clearly an
important and critical element, since academic matters are unlikely to be implemented if
they lack faculty support. Given the central role faculty play in making academic
decisions, fostering faculty initiative is an important condition for successful planning
(Schmidtlein, 1990).
Rowley, Lujan, and Dolence (1997) suggest that when an institution begins to
plan strategically, a SWOT analysis should be undertaken. A SWOT analysis involves
reviewing an organization’s internal Strengths and Weaknesses and external
Opportunities and Threats. The analysis of external opportunities and threats is essential
to understanding the local culture (i.e. the immediate educational, economic,
demographic, physical, political, social, and technological environment). The internal
strengths and weaknesses should include an assessment of campus roles, values, cultures,
mission, and vision. The vision of the institution centers on its role within the higher
educational system and society. The institution’s mission communicates its general
character, defines appropriate programs and activities, and provides future strategic
direction (Schmidtlein, 1990). Also during this phase Schmidtlein recommends that a
strategic planning committee be appointed to develop the planning document, the time lines, and the roles of the various stakeholders, and to set criteria for evaluating the plan. As
all of this requires information to be shared among the campus community, the
importance of communication throughout the planning process can hardly be
overemphasized.
It is clear that strategic plans often fail to move beyond the planning phase. Most
failures stem from using comprehensive processes unsuited for the institution or from a
lack of internal political support (Schmidtlein, 1990; Taylor & Karr, 1999). Part of the
problem with planning can be the size and complexity of the process – the overlapping committees, the paperwork, and the process's slow pace (Dooris, Kelley, & Trainer, 2004;
Taylor & Karr, 1999). Therefore, in order to accomplish strategic planning, it is
important to move beyond planning and determine how to measure progress.
Components of a Strategic Plan. To support the vision and mission, measurable
goals and strategies must be well defined. Goals provide the overarching aim. Strategies
or objectives are the assessable statements supporting the goal, clearly defining what it is
the institution wants to accomplish. In order to determine success or failure, the
objectives should be clearly stated (Cordeiro & Vaidya, 2002, p. 30). Measurements
provide direction (Rieley, 1997a; Atkinson, Waterhouse, & Wells, 1997); as such, they
articulate the strategic plan’s goals and strategies/objectives. (These may also be referred
to as metrics, benchmarks, or key performance indicators.) Measurements allow an
institution to track how well it is actually doing in its strategic plan effort. Measurements
enable an institution to better understand what creates success, and to effectively manage
processes to achieve that success (Atkinson, Waterhouse, & Wells, 1997). Measurements
are internally focused measures that monitor the health, effectiveness, and efficiency of
the organization consistent with the goals and objectives established by the strategic plan
(Cordeiro & Vaidya, 2002; Rowley, Lujan, & Dolence, 1997). Ultimately, the
measurements relate back to the institution’s mission.
While each college or university determines its own measurements, there are
consistent types of data that can be used across the board. The general categories and
measures used on many campuses divide the data into input measures and outcome
measures. Subcategories of information regarding students, faculty, staff, learning, and
university resources are found under these input and outcome headings. Typical
measurements include admission standards, enrollment, financial aid, retention of
students, graduation rates, degrees awarded, faculty productivity, research initiatives,
and alumni and community relations. Input measures usually focus on the characteristics of
the students and faculty, physical plant, and the evaluation of resource allocations.
Outcome measures are typically focused on student outcomes, teaching, and faculty
research. It is important to note that certain critical elements are sometimes lacking, such
as information on student learning assessment practices, technology in the classroom,
interdisciplinary programs, administrative operational efficiency, and the impact of
university or faculty research (Ray, 1998). Often this is due to the limited availability of
data to track those particular initiatives.
These measurements are used to provide a baseline of key data points, as well as
to act as markers of improvement during the life of the plan. Often the term
“benchmarks” is used to describe measurements that are comparisons of fundamental
data items among peer institutions. Using the peer institutions, the benchmarks act as a
reference point for best practices (Anderes, 1999, p. 119). Generally, the peer institutions listed are considered to be aspirational; that is, they possess qualities or strengths not presently found at the institution developing the strategic plan. The measurements can be
qualitative or quantitative, and play a major role in the strategic plan.
The following hypothetical example may serve to illustrate a strategic plan's
key components:
Mission: The core elements of our institution are: becoming a national leader in the quality of our academic programs; being recognized for the quality of the learning experience for our students; creating an environment that respects and nurtures all members of the community; and expanding the land-grant mission to meet 21st-century needs.

Vision: To be one of the nation's best universities in integrating teaching, research, and service.

Goals: Create a diverse university community.

Strategies: Increase the number of women and minority faculty at a senior level each year for five years through hiring assistance initiatives.

Measure: The annual number of full-time women and minority faculty for each year of the strategic plan initiative.

Figure 2.1: Strategic Planning Key Components
Phase II: Resource Development. Strategic planning articulates the institution’s
mission. The consensus on a mission creates a set of shared assumptions about the future
that serves as the basis for budget decisions. The strategic plan is a comprehensive
document, and thus not only provides future directions, but also encompasses current
practices in the total university. Because the budget is a collective document as well, it
should be synchronized with the strategic plan.
A significant part of the strategic planning process requires that resources be
identified in the campus budget to implement the plan (Cordeiro & Vaidya, 2002;
Dickmeyer, 2004; Prager, McCarthy & Sealy, 2002; Rieley, 1997a). It is at this point that
many institutions, while embracing the basic connection between the strategic plan and
the budget, nevertheless fail to adequately coordinate the two. Often budgetary decisions
are “administratively expedient under the pressure of budget requirements, but not
necessarily cognizant of priorities established by institutional planning” (Anderes, 1996,
p. 129). Institutions clearly need to consider the programs and resources that will be
required in order to move in desired future directions (Schmidtlein, 1990). As objectives
are developed, resources should be assigned to accomplish each objective. A plan that is not connected directly to the budget process can become a disjointed, frustrating
effort with minimal success and no long-term gains (Rieley, 1997a; Taylor & Karr,
1999).
Anderes (1996) provides many reasons to link the two. First and foremost,
budgets can act as basic extensions of planned priorities, implemented within the general
intent of the strategic plan. The budget can likewise legitimize the plan. Often the success
of a plan is measured by whether or not its objectives and priorities have been included in
the budget development process, and whether or not they received adequate funding. By
coordinating the plan with the budget, strategic planning priorities are identified and
evidence of achievement can be established, thereby reducing the likelihood of decisions
being made outside the strategic plan’s priorities or simply to satisfy stakeholder
expectations. Most of all, it provides greater confidence in the institutional direction
when a coherent strategic plan supports the budget (Anderes, 1996).
Anderes (1996) recommends that the integration of the budget with the strategic
plan be in the earliest stages of the process. Some other key conditions for effective
implementation of the two include active leadership, broad participation by key internal
and external stakeholders, a clear intention to integrate planning priorities into budget
development and allocations of funding, and the understanding of the need for
assessment, which includes the tracking of funding (Anderes, 1996).
Anderes (1996) suggests that when the budget is coordinated with the strategic
plan, all units are able to justify their budget requests so that they relate to strategic plan
initiatives. The credibility of a budget request can be judged on how well it represents
strategic planning priorities. Planning priorities are legitimized by the allocation of
resources and their feasibility is usually based on their ability to garner funding. The
clearest indicator in determining whether planning and budgeting are connected is the
degree to which planning priorities are ultimately funded.
Resource development is a significant part of the strategic planning process. It is
necessary to evaluate the resources needed for achievement, allocate funding for new
ideas, and attend to the financial implications. Coordination between the budget and
strategic plan can help achieve the objectives, promote collaboration and partnership
among units, and provide a means of measuring progress (Anderes, 1996; Prager,
McCarthy & Sealy, 2002; Rieley, 1997a). The implementation of the strategic plan relies
upon identifying resources that can be allocated within the budget through incremental
changes.
Phase III: Implementation. Leadership is key in the implementation phase.
Formative evaluation:
- Provides information that helps you improve your program. Generates periodic reports. Information can be shared quickly.
- Focuses most on program activities, outputs, and short-term outcomes for the purpose of monitoring progress and making mid-course corrections when needed.
- Helpful in bringing attention to suggestions for improvement.

Summative evaluation:
- Generates information that can be used to demonstrate the results of the program to stakeholders and the community.
- Focuses most on the program's long-term outcomes and impact. Although data may be collected throughout the program, the purpose is to determine the value and worth of the program based on the results.
- Helpful in describing the quality and effectiveness of your program by documenting its impact on stakeholders and the community.
One disadvantage with case studies is that some see the method as not scientific,
i.e., a nonrandom sample with non-generalizable conclusions. Other drawbacks may be that
the researcher may be biased and/or the subjects are usually aware that they are being
studied. However, case studies are designed to bring out the details from the viewpoint of
the participants by using multiple sources of data (Tellis, 1997). The overall nature of this
research phenomenon lends itself to qualitative research: to investigate, to explain, to document, and to predict the evaluation methodology of strategic plan initiatives at public research institutions.
Using a Logic Model to Frame the Evaluation
A logic model illustrates the purpose and content of a program and makes it easier
to develop meaningful evaluation questions from a variety of vantage points: content,
implementation, and results (W.K. Kellogg Foundation, 2004). It helps identify areas of
strength and/or weakness, thereby enriching the assessment of the program. It points out
what activities need to be monitored and what kind of measurements might indicate
progress toward results. Using the logic model in this study’s methodology therefore
provided for a more comprehensive evaluation process.
Specifically, the logic model guided the data collection and analysis. The survey and interview questions were aligned with the components of the logic model. The
questions were structured to provide data that were matched to each component of the
logic model (see Appendices C and G). As a result of the data being synchronized to the
logic model components, the analysis provided an examination of the institutions’
evaluation methodologies to see whether they assessed their entire strategic plan processes. This ensured that this study's review incorporated the entire strategic plan process regardless of whether the institution's own evaluation of its strategic plan initiative did so. Using a logic
model to frame the assessment thereby produced a more holistic approach for the
evaluation of the cases.
Overall Procedure
This multiple case study design contains more than a single case, in other words,
more than one institution that has implemented a university strategic plan and is
evaluating the results. Therefore, the first step was to select the sites and participants.
Since the data collection employed a survey and interviews as sources of evidence, a
survey instrument was constructed and interview questions were created. Once these
steps were completed, the data collection began. A flowchart of the overall procedure is
located in Appendix A.
Site and Participant Selection. This case study sought to gather in-depth data
about higher education institutions’ methodology for assessing their strategic plans. The
sites were selected based on these fundamental criteria:
1. The institution had a strategic plan initiative that had been in place for a
number of years, and the institution was at least beginning to evaluate that
initiative. This assumes that the institution had completed its planning,
resource development, and implementation phases.
2. The institution was a public entity whose data are in the public domain,
easing the tension that can surround a study involving salary and financial
data, which may be key measures in a strategic plan.
In order to protect the participants and offer confidentiality, the institutions were
not named in this dissertation.
In order to do a comparison and contrast of similar institutions, institutions were
selected for the multiple cases that fit the same criteria. Institutions that had the following
key similarities were reviewed:
1. Located within the United States (due to accreditation standards and
governmental funding procedures)
2. Member of the Association of American Universities (AAU) (due to similar
characteristics). Institutions are invited to be members of the AAU based upon
the breadth and quality of their university programs of research and graduate
education, institutional missions, characteristics and trajectory. To be an AAU
member, specific indicators are reviewed: competitively funded federal
research support; membership in the National Academies (National Academy
of Science, National Academy of Engineering, and the Institute of Medicine);
National Research Council faculty quality ratings; faculty arts and humanities
awards, fellowships, and memberships; citations; USDA, state, and industrial
research funding; doctoral education; number of postdoctoral appointees; and
undergraduate education (AAU, 2005).
A minimum of four institutions fitting these parameters and willing to participate
in this study were selected. Given the scope and resources of this research study, this
number constitutes a representative sample.
The researcher sent an e-mail request to the list-serve of the Association of
American Universities Data Exchange (AAUDE) representatives asking if their
institution had a university wide strategic plan. The representatives are typically
employed in the budget, planning, or institutional research offices. There were 14
responses. The researcher then reviewed the websites of the remaining 49 institutions. If key
web pages (president’s or chancellor’s web page, provost’s or vice president for
academic affairs web page, Office of Institutional Research web page, or a page or site
referring to their strategic planning) did not reveal a strategic plan, multiple descriptors
were used in the institution’s web search engine (e.g., strategic plan, strategic directions,
academic plan) to find a university-wide strategic plan for that institution. The researcher
contacted the Director of Institutional Research if clarification was needed on either
identifying the strategic plan, or on the specific nature of their plan. The list of AAU
institutions was condensed to six institutions that met the research criteria. Based upon
the correspondence and available information, four institutions were selected.
Data Collection
Within each selected institution, the data collection for this study followed the
basic principle for collection of case study evidence, triangulation. Triangulation is a
common process of using multiple sources of data to allow the researcher to develop
multiple measures of the same phenomenon, thereby enhancing the validity of the data
collection process. This study used document analysis, a survey questionnaire, and
personal interviews as the three sources of data.
Document Analysis. The first source of evidence for this study was the analysis
of public documents containing strategic plan data and information, i.e., documents in the
public domain such as memoranda, agendas, or reports. The researcher searched the
institutions’ websites for strategic planning documents. The strategic plan, annual reports
of the strategic plan, committee meeting minutes, budgetary or funding documents, and
any other related documents on the assessment or outcome of the strategic plan that were
publicly available were reviewed (see Appendix B). If the documents were not available
on the website, or additional documents were needed, the researcher contacted the
institution to collect the material by postal mail or in person upon the campus visit. Once
gathered, the documents were analyzed and coded for correspondence to the logic model
components and for thematic content.
The procedure for the document analysis was as follows:
1. Reviewed the website for documents.
2. Printed and dated each relevant document.
3. Contacted the institution for a paper copy of any missing key documents (if
available).
4. Analyzed the documents for thematic content and coded them.
5. Noted emergent patterns and themes.
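As an illustration only (the study's coding was performed manually by the researcher), the coding step in the procedure above can be sketched as a simple keyword-based pass over a passage of text. The component keyword lists below are hypothetical placeholders, not the study's actual codebook:

```python
# Illustrative sketch of coding document passages to logic model components.
# The keyword lists are hypothetical placeholders for the study's codebook.

COMPONENT_KEYWORDS = {
    "input": ["resources", "funding", "staff"],
    "activities": ["implementation", "process", "services"],
    "output": ["data", "metric", "results"],
    "outcomes": ["change", "attitude", "skills"],
    "impact": ["organization", "community", "fundamental"],
}

def code_passage(text):
    """Return the sorted logic model components whose keywords appear in text."""
    lowered = text.lower()
    return sorted(
        component
        for component, keywords in COMPONENT_KEYWORDS.items()
        if any(word in lowered for word in keywords)
    )

passage = "The annual report presents metric data on implementation services."
print(code_passage(passage))  # → ['activities', 'output']
```

In practice such an automated pass could serve only as a first cut; the thematic review described above depends on the researcher's judgment.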
Survey Instrument. The second source of evidence was data collected by a
survey of key participants. The Strategic Planning Evaluation Survey instrument (see
Appendix C) emerged from the literature regarding strategic planning. The Likert-type
scale survey questions were written by Dr. William Nunez for his study of “Faculty and
Academic Administrator Support for Strategic Planning in the Context of Postsecondary
Reform,” which in turn were derived from previous research (Nunez, 2003). Expert
review and a factor analysis were used by Dr. Nunez to ensure the validity of the survey
questions. The open-ended questions, specifically included for this study, were designed
by the researcher.
The survey inquired about the methodology used and the participants engaged in
the assessment of the institution’s strategic plan. Each question was linked to a
component of a logic model.
Staff or administrators in the Offices of the President, Provost, Institutional
Research, and other presidential cabinet members, strategic plan task force, or committee
members involved with their strategic plan’s evaluation were queried. It is highly likely
that members of the President’s and Provost’s staff are significantly engaged in their
institution’s strategic plan initiative. Additionally, the Institutional Research practitioners
are typically responsible for collecting and reporting the data on the strategic plan and
play a key role in the development and execution of the evaluation process. Therefore,
surveying these offices within the selected institutions was ideal in order to observe the
evaluation methodology of the strategic plan. The specific individuals and their electronic
addresses (e-mail) were found through their university’s directory. An e-mail invitation
with a link to an on-line survey was sent to them (see Appendix D). The online survey
used the Hosted Survey software; responses were anonymous within each institution.
The procedure for the survey was as follows:
1. The mailing list was compiled.
2. The invitation letter with a link to the survey was e-mailed.
3. A reminder letter with a link to the survey was e-mailed seven days after the
initial mailing.
4. A second reminder letter with a link to the survey was e-mailed 14 days after
the initial mailing.
5. A final reminder letter with a link to the survey was e-mailed 21 days after the
initial mailing.
6. The completed surveys were analyzed by entering the responses of the Likert-
scale questions (converted into a numeric scale) into a Microsoft Excel
Worksheet.
7. A SAS (Statistical Analysis System) program was written, stipulating the
descriptive statistical analyses that were used to assist with the analyses of the
data.
8. The responses to the open-ended questions were reviewed for thematic
content and coded.
9. Emergent patterns and themes from the responses to the open-ended questions
were noted.
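The descriptive analysis in steps 6 and 7 was performed in Microsoft Excel and SAS; purely as an illustration, the same computations can be sketched in Python with made-up responses. The five-point conversion shown is an assumption, not the study's documented scale:

```python
# Sketch of the descriptive statistics applied to converted Likert responses.
# The response data and the 1-5 conversion below are illustrative assumptions.
from statistics import mean, stdev

SCALE = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
         "agree": 4, "strongly agree": 5}

responses = ["agree", "strongly agree", "agree", "neutral", "strongly agree"]
scores = [SCALE[r] for r in responses]

print(f"n = {len(scores)}")                            # n = 5
print(f"mean = {mean(scores):.2f}")                    # mean = 4.20
print(f"sd = {stdev(scores):.2f}")                     # sd = 0.84
agree = sum(1 for s in scores if s >= 4)               # agree or strongly agree
print(f"agreement: {100 * agree / len(scores):.0f}%")  # agreement: 80%
```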
Semi-structured Interviews. The third source of evidence was semi-structured
interviews of administrators at the selected institutions who play a key role in the
strategic plan’s evaluation. Standardized interview questions were employed, in order to
reduce interviewer effect, facilitate data collection, provide continuity, and assist data
analysis. As with the survey questions, each interview question was linked to a
component of a logic model. The standardized interview questions facilitated consistency
in the data collection process (see Appendix E). The interviews were semi-structured, in
that the questions and sequence were specified in advance in an outline form, yet the
interviewer decided the wording of the questions during the course of the interview. The
questions provided a more comprehensive data collection, yet the interviews remained
fairly conversational and situational (Patton, 1987). As Patton (1987) suggests, this style
of interviewing allows the interviewer to build a conversation within a particular
predetermined subject.
The interview sample for this case study was purposefully limited to staff or
administrators from the institutions to focus on the institutions’ evaluation processes.
Depending on the organizational structure of the institution, individuals from the Offices
of the President, Provost, Institutional Research, president’s cabinet, Strategic Plan Task
Force, or other committees that are involved with the strategic plan evaluation were
selected for interviews.
Each potential interview participant was contacted initially by mail, then by a
follow-up telephone call to schedule an interview (see Appendix F). The cooperation of
the participant was then confirmed. If that person agreed to participate, the researcher
scheduled a 60-minute interview at the participant's choice of location. Interviews were
conducted by the researcher and were recorded using audiotape.
With permission, the interviews were recorded to capture full and exact
quotations for analysis and reporting. Also, the researcher took notes during the
interview. After the interview, the researcher transcribed the recording and sent the
transcription to the interviewee to make certain the content was correct, and to act as a
validation check. The interviews were analyzed for thematic content.
The procedure for the interviews was as follows:
1. Using the results from the document analysis and the survey, a list of possible
participants was drawn up.
2. The possible participant was contacted first by mail, then by a follow-up
telephone call to schedule an interview.
3. The cooperation of the participant was confirmed.
4. The interview was conducted.
5. After the interview, the notes and recording were transcribed word for word.
6. The transcription was sent for validation to the interviewee.
7. The interview summaries were approved by the participant.
8. The summaries were analyzed by reviewing for thematic content, and coded.
9. Emergent patterns and themes were noted.
Data Analysis and Validation
Yin (1994) suggested that case studies should have a general analytic strategy to
guide decisions regarding what will be analyzed and for what reason. The analytic
strategy for this study was to use a logic model to guide the descriptive framework of the
steps or methods of the institutions’ evaluations. To the extent that the evaluation of a
strategic plan can be described within a logic model, as a sequence of decisions and/or
activities, the research focused on the type of methods that were employed for the
evaluation.
The analysis of the study began with the case data. The case data consisted of all
the information gathered at each institution: the documents, interview data, and survey
results. The case data were accumulated, organized, and classified into a manageable case
record. The case records were used to write the narrative, a descriptive picture of how
higher education institutions evaluate strategic plans. Once the evidence was displayed in
the logic model components, the narratives were then presented thematically to provide a
holistic portrayal of these institutions' evaluation methodologies in chapter four
(Patton, 1987).
To establish the quality of the research, validity and reliability tests were applied
to the study. The study was based upon multiple sources of data, which, in itself,
provided the triangulation that confirmed the construct validity of the processes.
Furthermore, the use of multiple cases within this research provided external validity as
well. The use of replication (multiple cases) established that the study's findings can be
generalized (Yin, 1994). The consistency in the overall patterns of the data from different
sources and reasonable explanations for differences in data from various sources
contributed to the internal validity of the findings. In addition, the use of protocols for
each data collection technique strengthened the study's reliability.
Within the data sources themselves, the documents were examined through content
analysis. As Patton (1987) describes, "content analysis involves identifying coherent and
important examples, themes, and patterns in the data” (p. 149). Inductive analysis was
also used in identifying emergent patterns, themes and categories that came out of the
data. The other two data collection techniques (interviews and survey) helped strengthen
and corroborate the document analysis.
The second source of data, the survey instrument, emerged from the literature
regarding strategic planning. The Likert-type scale survey questions were written by Dr.
William Nunez for his study of "Faculty and Academic Administrator Support for
Strategic Planning in the Context of Postsecondary Reform," which in turn were derived
from previous research (Nunez, 2003). In his study, expert review and a factor analysis
were used to ensure the validity of the survey questions. In addition, Dr. Nunez
performed a reliability test, computing a Cronbach's alpha coefficient of .70 for the
survey instrument. This statistic estimated the internal consistency reliability by
determining how all items on the survey related to all other items and to the total survey,
and therefore measured how reproducible the survey instrument's data were.
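For reference, Cronbach's alpha is computed from the item variances and the variance of the total score: alpha = (k / (k - 1)) * (1 - sum of item variances / total variance). A minimal sketch with hypothetical item responses (not Dr. Nunez's data):

```python
# Minimal sketch of Cronbach's alpha; the 4 x 3 response matrix is hypothetical.
from statistics import pvariance

# Rows = respondents, columns = survey items (1-5 Likert scores)
data = [
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 5],
    [2, 3, 2],
]

k = len(data[0])                                     # number of items
item_vars = [pvariance(col) for col in zip(*data)]   # variance of each item
total_var = pvariance([sum(row) for row in data])    # variance of total scores

alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(round(alpha, 2))  # → 0.98
```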
In order to measure the content validity of the third source of data, the interview,
expert evaluation of the questions was sought. A panel of higher education experts in
strategic planning and survey methodology was asked to review the interview questions.
Each panel member received a copy of the interview questions and the first chapter of the
dissertation proposal. Using an assessment document (see Appendix G) that incorporated
a five-point Likert-type scale (excellent = 5, poor = 1) for each question, the panel
members were asked to evaluate the questions and rate the degree to which the items
were appropriate as measures. The panel members were also given the opportunity to
offer comments. The researcher reviewed the rankings and comments and
revised or removed items based on the panel member feedback.
Dr. Carol Kominski, Associate Vice President for Institutional Effectiveness and
Research, Texas Woman's University

Dr. Victor Borden, Associate Vice President, University Planning, Institutional Research
and Accountability, Indiana University

Mr. Darrin Harris, Consultant for the Office of Quality Improvement, University of
Wisconsin

Figure 3.1: Expert Panel Members
Once the interview was conducted, the individual interview summaries were sent
to the participant for their review, acting as a member check. Thematic codes were
developed based on the review of the interviews per institution. The interviews and
thematic codes were read independently by another reader. The researcher and second
reader met to discuss the codes and compare responses, resulting in an inter-coder rating
of 98%. The data obtained through the interviews were corroborated by the other two data
collection techniques. A few other methods were employed as well, in order to determine
the trustworthiness of the interview results. On-site interviews offered an opportunity to
spend time in each institution's culture, which provided a contextual framework as well
as the opportunity to identify characteristics or aspects of the culture that contributed to
the study of that institution’s strategic plan process. In addition, by interviewing multiple
people at each institution, the multiple perspectives or interpretations strengthened the
validity of the findings.
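The inter-coder rating of 98% reported above is presumably a percent-agreement statistic: the share of coded segments to which both readers assigned the same thematic code. A toy sketch (the codes and assignments below are hypothetical examples, not the study's data):

```python
# Percent agreement between two coders over the same coded segments.
# The thematic codes below are hypothetical examples.
coder_a = ["resources", "reporting", "benchmarks", "leadership", "reporting"]
coder_b = ["resources", "reporting", "benchmarks", "leadership", "culture"]

agreements = sum(a == b for a, b in zip(coder_a, coder_b))
rate = 100 * agreements / len(coder_a)
print(f"inter-coder agreement: {rate:.0f}%")  # → 80% on this toy data
```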
Summary
As discussed in this chapter, a revelatory case study can provide an examination of
a situation that has not previously been scientifically investigated. This research, through
the development of an in-depth case study, has gone beyond the typical study of the
implementation of a strategic plan in a higher education institution, proceeding to a more
substantive analysis of how strategic plan initiatives are evaluated. By reviewing the
evaluation methods and procedures of strategic planning processes in higher education
institutions, themes or trends were identified that encouraged the development of a
new evaluation model described in chapter five. The results of this study thus lead to an
improved understanding of the evaluation methodologies employed to assess a strategic
planning initiative at higher education institutions and provide real-life examples that are
being tested and used, thereby providing a descriptive framework. The study also
provides insight for practitioners of strategic planning into the practical implications of
evaluating their strategic plans.
CHAPTER IV: RESEARCH RESULTS
The research for this study has focused on the methodologies employed by public
research-intensive higher education institutions in the evaluation of their strategic plan
initiatives. Four institutions were selected as case studies. Their selection was based on
two fundamental criteria: a location within the United States, and an affiliation with the
Association of American Universities (AAU). Furthermore, each selected institution had
a university-wide strategic plan initiative that had been in place for a number of years;
was in the initial stage of its evaluation assessment; and was willing to participate in this
study. The four case studies were all large, public, research-extensive institutions, with
enrollments of over 25,000 students. They all employed 15,000 or more people
and had annual budgets over $1.2 billion. To protect the confidentiality of the
participants, and in accordance with standard research practices, the names of the
institutions will not be revealed.
The results reported in this chapter stem from the combination of three types of
data collection: document analysis, surveys, and interviews. This chapter, therefore,
features detailed examinations of the documents, tables and figures from the survey
results, and direct quotes from the participants. The data were synchronized according to
the logic model components to ensure that the research results will provide a
comprehensive analysis of the evaluation process. The following table provides an
overview of sources of evidence from the four universities (see Figure 4.1).
University A: Documents reviewed: N = 24. Survey: N = 40, 45% response rate.
Interview participants: (1) Provost; (2) Director of Strategic Planning and Assessment;
(3) Vice Provost for Research; (4) Vice Provost for Engagement; (5) Director of Budget
and Fiscal Planning; (6) Director of Institutional Research.

University B: Documents reviewed: N = 33. Survey: N = 37, 24% response rate.
Interview participants: (1) Executive Vice President of Business and Treasurer;
(2) Executive Vice Provost for Academic Affairs; (3) Director of Institutional Research.

University C: Documents reviewed: N = 30. Survey: N = 26, 35% response rate.
Interview participants: (1) Executive Vice President of Business and Treasurer;
(2) Executive Director of the Office of Planning and Institutional Assessment;
(3) Director of Planning Research and Assessment.

University D: Documents reviewed: N = 38. Survey: N = 39, 23% response rate.
Interview participants: (1) Executive Associate Provost; (2) Interim Director of Financial
Planning and Budget.

Figure 4.1: Sources of Evidence Overview
This chapter is organized according to the universities studied. For each
university, the section begins with a description of the sources of evidence, with the
attending data coded into the logic model components. A matrix is provided as a visual
reference indicating how strong or weak the evidence sources were in each one of the
logic model components. The appendices present the list of documents, detailed survey
results in tables, and lists of positions of those who responded to the survey and took part
in interviews. Although all the quotations from the strategic planning documents are open
to the public and available on websites and in published reports, the results have been
written here in such a way as to protect the confidentiality of the survey and interview
participants: neither the institutions nor the participants are named.
A description of each university’s evaluation methodology is presented after the
matrix. For each university, the evaluation process is defined by describing the key
participants, or main evaluators, the measures, and the reports. Also, each university
section is organized by themes taken from the data to further guide the reader. Although
there are some similar themes between the four case studies, each university had some
unique characteristics within their strategic planning process that influenced the
evaluation of their plan. The case studies are written in such a way that each one tells the
story of how the institution has evaluated its strategic plan.
The synthesis of the results is presented in the cross case analysis. The chapter
concludes with a conceptual model, which was produced from this research.
University A
With an enrollment of approximately 40,000 and an annual budget of $1.4 billion,
University A serves as the leading public higher education institution of its state. Along
with its growing regional and national stature, though, has come heightened public
oversight and demands for accountability, which in turn has led to an increased reliance
on strategic planning in recent years. The university is currently in the fifth year of its
six-year strategic plan initiative, which has called for an annual $156 million in new
resources to be used to support and facilitate learning, research, and community
engagement. The resources identified to support the strategic plan were: state
appropriations, federal appropriations, fees and tuition, sponsored funding, internal
reallocation, private giving, revenues for licenses and patents, and revenues from other
sources.
Although data have been collected and annually reported on each of the strategic
plan metrics and benchmarks for the last five years, this university is now beginning to
summarize and evaluate the results for a comprehensive review of its plan through some
additional activities and reports.
University A’s strategic plan included a mission, vision, and three major goals: 1)
to achieve and sustain preeminence in discovery; 2) to attain and preserve excellence in
learning through programs of superior quality and value in every academic discipline;
and 3) to effectively address the needs of society through engagement. Each goal had
several characteristics, resulting in key strategies that had specific measures. In addition
to the three main goals, overarching strategies with measures of their own figured into the
process as well. Some of these measures were then compared to eleven peer institutions.
In all, there were 49 metrics and 24 benchmarks.
Sources of Evidence and Logic Model
Sources of evidence for University A included public documents, the Strategic
Planning Evaluation Survey, and interviews of key personnel. The following documents
were analyzed: the strategic plan, the annual reports, news releases, the strategic plan
website and its supporting documents, as well as the Board of Trustees’ website, the
president’s website and the Office of Institutional Research’s website (see Appendix H).
The Strategic Planning Evaluation Survey (see Appendix C) was e-mailed to forty upper-
level administrators at University A, including the original Strategic Plan Task Force and
the Strategic Plan Review Committee, both of which had memberships representing the
vice presidents, deans, faculty, administrators, and clerical and support staff. The survey
was also e-mailed to those deans and vice presidents not serving on the other two
committees, as well as the faculty senate chairman, the managing director of the
treasurer’s office, and the director of institutional research. A total of eighteen responses
were received, providing a response rate of 45% (see Appendix K). In the open-ended
question asking respondents to state the office or area where they were employed, ten
replied that they were in academic areas, and the other eight indicated an affiliation with
administrative units. The survey questions themselves were linked to the logic model
components. Interviews of six members of the upper administration at University A were
also conducted with participants from the Offices of the President, Provost, Institutional
Research, and Budget and Fiscal Planning (see Appendix I). Consistency in data
collection was achieved by asking standardized interview questions linked to the logic
model components (see Appendices J and K). A visual reference of how strong or weak
the research sources were for each one of the logic model components is presented in the
logic model evidence matrix of University A (see Figure 4.2).
The matrix reports, for each logic model component, the evidence found in the three
sources (document analysis, survey results, and interview results):

Input (reflections made on the use of resources, including human, financial,
organizational, and community, that have been invested into the strategic plan
initiative): document analysis, 5 references; survey results, four items with responses
that ranged from 72% to 100% for somewhat agree or strongly agree; interview results,
14 responses.

Activities (review of the processes and actions that were an intentional part of the
implementation, such as tools, technology, and services): document analysis,
2 references; survey results, 6 questions with responses that ranged from 77% to 94%
for agree or strongly agree; interview results, 11 responses.

Output (whether the evaluation methodology included gathering data or results on the
activities): document analysis, 10 references; survey results, 3 questions with responses
that ranged from 83% to 94% for agree or strongly agree; interview results,
13 responses.

Outcomes (short-term and long-term outcomes consisting of specific changes resulting
from the outputs, including attitudes, behaviors, knowledge, and/or skills): document
analysis, 16 references; survey results, 4 questions with responses that ranged from 83%
to 94% for agree or strongly agree; interview results, 24 responses.

Impact (fundamental intended or unintended changes occurring in the organization or
community): document analysis, 9 references; survey results, 7 questions with responses
that ranged from 83% to 94% for agree or strongly agree; interview results,
21 responses.

Figure 4.2: Matrix of University A's Evidence by Logic Model Components
Evaluation Methodology
At the onset of its strategic plan, a task force was assembled to establish an
outline for what the university wanted to achieve, how it would achieve it, what measures
would be used to determine progress, and what measures would be used to determine
when the objectives had been reached. Significantly, though, there was no mention of
how exactly the university would assess the plan at the completion of the initiative. As
University A’s administrators are just now beginning a comprehensive evaluation of their
strategic plan, there has not been a public summary or comprehensive report to date.
However, there have been several news releases (e.g. newspaper articles and web news
announcements) discussing the impending review of the strategic plan.
In response to the Strategic Planning Evaluation Survey’s questions about the
evaluation methodology, 88% responded that they had evaluated and offered advice on
the strategic plan, and 95% agreed that the strategic plan activities had helped to
accomplish University A’s mission. An open-ended question asked respondents to
describe briefly the evaluation methodology at their institution. Only one stated that
he/she did not know of any methods or steps. The most common response (N=14, 77%)
was that University A used measures and reported its progress in an annual report.
During the interviews, participants were asked to discuss how the institution evaluated
the strategic plan. Most replied that the specific measures were key to evaluating the
strategic plan. As some of these measures were used as benchmarks against other peer
institutions, these comparisons were spoken of as a way to evaluate the plan as illustrated
by these quotes:
Some of them are very simple quantitative measures where we have benchmarked ourselves against our peers on certain metrics. We simply look at their numbers every year, we look at our numbers, and compare to see if we have made any movement.
First, you have to look at what the goal really met and the characteristics that define attainment of the goal. Then, looking at those characteristics and the measures that go with each, report on the progress…. Also, it would be compared with the peer group—to see if it has moved in a positive direction to the peers. If it appears progress has been made, but you have also fallen behind your peers compared to when you started, then I would say it was not sufficient progress.
Four survey respondents (22%) mentioned that a review committee was assigned
to evaluate the plan. This was certainly borne out in the following detailed response:
[An] annual assessment of progress with a report [including] metrics/benchmarking data, [an] investment return analysis, [and] a five-year comprehensive review—retrospective and prospective—focusing on the "big picture" with salient aspects of goal achievement and recommendations... conducted by [an] across-campus review committee with a report to the president and the Board of Trustees.
Thus, even though a formal review has not been made public, there is evidence clearly
acknowledging the beginnings of an evaluation.
Main Evaluators. After the fifth year of the strategic plan, a review committee of
approximately fifteen university employees was formed. Faculty, administrators, staff,
and student representatives served on this committee, which articulated a comprehensive
view of the strategic plan that was passed along, in turn, to the president and Board of
Trustees. However, there was a mixed reaction when the participants were asked who
was involved in the evaluation of the strategic plan. The three general responses given
indicated that the review committee, the upper administration, or all levels of the
university were involved in evaluating the plan. This mixed reaction may be attributable
to a lack of publicity regarding the role of the review committee and/or the fact that its
findings were not made public.
Seven respondents (39%) in the survey stated that the upper administration was
responsible for the evaluation. Some of the typical responses in the interviews were:
“I’m assuming that the president’s cabinet has discussed the progress of the
strategic plan on a regular basis.”
“That happens at a really high level. Who is able to say [if] things are not
going as planned? Probably the provost’s office.”
One administrator summed it up this way:
Clearly, the [president’s] office, and the Office of Institutional Research [are actively involved]. The deans are very sensitive to this and they are actively involved with this every day. I would say in some colleges department heads are involved, they may have made it part of their evaluation thinking as well. It is probably not part of the everyday thinking of many faculty members, which is probably okay. I don’t know if it needs to be. It is those of us that are struggling with the overall directions and policies of the institution that need to worry about strategic planning.
The other participants thought that all levels of the university were involved in the
evaluation of the plan. One-half (N=9, 50%) of the survey respondents stated that the
evaluation was at all levels. As one subject commented: “I think the whole leadership of
the university is involved, down to the department level.” Another noted in more detail:
“It is at all levels because we collect data from the bottom up. Even at the departmental
level they [provide] information. It is … pulled [together] all of the academic
information…and the non-academic information.”
Reports. Each November the president of University A reported on the status of the strategic plan to the Board of Trustees. In addition to this presentation, an annual report was prepared. This report was a formal three-part document about progress made on the strategic plan, comprising the President’s Report (providing both a general overview of progress and a message from the president); a one-page executive summary, “Progress on the Strategic Plan”; and a detailed report titled “Benchmarks and Metrics Report,” providing an update on each metric and benchmark. This annual report was distributed widely and made available to the public, thus providing the campus and community with an update on progress made toward achieving the strategic plan.
This report was often brought up by those interviewed, most of whom recounted
that University A periodically presented specific strategic plan data to the Board of
Trustees, in addition to furnishing key stakeholders and the public with an annual
published report:
We have an annual grand accounting to the Board of Trustees. It is an annual ongoing process. Once a year we report to our Board of Trustees. We collect metrics on every one of the strategic plan’s initiatives. It’s a continual ongoing process of collecting data, information, and then making reports for the Board of Trustees. On the annual basis a three-part report is prepared for the Board of Trustees. There is a corporate view that provides a broad overview of the status of the plan for the past year. There is a more detailed summary of the specific goal areas … [and] a more detailed report prepared by my office. In addition to that, there is the [report] that is really the nitty-gritty detail that is given to the Board members, the President’s cabinet, and to the upper-level administration of the university.
Measures. There were a number of measures that University A used to track
progress. These measures were collected and reported every year for the annual report.
Within the survey, 94% of the respondents agreed that University A had a formal process
for gathering and sharing internal and external data. The process of defining, collecting,
analyzing, and reporting these measures seemed to be well understood. Consequently, it is not surprising that, when asked about the measures, people could elaborate on the process. In the interviews, most said that the original measures were still the primary factors in the evaluation, even if they had been modified a bit. Two
participants elaborated on what they meant by “modifying” the measures:
As we went through and refined things, we did not make any drastic changes in terms of the metrics. When the committee first created the metrics, some ideas were not necessarily practical or possible. There was some struggle to interpret what they may have meant and to qualify their idea. Although … those metrics were kept, since the committee decided that they wanted them.
When the strategic plan was first developed, the measures weren’t precisely stated. The benchmarks and metrics were subject to definition…. There were a few rare cases in which we had to modify the definitions based on the availability of information or on further review of what the measure really meant and what was of value to us. However, for the most part, I would say we maintained the same set of measures.
Mid-course Adjustment. Three years into University A’s strategic plan, a
“mid-course adjustment” was performed due to three factors: the need to modernize
administrative computing applications, the need for attention to the repair and
rehabilitation of existing buildings, and the decision to raise the goal of the capital
campaign due to its early successes. The mid-course adjustment thus modified goals and
resources of the strategic plan, prompting one of the participants to note:
We have on several occasions made course corrections. I’ll give you two examples. The campaign … had an original goal to raise $1.3 billion. Very quickly we realized we could do better than that and we raised the goal to $1.5 billion. That was an early correction. We initially said we would add three hundred faculty to the campus over a five-year period. It became apparent to me that it would take closer to seven years. The course correction was to adjust that out to seven years. Actually, we took the campaign out to seven years as well. It just made sense to do that.
Another administrator reflected on the sensible need for a mid-course correction:
Why didn’t we realize that [repair and rehabilitation] problem and make it part of the strategic initiative earlier? You’ve seen the numbers—this didn’t happen overnight. It should have been part of the examination when we established the plan. Or maybe they weren’t open to it when the plan was created. Maybe that is a flaw in the process.
Almost as if responding to this view, another interviewee had this to say:

Any strategic plan has to be able to absorb mid-course corrections without having to revise the plan entirely. There needs to be some resiliency in the plan so that if external conditions shift, you are able to make adjustments without having to change your plan. In our case, an example was once again due to the insufficiency of the state funds. The university did not receive funds for facilities. Because of that we had to create a new mechanism which included some additional fees, but also included new reallocation of funds to remedy that over time. A positive example is the fundraising campaign. We were meeting the goals and so we increased them.
The idea behind a mid-course correction, then, was that it allowed for flexibility in
dealing either with unforeseen obstacles threatening to derail the plan, or with surprising
opportunities that could just as easily make the plan irrelevant. In the words of one
interviewee:
As long as you remember that the decisions can be tweaked, modified, you go in a general direction, a strategic direction for the institutions. For me that is the real strategic element of what we do. It’s the process that is more important than the specific goals, since they may not be right.
Integrating Budgeting with Planning
When participants were asked if the evaluation of the strategic plan was an
integral part of the planning process, all responded resoundingly in the affirmative:
Yes, it’s critical. Reviewing the measures over time and establishing baseline data is a critical and essential component [of the evaluation]. If those measures hadn’t been looked at on an ongoing basis, I don’t think we would have been as effective in evaluating our plan as we have been.

Oh yes. It drives everything we do. All decisions are at least put into the context of the strategic plan.
University A certainly came to recognize the sometimes-fragile interdependence of
various funding sources. From the beginning, the strategic plan listed key areas for funds
in supporting the strategic plan: state appropriations, federal appropriations, fees and
tuition, sponsored funding, internal reallocation, private giving, revenues for licenses and
patents, and revenues from other sources. When the interview participants were asked if
the resources that were needed to implement the strategic plan were received as expected,
some firmly answered “yes.” Others, however, added that state funding had not matched
expectations:
We laid out a resource-planning scheme, and we have exceeded in some areas and fallen short in one particular area that has to do with the state support. The state’s economy was affected by post-9/11…. We did not receive what we had assumed or anticipated that the state would be able to provide. The state did provide us some new funding for new initiatives, but not as much as we had expected. On the other hand, the private fundraising had exceeded the initial goals and is still exceeding the goal.

The state appropriations were not what we expected them to be. Additional funding for those areas as I mentioned before [in reference to the mid-course correction] was needed, so we had to add a special fee for students to come up with that money for repair and rehabilitation, for example. To the extent that we could, the allocation of what was available to us was pretty much kept in line. However, additional funding was required because of the additions and because we didn’t get the appropriations we had expected.
When asked if the allocation of those funds followed the original plan, all interviewees
agreed:
It was programmed ahead of time, and we basically marched right in tune. Other than our mid-course correction, we stuck to our plan of where we were able to raise the resources for our allocations. We did a pretty good job at that.
Yes, they did, except where we didn’t get the state funding. A few initiatives were slowed, but in several instances, we had a backflow that bridged the gap with other sources of funds. So for the most part, yes.

Yes, with one exception. We assumed at the beginning that we had five sources of funding. We assumed that we would fundraise; that has worked fine. We assumed that we would reallocate; that has worked fine. We had assumed we would increase our sponsored program activity; that has worked fine. We assumed we would raise tuition; that worked fine. We assumed that the state would, at least, stay the course, and that hasn’t worked at all. I’ve been here six years, and if we get the budget increases that were recommended this year, that will be the first time in six years that we have seen any budget increase from the state. So all the progress we’ve made, which has been substantial, is from these other four sources.
In the responses to the survey, 83% thought the results of the strategic plan had
been used to evaluate the distribution of resources. The increase in revenues from the
various sources, along with the redirection of existing resources, were used to fund new
opportunities with the greatest impact for strategic priorities. The strategic plan
established the framework for setting the annual priorities and guiding the major
budgetary decisions. Many of those interviewed spoke of new initiatives established by
the academic colleges that were related to the strategic goals of the university. To get
university funding for these new initiatives, the colleges had to demonstrate how they had
first internally reallocated funds and how their initiative would advance the university’s
overall strategic goals. One described it in this way:
We have a requirement of a 2% reallocation of funds… It’s an idea of recycling existing money, which can then be targeted and focused toward the strategic objectives… When [the president] is evaluating the incremental requests for funds, his first question is: What have you done for yourself? He is more inclined to fund those activities where someone has come up with a part, providing their own share from whatever funds they can reallocate. The fact that those questions get asked in the context of the strategic plan is just another thing that reinforces the effectiveness of the plan…. It is an ongoing thing; it is integrated with the budget planning.
One upper-level administrator described her experience with these types of strategic
initiatives:
I would take a program that had potential, had demonstrated that, was willing to reallocate its own resources and take steps, have a plan, and show me how it would do it. I would help them. And we have tried to do it in a few cases, with mixed success. I think it has worked very well in some cases, and in a few cases it worked less well.
All of the interviewees stated that there was a direct relationship between funding
decisions and the strategic planning initiatives. This was most plainly stated by one
participant:
The strategic plan goals, priorities, and strategies are intrinsically connected with the budget allocation process on an annual basis. We do an annual review of the actual funding allocation to the goals and the corresponding changes in the metrics associated with the goals. Every year the budget document is compared showing direct connections. In other words, all the dollars have been laid out in terms of strategic plan goals and priorities.
Communication
Besides the annual report and presentation to the Board of Trustees, the president
also shared the results of the progress on the strategic plan with the campus and
community through various venues. Typically, his public speeches included several
references to the strategic plan. There was also a strategic plan website linked to the university’s main webpage, the president’s website, and the website of the Office of Institutional Research. Moreover, news articles in the local, student, and in-house newspapers
periodically reported on the progress of the strategic plan. One administrator said that
when people are asked at other institutions “if their university has a strategic plan, they
might be able to answer yes or no, [but] would be hard-pressed to answer what the major
elements were. If you asked people around here, they would probably be able to tell you.”
Another respondent agreed:
I just think, from my vantage point, one of the main reasons that strategic planning has been so successful at [University A] has been all the various communication efforts that were made. I think that has to do with the leadership of [the president] and his community visits. I don’t think there is a person on campus who, in one fashion or another, doesn’t know that we have been undergoing a strategic planning effort over these years, and I think that has been a beneficial thing.
Some saw the strategic plan as an effective communication tool in itself, since it clearly
articulated University A’s mission and vision. This view was reflected in several
interview responses:
It is a very effective tool to get people thinking. Strategic planning is not about getting everybody to do what you say. You get them pointed in the same direction. It is a good communication mechanism with the central administration. That is part of the thing about strategic planning, it helps articulate to the stakeholders what the key elements [are] and what we are trying to target.
By using the strategic plan to convey the direction it wanted to go, University A
effectively facilitated positive interactions with the state legislature and increased its
general fundraising efforts. One respondent with specific budgetary and financial
planning duties for the university reiterated this point:
With the strategic planning activity, and the marketing of it, we have actually increased the amount of resources the institution has, even in these dire times. The president was insightful; he did his homework and marketing up front… Did the [strategic planning] activity give us any more money? I am absolutely certain it did.
Culture
Some of the interviewees noted that University A experienced an overall change
in culture after having adopted strategic planning. Using elements of planning such as
implementation, resource allocation, and the integration of budgeting and planning, a new
mindset was created and quickly became pervasive across the campus. One administrator
described the transformation this way:
Compared to six or seven years ago, that is a dramatic change. We learned about this from the review committee that created the comprehensive report. The strategic plan has transformed the culture of the university and moved it to a different level. Everyone asks for data before making a decision, even at the dean’s level. It is data-driven decisions.
Thus, while effective and inspiring leadership is certainly a necessary component in motivating an institution to adopt strategic planning, it is also clear that a university’s culture must adapt to the point where core internal constituencies are willing (and in some cases eager) to take the initiative as well. As one respondent put it: “If you
don’t get the buy-in, then it will not work. There needs to be an excitement, a passion for
this, which I think we have seen at this university.”
The chairman of the Board of Trustees was noted in an in-house news article as
having said that a formal review of the strategic plan had been undertaken, and there had
been a preliminary discussion of future goals for the next plan. His statement implied that
in developing the requirements for the next plan, significant consideration had been
accorded to the review of the current plan. In addition, the president of University A had
been traveling around the state during the final year of the plan to receive comments on
how well the strategic plan had been received by the community, and to receive
suggestions on what goals the university should pursue in the future. These comments
and actions, along with other reports generated (particularly that of the review
committee) illustrate that University A has channeled great effort toward a systematic
evaluation of its strategic plan. This characterization is also borne out in the fact that the
data points for this particular case study readily correspond with the logic model
components. With this in mind, the comments of one interview participant are particularly apt: “It has been quite an exciting six years. This is one of the few
strategic plans that I have had the pleasure of working on and implementing that has
worked.”
University B
Like the first institution, University B has risen to regional and national
prominence in recent decades. It features an annual budget of $3.7 billion and enrolls just
over 50,000 students. And like University A, it, too, has recently turned to strategic
planning to solidify and enhance its position in the landscape of American higher
education. Created in 2000, University B’s strategic plan called for an investment of $750
million over the next five years focusing on four key goals: 1) becoming a national leader
in the quality of academic programs, 2) being universally recognized for the quality of
the learning experience, 3) creating an environment that truly values and is enriched by
diversity, and 4) expanding the land-grant mission to address society’s most compelling
needs. University B hoped to fund the plan from four sources: additional state support,
funds redirected from university resources, private fundraising, and increased tuition.
Sources of Evidence and Logic Model
Among the sources of evidence for University B are the following public
documents: news releases, speeches, the strategic plan, the annual updates, the annual
reports on strategic indicators, the strategic plan website (with attending documentation),
as well as the websites of the Board of Trustees, the president’s office, and the Office of
Institutional Research (see Appendix H). Another source of evidence is the Strategic
Planning Evaluation Survey (see Appendix C). It was e-mailed to thirty-seven upper-
level administrators and deans at University B, resulting in nine completed responses, a
response rate of 24% (see Appendix L). In the open-ended question asking respondents to
indicate the area in which they were employed, four responded that they were in
administrative units and five replied that they were in academic areas. Three upper-level
administrators were also interviewed at University B. These participants were from the
Offices of the Senior Vice President for Business and Finance, the Provost, and
Institutional Research (see Appendix I). The logic model evidence matrix of University B (Figure 4.3) provides a visual reference of how strong or weak each of the research sources was for the logic model components (see Appendices J and L).
Logic Model Components by Sources of Evidence (Document Analysis; Survey Results; Interview Results)

Input: Reflections made on the use of resources, including human, financial, organizational, and community, that have been invested into the strategic plan initiative.
Document Analysis: 10 references. Survey Results: 4 items with responses ranging from 67% to 100% for somewhat agree or strongly agree. Interview Results: 7 responses.

Activities: Review of the processes and actions that were an intentional part of the implementation, such as tools, technology, and services.
Document Analysis: 15 references. Survey Results: 6 questions with responses ranging from 55% to 100% for agree or strongly agree. Interview Results: 11 responses.

Output: Whether the evaluation methodology included gathering data or results on the activities.
Document Analysis: 32 references. Survey Results: 3 questions with responses ranging from 78% to 89% for agree or strongly agree. Interview Results: 13 responses.

Outcomes: Short-term and long-term outcomes consisting of specific changes resulting from the outputs, including attitudes, behaviors, knowledge, and/or skills.
Document Analysis: 18 references. Survey Results: 4 questions with responses ranging from 78% to 100% for agree or strongly agree. Interview Results: 10 responses.

Impact: Fundamental intended or unintended changes occurring in the organization or community.
Document Analysis: 5 references. Survey Results: 7 questions with responses ranging from 89% to 100% for agree or strongly agree. Interview Results: 16 responses.
Figure 4.3: Matrix of University B’s Evidence by Logic Model
Evaluation Methodology
A public document featured on the President’s website stated that University B
had “established a set of institutional goals and a matrix for evaluation” for its strategic
plan. Even though this was the one and only time a formal evaluation was mentioned,
89% of those responding to the present study’s Strategic Planning Evaluation Survey
indicated that specific activities and outcomes had occurred at the university through an
evaluation of the strategic planning results. Most of those interviewed, however,
described this evaluation as an “informal” process. One described it this way:
I think it happens because in the sessions at the leadership retreat, if we thought that it was on the wrong track we would just change it. So it is really not the same as doing thoughtful evaluation. I think it is just informal, really.
The strategic plan was originally a five-year plan, but was subsequently extended and
expanded for another five years. No formal evaluation was carried out after the first five-
year period, and the university is currently in year seven of this overall strategic plan.
Commenting on the plan’s timeline, one interviewee casually remarked: “That plan was
approved in December 2000, and there are some thoughts about having the plan
reviewed—going through some type of evaluation process. But that hasn’t gotten very far
yet.” Another added:
Well, our strategic plan is pretty generic, which is probably why we’ve been able to have the same one in place for seven years. Saying that we’re going to have a world-class faculty is kind of like “motherhood” in a research university. So I don’t think that we’ve really gone back and evaluated the academic plan. It’s just generally accepted that these are the goals that we want.
Measures. The four goals were specifically addressed through six characteristics and fourteen supporting objectives. Each of these fourteen objectives had a
general implementation plan and cost associated with it. Each year an update report
provided information about the progress made toward achieving the goals. A different
report, named the Scorecard, compared University B to its benchmark universities, nine
aspirational peer institutions. Within this report, thirty-nine measures tracked the progress made on the six characteristics of the plan.
All of the survey respondents agreed that data collection and analysis were an
integral part of the institution’s strategic planning process. The strategic plan measures
were often brought up in the interviews and described as the sole process of evaluating
the university-level plan. One interviewee reiterated that “[the director of institutional
research] presents institutional-level metrics that we [use to] compare ourselves to our peers.
That contextualizes the success, progress, achievements, [and] objectives of the academic
plan.” The measures had stayed relatively the same since their inception, with only minor
changes—as reflected in the following observations:
The [strategic] plan … measures haven’t changed. We might have tinkered with them a little bit, just because sometimes the data change. [For example], with student surveys, we changed from doing the HERI College Student Survey to doing NSSE. So you have things like that, but we’re trying to get at the same issues, basically. In all cases, I would say that the changes have been evolutionary, as people have learned more about the process of measuring performance, collecting metrics, and all of that.
Another interviewee said, “One thing people have all realized is that it is the plan that
works. So we haven’t made any changes at the macro level. The academic plan is still
pretty much the same academic plan that we have always had.”
Dialogues and Other Related Methods. Another way in which University B
measured its performance on the strategic plan was to engage in discussions with the
colleges. Colleges scheduled a “biennial dialogue” with the provost and senior leadership
to ensure that they were engaging in activities that supported the overall strategic plan, as
well as to determine their priorities for the following two years. The biennial dialogue
used data situated around the components of the strategic plan to solicit questions,
comments, and/or observations. This prompted each dean to develop a two-year leadership action plan both for his or her college and for himself or herself as the leader of the college. The provost linked these leadership plans to the annual performance review of each dean.
Each academic department and administration support unit was to incorporate and
integrate, as deemed appropriate, the strategic plan goals into its own respective plan.
Support units submitted performance indicators annually and their budget requests were
evaluated with regard to their support of the strategic plan. Academic and support units
engaged in program reviews and had to demonstrate the ways in which they had helped
the university to achieve its goals.
Reports. Each year the institution issued a progress report on its measures. An
annual report provided information on progress made toward achieving its goals,
including specific information by goal and strategy. An additional report, the Scorecard,
compared the university to its aspirational peers according to a set of benchmark
measures. Each year, both sets of reports were made available on the university’s main website.
Of those responding to this study’s survey, 89% agreed that the institution had a formal
process for gathering and sharing internal and external evaluation data. As one
interviewee described it: “The progress should really show up on the [strategic] plan
Scorecard.”
Although there was not a specific Board of Trustees meeting identified for a
review of the annual report of the strategic plan, it was often referenced in other speeches
or reports given to the Board. As one person put it, “Everything is always related to the
[strategic] plan.” In addition, the president and provost would give a state of the
university address each year to the University Senate and campus community that would
relate activities back to the strategic plan.
Main Evaluators. Although 89% of the survey respondents indicated that they
had evaluated and offered advice about the strategic plan at their institution, only 67% of
the respondents said that faculty, staff, and administrators were actively encouraged to
participate in the institution’s long-range planning process. This latter sentiment was more consistent with some of the comments from the interviews. When asked who
was involved in the evaluation of the strategic plan, most pointed principally toward the
upper administration:
It would be the people at the leadership retreat, and they are sort of passively involved. On the academic side, it is the provost, [who] involves the vice provosts in the various areas, [and] they engage with the deans and the governance structure.
[It involves] all the people that are involved in the leadership retreat, the cabinet, the council, the faculty leaders, and the Board of Trustees.
On the other hand, the biennial dialogues with the colleges were often mentioned in the open-ended survey responses. Because the colleges were engaged in this endeavor and the support units reported on their own performance indicators, also pegged to the strategic plan, some participants in this research considered this clear evidence that all levels had been incorporated into the evaluation of the plan.
Leadership Retreat and Agenda. Two years after the plan was implemented, the
president of University B left for another institution. The Board of Trustees clearly
expected the next president to retain the strategic plan, with allowances for adaptation to
the President’s style. This was recounted by one respondent who closely followed this
transition:
The hope is when [the president] came here he would stay five to seven years, but he didn’t. Before [the next president] was hired, we went through quite a bit of angst to get that strategic plan in place. We didn’t want to start from scratch. We wanted [the next president] to pick it up and move it forward in the direction that had [a new] imprint on it, but to still be consistent with the broader plan.
As part of that process, the incoming president instituted the annual Leadership Retreat. Each summer the senior administration would meet at a retreat to determine the next year’s priorities. From this meeting would emerge the Leadership Agenda, which identified short-term priorities, specific action steps, and resource allocations that would guide them throughout the ensuing academic year. These priorities would not be novel in
and of themselves, but rather be embedded within the strategic plan. One respondent
clarified it thus:
The idea was that you take the [strategic] plan… and each year you would pick something to emphasize and that would be the leadership agenda. It’s more the tactical implementation part of the strategic planning process.
Another described it this way:
The President holds a leadership retreat annually… In the leadership retreat, then, the President and senior administrators talk about what parts of the [strategic] plan they want to focus on for this year. So it’s kind of become—and this is a cliché, …. a “living document,” but I think it really is true in this case—so we have this generic document that we don’t really revisit or evaluate; we just accept as being appropriate goals. Then annually we talk about which ones are we going to work towards. And so, in a sense, that’s a kind of an evaluation. It’s a re-thinking about priorities and what’s important, given the new context or whatever’s happened.
So although they had a strategic plan, the Leadership Agenda guided the institution’s
immediate emphases. It was the Leadership Agenda that was most discussed and acted
upon. Therefore, even if the strategic plan goals were often described as broad, the
Leadership Agenda helped to identify more specifically focused action items.
Leadership
Typically, strategic plans are spearheaded by the president, who is strongly
influenced by the Board of Trustees and the Provost. These leaders direct the rest of the
university to engage in a strategic plan. If there are significant changes in this leadership,
the strategic plan is often discarded or radically modified. This is especially the case
when the strategic plan is widely thought to be the creation of the outgoing leader. To
some degree this seems to have been the case at University B, as some interviewees
expressed sentiments that the strategic plan was considered to be part of the president’s
endeavor, thus possibly spelling its doom once the outgoing president left. One
administrator noted that “[t]here is some feeling when this strategic plan was put into
place that it was more the President’s plan. There was some concern about the level of
consultation that created it.” Similarly, a college dean said that “[t]he current strategic
plan was developed, however, with little initial input from outside the President, Provost,
and Vice Presidents, and required some modification after it had been prepared.”
Significantly, though, University B had undergone several leadership transitions during
the course of its strategic plan, and was able to retain most elements of the plan and carry
it forward. Respondents summed it up this way:
We have had the same [strategic] plan across two presidents. [The second president] was brought in to stay the course on this [strategic] plan, but to have [the] imprint from the leadership agenda perspective. When [the second president] was brought in, I think [the second president] was given the plan and the Board of Trustees was pretty instrumental in saying, “This is our plan, this is what we want to follow.”
Interestingly, University B’s current president and provost will also be leaving before the
plan’s culmination. The common assumption is that the next president and provost will
continue the plan, much as the current president and provost were expected to do.
Reflecting this context, one administrator mused on the future:
Well, it is kind of hard to say that someone would come in and say “world class faculty, not important. Enhance the quality of teaching, not so much” So, I don’t know. It might be a matter of emphasis. One of our goals of the [strategic] plan is to help build [the state’s] future. That is where we think about things like commercialization, outreach and engagement, all of those kinds of things. Will outreach and engagement be such a big deal to the next person? Maybe, maybe not. …
Another described the change of leadership this way:
Now, a lot depends on the personal temperament you have for president. So when [the second president] followed [the first president]…[egos did not] get in the way in doing what needs to get done. [The second president] was willing to pick up what a previous president had done and improve upon it. If this next president is some kind of an egomaniac, like a Donald Trump type of personality, that can all unravel pretty quickly. So a lot of it depends on the nature of the leadership. Same [idea goes for] our provost. [The second provost] was committed to picking up where our previous provost had gone with that. So hopefully the next provost will do the same thing.
Driving home the point of hoped-for continuity, one respondent reflected on the situation:
“I think a new leader will want to put [his or her] mark on it. Maybe not say, ‘Is this
working?’ but ‘Are there ways we can expand it or enhance it?’”
Similarly, University B had also undergone a massive transition in its Board of
Trustees. The Board was composed of nine members (most appointed by the governor),
many of whom rotate off each year. However, a year and a half ago the legislature
changed the law and expanded the board from nine to seventeen. Thus, two of the
nine members rotated off under the previous system, but then eight new members were
added, making the majority of the board new within the last eighteen months. One
administrator was philosophical about the change, noting that this “is where you see the
value of the plan like this. ... It allows you to at least start on the same page.” Another
interviewee was a bit less sanguine:
[W]e are going through a change in our Board of Trustees and sometimes these things can guide what things they are interested in or not interested in, and now we are going through a change of presidents so my guess is that is going to have some influence.
Integrated Budgeting and Planning
One news article prominently stated that University B’s strategic plan had
established the framework for setting the annual priorities, thus guiding major budgetary
decisions. The strategic plan itself had stated preliminary cost estimates and possible
available resources for support of the plan. It acknowledged that these preliminary
scenarios were between $54.5 million and $117.5 million in continuing funds and
between $400 million and $555 million in one-time funds to support the strategic plan
initiatives. However, one interviewee added this clarification:
It is not really the case that when we built the [strategic] plan we put price tags on each thing and then looked for ways to fund it all. Because a lot of the plan wasn’t so much about meeting the target, but getting better. There are some areas where we can attach dollars… But whenever we had had specific dollars attached, those dollars have been allocated. For the most part it has been less specific than that.
The university had expected to fund the plan with state support, funds redirected from
university resources, private fundraising, and increased tuition. However, two events
significantly affected this funding plan. The state economy went into a recession for a
number of years, and the university had to petition the state in order to increase its tuition.
While this state has a tuition cap, University B was nevertheless able to gain a temporary
waiver and increase tuition beyond the cap for two years. The following comments are
representative of the comments made by administrators:
We got more [money] in some areas and less in others. I did a projection of what the strategic plan called for, and the biggest change was that we didn’t get the state funds that we thought we would. We adopted the strategic plan in December of 2000, and the economy crashed the next year. That affected state resources. On the other hand, we ended up getting more resources in some of the other areas. So in the end, it was sort of a wash, as I recall. I think we were a little bit naïve in the amount of resources we would be able to generate, particularly from the state.
The university had also implemented a new budget model around the same time that the
strategic plan was implemented. As one interviewee commented: “When we were
redoing the budget system, we rightfully said, ‘How can you do a budget system if you
don’t have [a strategic] plan?’ The answer is [that] we are doing those things in parallel.”
This new budget model has been described as an adaptation of Responsibility Centered
Management, a type of incentive-based budgeting. As described in a news release, the new
model restructured the colleges’ budgets to ensure their base budgets were aligned with
the needs of the strategic plan, in other words “to create budget practices with regard to
the allocation of new revenues … and provide incentives to reduce cost and generate
additional revenues needed to address academic priorities.” There was, however, still a
central pool of money available for financing certain strategic initiatives. Budget requests
were thus evaluated with regard to their support of the goals of the strategic plan. The
resulting situation was described by one administrator like this:
There is a percentage of the central pool that comes off to the provost automatically, for the Provost’s strategic reserve fund. That fund is what is used to invest in strategic ways in collaboration with the other vice presidents to achieve the academic plan.
It was noted that the strategic plan helped determine the distribution of the resources:
“We use the strategic plan to decide where resources are distributed.” Integrating the
planning with the funding was considered a critical component of the success of the plan:
“I think the important part in making this work, is aligning resources with the academic
priorities.” In fact, all of the survey respondents agreed that the benefits of strategic
planning were worth the resources that were invested.
[In the budget document for the Board of Trustees] It shows how we link our budget decisions to our academic objectives. We don’t spend any money at this university unless it is contextualized within the [strategic] plan. So this whole [report] is [strategic] plan related. This is another way we link what we do to the [strategic] plan.

The level of financing for the strategic plan was reported annually to the Board of
Trustees, and used to help guide the budget process for the following year. Normally, the
provost and the senior vice president for business and finance would make a presentation
on both the financial indicators (which are not a part of the strategic plan) and the
strategic indicators. The presentation was used to help guide and inform the budget
process and the leadership agenda for the following year. It was an annual process tied to
the budget and to the academic planning process. An official with specific responsibilities
for university business and financing concluded, “I think the success [of the strategic
plan] or lack thereof is … always contextualized around what our fiscal capacity is.”
Culture
Interview participants felt that after seven years of having the strategic plan in
place, it had significantly affected the culture of the university. Keeping the strategic plan
at the forefront and constantly referring to it in communications and in unit-level
planning had made everyone aware of the plan. These two comments expressed it best:
One of the things that was kind of interesting when we were interviewing for a provost (the last time, before [the current Provost] had the position), another one of the candidates was on campus and said “Wow, this is amazing. Everywhere I go people know what your academic plan is about.”
I think one of the biggest culture changes has been that everyone knows a direction we are all moving in. Giving it a name …, knowing that it is up on the web, you can look at it, the measures are there. We are being really honest, it is not a PR kind of thing. I think it has built an awareness and caused people to kind of coalesce around the goals.
Consistency of the Strategic Plan
The interviewees also reflected broadly on the experience of strategic planning at
their campus. Keeping focused on the same plan from year to year seemed to have helped
them remain attuned to their priorities and have a sense of progress. The following quotes
aptly illustrate this sentiment:
So the degree in which [a strategic] plan can help focus and also have some degree of consistency from year to year really helps in moving a place forward academically, even though you don’t have all the resources you think you can dream up. That has been the real value of this. You have a bit of tension between the people who like the big hairy audacious goals and like to pronounce victory by throwing a lot of money at something and think that is the way to be successful. I think the evidence shows the way to be successful is to know what you are doing and steadily make progress, so over a period of the years you make dramatic progress.
These comments are very consistent with the results from the survey question that asked
whether strategic planning had played an important role in the success of the institution.
The majority readily agreed that it had. Yet they also tended to register concern with
regard to future improvement. One administrator singled out a particular area for
improvement in the next strategic plan:
But we don’t measure anything about fiscal accountability. So maybe to recognize that while academics are the purpose of all this being here, there are really competing non-academic activities that have to happen to make the university work, and those should be evaluated regularly at the center too, as part of the same process.
These comments illustrate that even though University B had not undertaken a systematic
evaluation of its plan (particularly regarding finances), at least some stakeholders had
given it careful consideration.
Originally University B’s strategic plan was a five-year vision. The president that
initially headed up the initiative was quoted in a news release as saying:
We consider it to be a living document. We don’t consider it to be perfect. We intend to continue to revise and update this plan as we move forward. I encourage all of you to read it and help us improve it. What didn’t we get right?
Considering his subsequent change of professional plans, the university was clearly
obliged to take him at his word, and indeed continued on with the plan long after his
departure, and far beyond the initial five-year vision. Although there was never a
systematic evaluation of the plan, it has been informally assessed, as several interview
subjects and survey participants have noted. It has thus been possible to link responses
and references found in the documents, survey responses, and interviews to each
component of the logic model, demonstrating that this informal process had served as a
default evaluation method.
University C
While its overall budget and enrollment figures ($3.3 billion and approximately
40,000, respectively) generally correspond to those of the other institutions considered so far,
University C does feature one unique characteristic with respect to the present study:
strategic planning in one form or another has been practiced at this bastion of American
public higher education for decades. In 1997 it established a five-year university-level
plan, followed in turn by a three-year university plan. During the course of this study,
University C was in the final year of its third university-level plan (another three-year
plan), and was gearing up for the next planning cycle. The expectation was that it would
be returning to a five-year cycle. The strategic plan itself set forth six goals:
Enhance academic excellence through the support of high-quality teaching, research, and service;
Enrich the educational experience of all [University C] students by becoming a more student-centered university;
Create a more inclusive, civil, and diverse university learning community;
Align missions, programs, and services with available fiscal resources to better serve our students and their communities;
Serve society through teaching, research and creative activity, and service; and
Develop new sources of non-tuition income and reduce costs through improved efficiencies.
The strategic plan process at University C is often referred to as a “top-down/bottom-up”
process. The top-down characteristic is the administration’s mandate that every budget
unit must submit a strategic plan. General guidelines are distributed, asking each unit to
discuss: 1) its future directions, strategies, and measures of performance; 2) its strategic
performance indicators; and 3) an internal recycling plan that describes how the unit will
recycle 1% of its operating budget each year and redirect those funds to the highest
priorities. The bottom-up characteristic stems from the fact that the university-level plan
is only finalized once the unit plans have been reviewed. Each goal of the university-level
plan therefore has multiple strategies shaped by the strategic plans from the various
budget units. After the unit plans have been submitted, they are read, with an eye toward
identifying common themes, which in turn influence the way in which the university-
level plan is composed.
In addition to the university’s strategic plan, University C pays close attention to a
set of “strategic performance indicators.” The same set of indicators has been tracked for
several years now, and the university publishes them in a companion document to the
university-level strategic plan.
Sources of Evidence and Logic Model
In order to study University C’s strategic plan, several sources of materials were
obtained and reviewed, including the following public documents: news releases,
speeches, newsletters, journal articles, the strategic plan, the annual updates, and the
annual strategic indicators reports. University websites were likewise taken into careful
account, including those for the strategic plan, the Board of Trustees, and the Offices of
the President, Executive Vice President, Provost, Planning and Institutional Assessment,
and Finance and Budget (see Appendix H). In addition, the Strategic Planning Evaluation
Survey was e-mailed to twenty-six upper-level administrators and deans at University C.
Of the nine that responded (making for a 35% response rate), five stated that they were in
administrative units, two replied that they were in academic areas, and two did not
answer the open-ended question regarding the nature of employment area (see Appendix
M). Three upper administrators from the Offices of the Senior Vice President for Finance
and Business, as well as Planning and Institutional Assessment, were interviewed on the
campus of University C (see Appendix I). The executive vice president and provost
declined to be interviewed, stating that the executive director of the Office of Planning
and Institutional Assessment would be more appropriate for the interview.
For University C, the logic model evidence matrix presented below provides a
visual reference of each of the research sources by components (see Figure 4.4) (see
Appendices J and M).
Sources of Evidence: Document Analysis, Survey Results, and Interview Results, by Logic Model Component

Input
Reflections made on the use of resources, including human, financial, organizational, and community, that have been invested in the strategic plan initiative.
Document analysis: 6 references
Survey results: 4 items with responses that ranged from 78% to 100% for somewhat agree or strongly agree
Interview results: 5 responses

Activities
Review of the processes and actions that were an intentional part of the implementation, such as tools, technology, and services.
Document analysis: 11 references
Survey results: 6 questions with responses that ranged from 78% to 100% for agree or strongly agree
Interview results: 12 responses

Output
Whether the evaluation methodology included gathering data or results on the activities.
Document analysis: 26 references
Survey results: 3 questions with responses that ranged from 89% to 100% for agree or strongly agree
Interview results: 8 responses

Outcomes
Short-term and long-term outcomes consisting of specific changes resulting from the outputs, including attitudes, behaviors, knowledge, and/or skills.
Document analysis: 26 references
Survey results: 4 questions with responses that ranged from 78% to 100% for agree or strongly agree
Interview results: 11 responses

Impact
Fundamental intended or unintended changes occurring in the organization or community.
Document analysis: 15 references
Survey results: 7 questions with responses that ranged from 78% to 100% for agree or strongly agree
Interview results: 12 responses

Figure 4.4: Matrix of University C’s Evidence by Logic Model
Evaluation Methodology
Due in part to the top-down/bottom-up structure of strategic planning at
University C, the most common response among survey and interview respondents
regarding the evaluation process was that to evaluate the university plan, one really
needed to go back to the unit-level plans to determine whether articulated goals had been
accomplished. One interviewee provided this illustration:
Each strategy in this [university] plan comes from a unit-level plan. For example… “advance excellence in legal education…” was in the [law school’s] strategic plan. So… at the end of the cycle, you go back and when you read the unit-level plans again, you’ll find out if [the law school] did this or not.
The unit-level plans thus have strategic performance indicators that resonate with the six
overarching goals of the university plan. It was explained that since the university goals
are broader than the unit goals, the strategic performance indicators are more
meaningful at the unit level. Therefore, to examine the effectiveness of the university
plan, one must review the units’ strategic performance indicators and the data directly
relating to those indicators. An administrator reiterated this by saying that “[t]he strength
of the university plan really rests within the unit-level plans and how they measure
progress toward goals.”
The ideal of this integrated approach notwithstanding, several stakeholders at
University C have described the evaluation process for the university-level strategic plan
as informal at best and nonexistent at worst. Most often, the evaluation process was
referenced primarily at the unit level. The Education Criteria for Performance Excellence
from the 2004 Baldrige National Quality Program were often cited in internal documents
to help units evaluate their plans. These evaluations were usually tied to the preparation
and development of the next strategic plan for the unit. In creating their next plans, the
provost’s guidelines instructed units to discuss their current internal and external
challenges that had been identified through benchmarking, stakeholders’ feedback,
market analysis, and other assessment data. In various presentations to the Board of
Trustees, units made it clear that they regularly take into consideration student interest,
faculty expertise, developments in the discipline, and societal and state needs when
reviewing their plans. Significantly, the only comment to be found in the documents
regarding the evaluation process for the university was in a speech from the president to
the Board of Trustees, in which he says that as the campus was entering the final year of
the plan, the university was “reflecting on the results.” Other than that simple statement,
the evaluation process at the university level was not mentioned in any of the other
documents reviewed for this study. Yet within the survey itself, 78% of respondents
agreed that specific changes had occurred through a systematic evaluation of the plan’s
results. Based upon this information, one might say that the evaluation for the university-
level plan could be characterized as an informal process of tracking overall performance
indicators. The following quotes are representative of the comments made by the
interview participants:
We don’t have a process evaluating how well we are performing at the university level. We do have it for the unit level. Our trustees are aware of this, and they see it, they sort of know what’s in it. It’s not a formal step. We have evaluated the planning process, and we have evaluated the unit plans, but I’m not sure if we have evaluated this [university-level] plan. I think we can do a better job at systematically evaluating the plan. We are constantly looking at our strategic indicators. Based on how the plan is developed, it really is at the unit level.
Well, it is not the evaluation of plan so much as it is a statement of whether the goals that were outlined in the plan have been accomplished. It’s not whether it was a good plan because it had a mission, a vision, and a value statement; it’s “Did the unit do what they said what they were going to do?” That is really determined at the unit level. To be honest, I don’t know how important it is to assess at the university level, whether we are achieving these goals. They are such high-level goals. Are we aligning our missions and programs at the university level, or are we creating a civil and inclusive climate at the university level? It is so big and broad that it is hard to know.
Measures. In several documents it was stated that the strategic performance
indicators at University C were an attempt to answer the question, “How well are we
doing?” The indicators, or measures, are tracked to chart progress toward meeting the
university’s goals. In addition, each unit prepared its own strategic performance
indicators relating to both the university and unit goals. The strategic performance
indicators are analyzed every year.
All of the survey respondents agreed that University C had a formal process for
gathering and sharing internal and external data. It seemed that the process by which data
on indicators were collected, analyzed, and reported was well defined. Therefore, it is not
surprising that most interviewees mentioned the indicators as a way to assess the progress
of the plan. Even the brochure for the performance indicators stated that the measures
were a way to “carefully evaluate and improve [the university’s] performance.” One
person commented in the interview:
Well, what we would say is that we track the strategic performance indicators every year. That is how we evaluate the plan…You’ve seen the strategic performance indicators. They are mostly input and output; there are not many outcome measures. We could do a better job on that for sure.
Others likewise commented that even though the indicators were typically considered to
be the best way to track progress, it was recognized that other methods may exist,
although none were mentioned. For example, one individual said, “I would say at the end
of the day we are not hamstrung to say those metrics alone are the only way that we judge
how well we are doing.” Apparently some had maintained that such indicators were the
best—and perhaps only—way to measure progress. An administrator commented:
Yeah, there are people who would refute whether the metrics are really giving a true indication of whether you are making progress or not on a goal. You could debate that. That’s fair.
However, the indicators do seem to have remained the favorite method by which to track
progress, as they have now been used for almost ten years, and have since increased in
number. One stated it this way:
We have a set of strategic indicators, which evolve over time. I think there are 28 of them [in the previous report]. So what has happened with these strategic indicators, they just keep growing all the time. I think we have something like 40 now. At some point it stops being strategic. You are just measuring everything, like a university fact book.
Another commented:
I don’t think we have kept the same indicators, but we have had this approach for a long time that there will be indicators that are organized around the goal. In that way the approach has been similar even though the indicators have multiplied.
Main Evaluators. When asked in the survey who the main participants were in
the evaluation of the strategic plan, the majority (57%) said that it was the
upper administration. This corresponds to the comments made by the interviewees as
well:
The provost [and the Office of Institutional Planning and Assessment]
I would say informally the leadership of the university. But formally, I would say no, there is not a process for that.
A lot of the evaluations of the plans and other high-level decision-making depends on what the President is observing, people like that.
Perhaps because most did not know of an evaluation process for the university plan, or
thought of it as an informal process, it was not well understood how the process was
carried out or who its main participants were.
Reports. The strategic plan and the strategic indicators reports were easily
available on the web. Both reports were also highly polished, and their paper brochures
were of professional quality. Yet whether an annual report or presentation was given to
the Board of Trustees was not evident. The only official Board of Trustees statement
regarding the strategic plan was the approval of the plan in the minutes of the board
meeting. One administrator described it this way:
We share, of course, the strategic plan with the Board – get the Board’s input as we draft the plan. The Board sees it and gives us their reaction and input to it. So the Board has some involvement and certainly awareness of the plan. But to answer the question, no, we don’t go to the Board every November with a specific report on the plan.
There were other presentations or reports that mentioned working toward the goals, such
as the State of the University address. Also, the senior vice president for finance and
business mentioned that his presentations about the budget to the Board included how the
university strategic plan had guided the budget decisions. He said “it is more linked with
the discussions about budget and the priorities of budget and capital than it is an isolated
report.”
Integrating Budgeting and Planning
The strategic plan does not specifically identify exact levels of investments and
what the investment elements might be. The funding is described as following the
directions laid out in the plan. There are central monies that support the strategic
initiatives. For example, since there is a major maintenance backlog, some of these funds are
used for maintenance.
In the past, the university had required a small percentage of the units’ total
budgets for the central pool. Because the university plan was a conglomerate of the unit
plans, some of the central monies were also used to support unit plan initiatives. One
administrator articulated it in this way:
There were the president’s strategic initiatives. There was a central pool of money to support the life science consortium, the materials consortium, the environmental consortium, and what you would see in the plans, is that positions could be co-funded and you get money centrally for some of those positions. So there was a relation between central monies and what you saw in [units’] strategic plans.
For some time, the university had a planning council made up of deans, faculty
members, administrators, and some students. The planning council would review each
unit plan, consider the requests for resources, and recommend funding. Although this was
perceived as a powerful committee, the actual funding amounts were small in relation to
the total university budget. The executive director of planning stated it like this:
The other important consideration with all this, is we don’t have that much discretionary money anyhow. If people were looking to planning primarily as a way to get more money, to affect the allocation of resources, that was the wrong way to think about it. They should be thinking about writing a plan for themselves, not because it is going to have any impact on how the provost or president decides to spend the central monies.
The committee was disbanded in 2002, due to a restructuring of the review process.
Given its limited state resources, University C had to exercise internal budget
reductions and reallocations to fund strategic priorities and critical operating needs. More
recently it has changed this practice to ask each unit how it would internally “recycle” at
least one percent of its operating budget each year. Units are no longer required to
reallocate funds for the central pool. Guidelines to the units stated that the internal
reallocation would continue for five years. However, given the lack of predictability of
state funding of higher education, there could be a need to have central reallocation in
some years.
In the budget presentations to the Board, the budget was explicitly linked to the
strategic plan. Beginning in 1992, there was a deliberate process of budget recycling and
reallocation incorporated in the strategic planning process. Since that time, the
university’s budget priorities have reallocated funds from its administrative and support
activities to the university’s goals. In a news release in 2000, the provost stated:
Budgetary information is factored into the planning process to provide a realistic context for proposed changes and initiatives, and strategic plans are taken into account directly in budget allocation decisions.
By the time that news release came out in 2000, the University was in its ninth
consecutive year of internal budget reallocations and that practice continues today.
Culture and Leadership
The provost, in a 2001 news release, stated that over the years strategic planning
had become very much a part of the university’s culture. “I’m pleased with the attention and
care that the university community has given to this process and I’m proud of the
outcomes we’re seeing.” After so many years of practice, strategic planning had become
one of the university’s customs. One person described
the situation this way:
There is definitely a culture of planning here. I’ve been around here for 26 years, so I was here before it all started. I can remember at the beginning there was a lot of skepticism “isn’t this one more management fad” or that they would wait this out. Or people didn’t really understand it. That is just not the case anymore. People know that this is just how we manage the university. Like it or not, they are going to be involved in planning. That is just the way it happens, and partly this has been the experience people have gone through, it is like a generational thing. A lot of people who were here 25 years ago aren’t here anymore. Also, with the people that came in [after that] this is all they have ever known (at this institution). It is the way we do it.
However, it is more likely that the reason strategic planning has become such a way of
life at University C is because its upper administration has had such a long history with
the university. With the university leadership the driving force behind strategic planning,
it had become a tradition. An administrator explained it this way:
It is in the culture at [University C]. [The senior vice president for finance and business] has been here about 30 years. [The provost] has been here 30 years. [The president] started here as an assistant professor, then became an associate dean, left and came back. He has been here about 10 or 12 years. What you got is this culture of leadership … I mean they know the institution so well.
Linking the Strategic Plan to Other Processes
Participants in this study often mentioned that the success of the plan was evident
by the way it had been incorporated into other processes. Two examples of linking the
strategic plan to the other processes were the performance appraisal system and
professional development opportunities. One described the relationship as follows:
That may seem a bit circuitous but if you are looking at the plan, one of the ways you can tell if a plan has been implemented, is you look at the training and development opportunities for people, and whether they are congruent with what we say is important in the strategic plan, you look at how peoples’ performance is evaluated, [and ask] is that congruent with what we say is important in the plan?
In an effort to determine if the university was truly following the strategic plan, a survey
was distributed to the faculty and staff every three years to see if they believed that the
strategic plan values were in fact practiced at the university. By linking the plan and its
implementation to each individual’s performance appraisal and professional
development, it can be said that it was integrated throughout the university.
Length of the Planning Cycle
Since University C had such a long history with strategic planning, it had tried
different variations of planning including the time length of the plan. For a while, it did
strategic planning on a five-year cycle. However five years was considered too long
given the environmental changes that could happen during that time period. Therefore, it
moved to a three-year cycle believing that the shorter planning cycle was probably more
realistic. Yet, it was mentioned during the interviews that the three-year cycle seemed to
be too short and required constant planning. There was some disagreement as to how
long the next planning cycle would be; both five years and three years were mentioned.
In sum, at University C strategic planning had become a continuous process, a
routine, and, over time, units had grown quite proficient at planning. However, the
university level plan seemed to be still evolving and maturing. Even though a systematic
and periodic evaluation of the university plan was not evident, its top-down/bottom-up
process created an unofficial assessment of the plan. This was evident from references
stated in the documents reviewed and responses given in the survey and interviews.
Though periodic reviews at the University level may have been informal, each
component of the logic model was engaged.
University D
As the fourth and final institution under consideration here, University D likewise
fits the parameters of the study’s cohort. Enrolling almost 30,000 students and operating
with an annual budget of just over $1.2 billion, this leading public university put its first
strategic plan into place in July 2003. The plan was generally academic in nature, and
was one plan among many ongoing initiatives at the university. It enumerated six goals:
Provide the strongest possible academic experience at all levels;
Integrate interdisciplinary research, teaching and public service;
Improve faculty recruitment, retention, and development;
Increase diversity of faculty, students, and staff;
Enhance public engagement; and
Extend global presence, research and teaching.
The strategic plan included more than seventy-five “action steps” and over thirty
“illustrative benchmarks.” Within the plan, each action step included a listing of the
administrative or academic unit responsible for overseeing implementation. In
conjunction with the strategic plan, another report on the “Measures of Excellence”
provided additional data points to help track progress. University D was in the final year
of this strategic plan during the course of the present study.
The strategic plan had a five-year timeframe with an interconnected framework to
all of the unit levels. Each year the provost issued guidelines to the deans and vice
chancellors setting forth expectations for their required annual planning and budget
reports. One of these expectations was a description of how the unit had addressed the
strategic plan’s goals in the course of the previous year.
Sources of Evidence and Logic Model
Similar to the other case studies, the following public documents were examined
for University D: the strategic plan, annual updates, data reports, news releases, meeting
minutes, presentations, and speeches. The websites for the following offices were
likewise consulted: the strategic plan, the Board of Trustees, the president, the executive
vice chancellor and provost, the Office of Institutional Research and Assessment, the
Office of Financial Planning and Budgets, as well as the university’s accreditation
website (see Appendix H). The Strategic Planning Evaluation Survey was distributed to
thirty-nine individuals who were either on the academic planning task force or
executive staff, deans, or management staff in institutional research. The survey netted
nine responses, making for a 23% response rate (see Appendix N). Of these nine
responses, four were from administrative units, three were from academic areas, while
one opted not to answer that particular question. The executive associate provost and the
interim director for financial planning and budgets were both interviewed at University D
(see Appendix I). Figure 4.5 below documents the research sources by the logic model
components for University D (see Appendices J and N).
Sources of Evidence by Logic Model Component

Input: Reflections made on the use of resources, including human, financial, organizational, and community, that have been invested into the strategic plan initiative.
- Document Analysis: 3 references
- Survey Results: 4 items with responses ranging from 78% to 89% for somewhat agree or strongly agree
- Interview Results: 2 responses

Activities: Review of the processes and actions that were an intentional part of the implementation, such as tools, technology, and services.
- Document Analysis: 13 references
- Survey Results: 6 questions with responses ranging from 55% to 100% for agree or strongly agree
- Interview Results: 3 responses

Output: Whether the evaluation methodology included gathering data or results on the activities.
- Document Analysis: 30 references
- Survey Results: 3 questions with responses ranging from 44% to 100% for agree or strongly agree
- Interview Results: 5 responses

Outcomes: Short-term and long-term outcomes consisting of specific changes resulting from the outputs, including attitudes, behaviors, knowledge, and/or skills.
- Document Analysis: 20 references
- Survey Results: 4 questions with responses ranging from 56% to 100% for agree or strongly agree
- Interview Results: 6 responses

Impact: Fundamental intended or unintended changes occurring in the organization or community.
- Document Analysis: 5 references
- Survey Results: 7 questions with responses ranging from 44% to 100% for agree or strongly agree
- Interview Results: 6 responses

Figure 4.5: Matrix of University D’s Evidence by Logic Model
Some degree of difficulty was encountered in obtaining interviews and survey
participants at University D. Possibly contributing to this may have been the fact that
unlike the other institutions, the researcher did not have a main campus contact at
University D to help facilitate such interactions. Without this main contact, obtaining
participants and scheduling interviews was somewhat more complicated, in that it
required much more detailed explanation as well as repeated requests. The limited
availability of the interviewees also made a campus visit impractical, which meant that
interviews had to be conducted over the telephone. In addition, finding participants
willing to be interviewed proved challenging. The executive vice chancellor and provost,
for instance, declined invitations, as did both the assistant provost for institutional
research and assessment and the director of reporting in the Office of Institutional
Research and Assessment. Interestingly, all of these figures simply said that the executive
associate provost would be the best person to speak to regarding the strategic plan. When
asked in his interview whom else he could recommend, the executive associate provost
explained that there had been quite a bit of turnover, making it hard to refer anyone else
for an interview, as most others in key positions were simply too new to be able to make
informed comments on the university’s strategic planning process.
Another possible factor accounting for the low level of interest in speaking about
the strategic plan may be that the strategic plan itself does not enjoy a high profile at
University D. Indeed, one of the interviewees even mentioned that the university should
have done a better job communicating and promoting the strategic plan by demonstrating
how intricately it was bound up with daily management decisions. This was also reflected
in the public documents reviewed, most of which simply incorporated the strategic plan
into long lists of initiatives currently being undertaken by the university. For example,
during the same time period covered by the strategic plan, the university was also
pursuing a major development campaign, a financial plan, a project to increase state
resident financial aid, an accreditation “enhancement” plan, a business efficiency project,
and an improved workplace task force, just to name a few. And whereas in the other case
studies these types of initiatives were usually explicitly tied to the strategic plan, such a
connection was not as evident at University D. In fact, rather than the strategic plan being
the driving force behind these initiatives, it seems to have been thought of more as a
parallel activity—hence, perhaps, the lower interest level and response rate for the
present study. Nevertheless, even if one might have wished for a higher quantity of such
information, this concern is alleviated by the high quality of the sources that were
available.
Evaluation Methodology
Although 44% of those who responded to the survey agreed that University D had
a systematic process of strategic planning, one-third (33%) of respondents actively
disagreed with that statement. In the document review, most references to the strategic
plan were related to the data. Less often cited were the specific changes or reflections of
the plan’s actual effectiveness. Although it had illustrative benchmarks for its six goals,
the emphasis was on the several action steps (or strategies) for each goal. Within each
action step was a series of “recommendations” that named a responsible office. The
benchmarks provided were described as “examples” that could help measure
improvement.
Measures. Most often, measures were referred to when discussing the assessment
of the plan. Eight of the nine survey respondents agreed that data collection and analysis
were an integral part of the institution’s strategic planning process. The strategic plan had
benchmarks, and these were listed in a companion report called the “Measures of
Excellence.” The Board of Trustees decided to develop this report at about the same time
the strategic plan was implemented, in an effort to start obtaining data to track progress.
As noted in the Board minutes, these measures serve as “indicators of accomplishments
and quality” in education; faculty; staff; public service and engagement;
internationalization; finance and facilities; and athletics. Ten peer institutions were
identified as a reference point for a number of the measures. The Measures of Excellence
report thus tracks data to monitor the university’s performance on its strategic plan goals,
as well as additional data for other initiatives and activities. Some of the measures also
serve as illustrative benchmarks for the strategic plan. One administrator described it in
this way:
These are not tied to a specific strategic plan, but are indicators that they came up with (about forty) about how [University D] is doing. There is some overlap. For example, both the [strategic plan] and the Measures of Excellence look at undergraduate retention and graduation rates. But there are some things there in the Measures of Excellence that go beyond the academic side that have to do with infrastructure, capital, budgeting, with development of research facilities.
There were other factors besides such measures that played a role in the evaluation
process, including “unique opportunities.” These were described as events not necessarily
anticipated at the creation of the strategic plan, but ones nonetheless related to the spirit
of the plan. For example, the plan has a long-term goal of increasing the research
capability of the university. Recently, the chancellor decided that the goal for sponsored
research would be to reach a billion dollars per year. Although that goal is not written
anywhere in the plan, it is consistent with the plan, even if not expressly stated. An
interviewee went on to explain this approach in more detail:
They have to do with the changing economic situation and also with just the unpredictable nature of things that happen… Yeah, I think that [it has] to have a little wiggle-room. It’s not going to be enough to say that it’s not in the plan, so we can’t do it. There’s always going to be opportunities that come up.
Main Evaluators. In the survey, just slightly more than half (55%) agreed that
faculty, staff, and administrators were actively involved in the strategic planning process.
In fact, the strategic plan was rarely referenced in unit-level websites and documents.
One interviewee said that “to a lesser extent, the faculty [are participants], only because
we publicize it, put it on our webpage, and talk about it now and then.” The participants
in interviews expressed opinions consistent with this attitude. The only main participants
in the evaluation who were mentioned were the executive associate provost, the provost,
the chancellor, and the Board of Trustees. Most respondents to an open-ended question in
the survey said that just the top administrators were involved; only two respondents
stated that the strategic plan was a campus-wide effort.
It should be noted that the main architect of the plan, the provost, had left
University D after three years to become the president of another institution. The provost
had perhaps also served as the main driver of the plan, and his absence may have resulted
in the lack of a unified understanding of the evaluation of the plan. However, as his
successor was an internal hire who had been involved in the plan’s development, that
person’s comprehensive understanding of the plan no doubt played a key part in its
subsequent evaluation.
Reports. Each year a progress report was presented to the Board of Trustees.
Annual updates of the Measures of Excellence were also presented to the Board at the
same time. These presentations were only made available in an electronic format on the
Board of Trustees website; published paper reports or data reports were not distributed to
the campus or the community. Each year the chancellor would give a “state of the
university” address that occasionally mentioned the strategic plan, but, notably, none of
these addresses expressly highlighted it. Progress on the strategic plan was basically
directed to the Board only. As one interviewee put it:
So those are really the two things that we use -- the annual update or benchmarks on the five-year [strategic] plan and the annual Measures of Excellence, both of which are done in the form of reports to the Board of Trustees with presentations.
Apparently, then, this style of reporting proved ineffective in informing the campus and
community of the scope and purpose of the institution’s strategic plan. The Board of
Trustees may have been well informed, but the rest of the stakeholders were not. This
state of affairs was clearly borne out in the difficulty encountered when trying to get
people to talk about the strategic plan at University D. Specifically, those individuals that
declined to be interviewed all stated that they did not have sufficient knowledge to
respond to this research study.
Integrating Budgeting and Planning
From the beginning, University D’s strategic plan was developed to further refine
the planning and budget process. One early draft stated that the plan would serve as a
primary guide for resource allocation. It was expressly noted in several documents that
the strategic plan would shape budget decisions. One news article similarly stated,
“While the plan was shaped by many different hands, it is bound together in a common
purpose shared by all: to tie budget decisions made each year to the University’s overall
goal of becoming the country’s leading public university.” The chancellor reaffirmed this
view in a news release, proclaiming that this was indeed one of the purposes of the plan.
A news article in the campus paper described it this way:
Considering the fact that this is the first [strategic] plan that has ever been done at [University D], it’s reasonable to assume not everyone knows what it is supposed to do, or more important, appreciates the central role it will play on campus through most of the decade. It is about money and mission and how best to connect the two in the allocation of limited resources over the next five years.
And yet, when promoting the strategic plan, the provost reported in a news article
that the campus community should view the plan beyond just the implications for budget
allocations for academic priorities. To that end, he added, the strategic plan was being
implemented in tandem with a five-year financial plan developed by the Office of
Finance and Administration. The financial plan was described as a framework to inform
the campus about university-wide goals. This financial plan also would provide an
alignment of resources to meet the highest priorities in future budgets. The financial plan
identified funding options, such as private gifts and endowment income, student tuition
and fees, enrollment-growth dollars allocated by the state, and monies generated by
research. While admitting that it was impossible to know for certain that the funding
would occur as predicted, the provost believed the assumptions were conservative. As it
turned out, however, the funds received did not match expectations, leading to notably
mixed results in the area of funding the plan. Some specific examples were given in the
case of fundraising. For example, the plan called for increasing the honors program, but
due to inadequate monies raised, that goal could not be met. In other cases, though,
the opposite proved to be the case. For example:
We got a $5 million gift from a donor, and there was a [strategic] plan initiative about faculty development. We took the $5 million—it was generally stated to support faculty—and used it specifically to implement some goals that were set forth in the [strategic] plan for faculty development.
In a state of the university address, the chancellor said, “The Board of Trustees was
strong in its determination that we really put our money where our mouth is—that we are
clear and direct in acquiring and moving resources to support our highest priorities.”
Therefore, the university-level strategic plan instructed the units to include resource
allocation plans with their strategic plans. These reallocations would then be used to
support targeted strategic plan initiatives. The provost was quoted in a news article as
having remarked, “In the annual budget process, I ask each dean, vice chancellor and
director to address the six [goals] of the [strategic] plan and relate these to their budget
requests.” A planning task force was developed to manage both the solicitation and
evaluation of the unit proposals. It was the planning task force’s responsibility to decide
which of the proposals should be referred and recommended for funding, based upon the
innovation of each proposal and how compatible it was with the goals of the strategic
plan. Through an annual budget and planning process, each of the deans and vice-
chancellors then met with the provost, executive associate provost, the associate provost
for finance, and the vice chancellor for finance and administration. The template for the
unit report included the relationship of each unit’s plan to the six goals of the university
strategic plan, the steps needed to be taken in order to accomplish stated goals, and the
budget requests needed to help implement them. Some units were successful in their
requests and duly received funding; others were not. In his interview, the executive
associate provost observed that, “Conversely, programs or units that don’t fit directly
with the [strategic] plan and don’t seem to address it, we actually cut their budgets.”
One very important result of integrating the strategic plan with budget requests
was an increased understanding across the university community with regard to budget
decisions. The following comments were made during the interviews:
A specific change that has occurred is a much greater transparency and understanding of the budget process. That’s really important. It used to be that it was like a black box: requests would go in, and stuff would come out, and nobody would know how the decisions were made. I think one of the best things [the provost] was able to accomplish was that—even in a time of budget cuts—people understood why cuts were made and the criteria used. People understand the budget process because it’s open, it’s transparent, we have criteria. We say, “Here’s how we’re going to make these decisions.”
Interviewees also mentioned that increasing the understanding of the budget decisions
served to improve the relationship between the administration and faculty. Moreover, by
tying the strategic plan and the budget together, the odds were greatly increased that the
plan would actually be implemented and used:
The concern that [the provost] had was that [it] would just sit on the shelf. And I think the way we’ve avoided that is by tying it, making it an integral part of the budget and planning process, the annual budget process. So that was really important.
In sum, the greatest result of the plan seems to have been the integration of planning with
budgeting, and how that development has facilitated an increased awareness of the
overall process.
Communication
One of the explicit goals of University D’s strategic plan was to foster
communication across the campus, and particularly to encourage campus constituents to
engage in an ongoing dialogue in order to ensure the plan’s steady progress and to
monitor any new developments that might impact it. Despite this desire, however, it is
clear that communication efforts foundered, resulting in only modest references to the plan in
public documents and even, perhaps, in the unwillingness of faculty and staff to
participate in this study. The lack of reports and publicity did little to engage the campus
community. As one interviewee observed:
I don’t think we did enough to communicate to the students and to the faculty on a regular basis about how strategic planning was being used to shape the campus on kind of a day-in, day-out basis. I think we do a really good job of communicating that up to the Board of Trustees. We will get little articles in the in-house newsletter, or the student newsletter, but not enough. That’s something we should have done more.
In fact, an equal number of survey respondents (44%, N=4) both agreed and disagreed
that strategic planning played an important role in the success of the institution. In
addition, two people even agreed with the statement that “the strategic planning activities
did little to help the institution to fulfill its mission.” None of the other universities’
survey responses in this research were so demonstrably divided. The lack of
communication may therefore have contributed significantly to the lack of a unified
understanding of the plan and its outcomes.
In sum, strategic planning was new to University D. Even though it was in the last
year of the plan, the plan had only begun four years earlier. Poor communication, limited
public presentations, and the fact that it had to compete with other initiatives all seem to
have factored into the participants’ nebulous understanding of the plan’s evaluation
process. But even if there is no evidence thus far of a systematic evaluation of the
strategic plan at University D, a strategic plan initiative—as it evolves and matures in
coming years—will most likely continue to have a major presence at the university.
Cross Case Analysis
The following provides an analysis of all four cases. First, the results are
examined within the logic model components. Second, the key themes that emerged from
the cases are discussed.
Summary and Discussion of the Results within a Logic Model
Data from each of the four case studies were evaluated and sorted into the five
components of a logic model. These components are: input, activities, outputs,
outcomes, and impact. Both the survey and interview questions were linked to the logic
model components, thereby facilitating the coding of the results of the research. The
documents were also reviewed and coded into the appropriate logic model component.
The data were then tallied for each component to show how strong or weak the research
sources were for each one of the logic model components. For each case study institution,
a logic model evidence matrix was developed. The following is a comparison of the four
cases using the evaluation matrix (see Figure 4.6).
Input
- University A: 19 references; survey responses ranging from 72% to 100% in agreement
- University B: 17 references; survey responses ranging from 67% to 100% in agreement
- University C: 11 references; survey responses ranging from 78% to 100% in agreement
- University D: 5 references; survey responses ranging from 78% to 89% in agreement

Activities
- University A: 13 references; survey responses ranging from 77% to 94% in agreement
- University B: 26 references; survey responses ranging from 55% to 100% in agreement
- University C: 23 references; survey responses ranging from 78% to 100% in agreement
- University D: 16 references; survey responses ranging from 55% to 100% in agreement

Output
- University A: 23 references; survey responses ranging from 83% to 94% in agreement
- University B: 45 references; survey responses ranging from 78% to 89% in agreement
- University C: 32 references; survey responses ranging from 89% to 100% in agreement
- University D: 35 references; survey responses ranging from 44% to 94% in agreement

Outcomes
- University A: 40 references; survey responses ranging from 83% to 94% in agreement
- University B: 28 references; survey responses ranging from 78% to 100% in agreement
- University C: 37 references; survey responses ranging from 78% to 100% in agreement
- University D: 26 references; survey responses ranging from 56% to 94% in agreement

Impact
- University A: 30 references; survey responses ranging from 83% to 94% in agreement
- University B: 21 references; survey responses ranging from 89% to 100% in agreement
- University C: 27 references; survey responses ranging from 78% to 100% in agreement
- University D: 11 references; survey responses ranging from 44% to 100% in agreement

Figure 4.6: Comparison Matrix of Each Institution’s Evidence by Logic Model
The logic model provides an inventory of the resources and activities that lead to the
relevant results, thus serving as a method for assessment. By monitoring an entire
university’s evaluation process, the logic model can help determine whether the
institution has assessed all aspects of its strategic plan. It also identifies areas of strength
and/or weakness.
A basic comparison of the institutions’ results can be made by reviewing the
amount of materials, references, statements from documents, open-ended survey
questions, and interviews for each logic model component (see Figure 4.7).
[Bar chart comparing the number of references for each logic model component (Input, Activities, Output, Outcomes, Impact) across Universities A, B, C, and D.]

Figure 4.7: Universities’ Results Charted on the Logic Model
For example, three of the four institutions had the fewest references in the input
component. This demonstrates the lack of emphasis in the evaluations on reflections
about the resources invested in the strategic plan initiative and the appropriateness of the
mission and vision. Similarly, a comparison within the activity component shows that
there were more references than with the input component. However, the amount of data
for the activities component was limited. Assessments of the processes or actions of the
strategic plan initiative (as categorized by the activities component) were occasionally
referred to in reviews or modifications of internal business practices. When reviewing the
distribution of the references within the logic model, the impact component falls in the
middle. Although evaluating the impact of a strategic plan seems like it would be the
primary factor, for the four institutions studied it was not. Perhaps the evaluations of the
strategic plans were too young, and more focused on the present to allow for or
encourage reflections made on long-term consequences. The major emphasis in the
evaluations in these four case studies was in either the output or outcomes components of
the logic model. The interview participants at these institutions certainly had little
difficulty in rattling off the progress made on the measures and emphasizing the
immediate outcomes of those constructive numbers. Reports on the performance
measures were typical and the publication materials and presentations gave specific
examples of changes that had occurred at the institutions because of their respective
strategic plans.
When comparing the survey results of the four case studies some differences
emerged among the cases in the levels of agreement to the survey questions. For
instance, University D had the lowest level of agreement with the survey questions in
all of the logic components, except for the questions linked to the input component. (For
these input questions, University D tied with University C in having the highest level of
agreement to the survey questions.) The implication of University D’s lower agreement
levels is a lack of consensus among the participants, and indicates that
its evaluation process is not as developed as it is at the other three institutions. By
contrast, University B had the highest level of agreement to the survey questions. This is not surprising, given that strategic planning had been part of this
institution’s culture for many years, and that most of the survey participants had a long
history with the institution.
In comparing the four institutions, University D had the least amount of materials,
references, and statements; yet it followed the same pattern of distribution across the
logic model as the other institutions. The smaller number of University D's references was most likely due to the fact that its strategic plan had begun only four years earlier, leaving it the least amount of time to evaluate its plan. Conversely, University B had
the largest amount of references for its evaluation process and followed the same pattern
of distribution. Using the logic model as a template in the review has thus provided a
more comprehensive overview of the evaluation process of the strategic plans.
Summary and Discussion of Common Themes
The results of the data analysis showed that although each case was distinctive,
there were common evaluation methodologies and themes among all of the institutions.
The following section describes and analyzes these common methodologies and themes.
Measures. The prevalent evaluation approach at all of the institutions was a heavy reliance on measures. These performance measures were most often cited as evidence of their evaluation. Clearly, the use of measures is an effective and demonstrative way to
show change. Yet more often than not, these measures were quantitative in nature; less
emphasis was given to qualitative data. Perhaps this is because, given today's increasing interest among higher education institutions in "dashboards" (visual representations of the most important performance measures), qualitative data does not lend itself to that format.
Omitting qualitative data from the measures, however, amounts to a missed opportunity
to illustrate other important aspects of the institution. It is important to have performance
measures, but the measures themselves are not the only way to evaluate a plan.
Main Evaluators. Another aspect of the evaluation methodology common to all
institutions studied was a strategic plan committee. In each case, the committee was formed only for the creation of the plan; it was neither responsible for nor a participant in the evaluation of the plan. The one slight exception was University A. Not only had
the university initially convened a task force for the creation of the strategic plan, it had
also assembled a second committee, called the Strategic Plan Review Committee, in the
beginning of the final year of the plan. The strategic plan review committee’s sole
purpose was to focus on providing a retrospective view of the effectiveness of the plan
and a prospective view of the priorities and characteristics to be considered for the next
plan. In addition, University A used community visits as another evaluation technique, gathering feedback from stakeholders in the city, region, and state. University A thus recognized the importance of an evaluation process that includes representatives from across the campus as well as other stakeholders, as a vital complement to the upper administration itself. The evaluation should also include some of the people who developed the plan. This representational evaluation can be achieved in various ways, such as through a review committee or task force, surveys, focus groups, or open forums, to name just a few.
Asking stakeholders for their reflections, thoughts, or satisfaction on the progress made
toward stated goals creates a discussion of strengths and weaknesses. The importance of making this inclusive and part of the strategic plan process was generally recognized.
That said, the main evaluators identified at each institution were for the most part members of the upper administration. All levels of
the universities and other stakeholders may have been involved with the development
and/or implementation of the plan, but may not necessarily have been consulted in its
evaluation. Only University A included within its evaluation a review committee and the
president’s community visits (i.e., visits to regional cities, towns, and businesses to
communicate the progress made on the strategic plan). But even so, these were in
addition to the review by upper administrators. The upper administration and Board of
Trustees were generally the primary or principal evaluators of the institutions’ strategic
plans.
Communication. At each institution, the most common way of communicating the evaluation of the strategic plan was through the tracking of measures. These results were
normally disseminated through publications, most notably annual reports. In each case an
annual report was indeed disseminated, although with variations among institutions with
regard to intended audiences and distribution. For example, University A presented its
annual report to the Board and distributed it to stakeholders. It was widely available
throughout the campus and to the community. By contrast, University D’s annual
presentation was primarily for its Board of Trustees and was not circulated beyond the
Board. Also, the presidents at Universities A and B both included in their public talks
progress made on the strategic plans. In contrast, the presidents at Universities C and D
mentioned the strategic plans more in passing in their addresses, not particularly
highlighting them as the impetus for change. Thus, even though the institutions all saw value in communication, their approaches to it differed considerably, and these differing communication styles contributed to the differing results.
Culture. In three of the four cases, participants stated that strategic planning had
become ingrained into the culture. At Universities A, B, and C, participants often cited
how the culture had changed due to the adoption of a strategic plan, and how the
institution’s employees had shifted their attitudes and behaviors, and moved towards
accepting the plan and working toward a shared vision. “Data-driven” and “strategic
priorities” were common phrases used by these participants to describe the new way of
thinking at their institutions.
By contrast, University D's strategic plan had been introduced four years earlier, and its influence on the culture was never mentioned. It may have been that the
strategic plan at University D had not had sufficient time to become ingrained into the
culture of the institution. Or it could be that since the strategic plan was rarely highlighted
in communications (at least in comparison to the other institutions), it did not become as
prevalent in the conversations and customs at large. At this institution, the strategic plan
had a lower profile and did not seem to generate very much attention or recognition of its
wider importance.
As discussed previously in this chapter, the culture of strategic planning can
influence its effectiveness, particularly in encouraging participation and collaboration.
Through elements of planning such as implementation, resource allocation, and the integration of budgeting and planning, a new mindset can be created that becomes pervasive across the campus. Establishing this culture of supportive cooperation for strategic
planning may thereby enhance the likelihood that a meaningful evaluation of the plan
will be carried out.
Leadership. Leadership was frequently cited as having played a major role in the
evaluation of the strategic plan at some of the institutions. For example, although
University B underwent many changes in leadership, it was nevertheless consistently able
to carry on with its strategic plan. In fact, several participants noted that the new leaders
purposely continued to adhere to (and in some cases even enhance) the strategic plan.
With their support and guidance, the strategic plan therefore not only survived, but also
adapted to a changing environment. Even though the evaluation was frequently
characterized as “informal” at University B, such supportive leadership was considered a
key factor in its overall success.
Leadership was mentioned at University C as well, even though it had a very
different situation. University C has had the same set of leaders for many years. The
participants stated that their strategic planning process and the evaluation of the plans
(both historically and currently) were strengthened by the consistency of the leadership.
Having well-established and erudite leadership in strategic planning created a favorable
environment in which the plan could be evaluated. One participant stated that because the
president was so knowledgeable about the plan, he easily drew talking points from it for
his public speeches, and that he and other leaders had a “pretty good sense of whether
people [were] doing what they said they would be doing or not.” Although the evaluation
methodology was not considered systematic, there was an understanding that it was
accomplished, even if casually.
Integration of Budgeting and Planning. Another important (but often
overlooked) aspect of the evaluation of the strategic plan is the review of the internal
business processes, such as staff resources, operations, activities, and functions. The
goals and objectives of the strategic plan articulate the core values, the mission and
vision. Therefore, the activities stemming from that plan should be linked to other
important organizational processes, such as budgeting and professional development
activities (Dooris & Sandmeyer, 2006). That is, the activities should reflect the core
values of the institution. For example, linking the budget process to the strategic plan is
essential. Effective resource allocation includes top-down guidance informed by bottom-up knowledge and realities; it applies measures consistently and responds to analysis of those measures with monetary adjustments when necessary. Incremental (or across-the-board) budgeting generally does not take into
account the priorities of the institution as described in the strategic plan, so if those
priorities are not funded, then the institution will not be able to sustain the plan, no matter
how inclusive it is.
The integration of budgets and strategic planning was discussed at each of the
four campuses under consideration. All of the universities recognized that in order for
their strategic plans to be successful, the costs of those plans needed to be estimated up
front, and the plans needed to be funded. At each institution, units had to demonstrate
how they had first internally reallocated funds and how their initiatives had advanced the
university’s strategic goals in order to get additional funding from the university. One of
the greatest benefits of the strategic plan for each institution was the integration of the
budgeting and planning processes, and how that integration served to increase awareness
of the values emphasized by the strategic plan itself.
A Conceptual Model for Evaluating a Strategic Plan Initiative
Even though strategic plans in higher education are as unique as the institutions
they reflect, there are basic evaluation elements that are generally applicable to all
institutions' strategic planning processes. The exemplar characteristics from the findings helped create the conceptual model, which is organized into five elements of evaluation. These elements provide the basis of an evaluation methodology that an institution can apply to its strategic plan initiative, and each element prompts questions that fit into the larger framework or template. These
example questions provide institutions with the opportunity to explore and evaluate the
various components or processes within a strategic plan initiative. The conceptual
evaluation model is provided below in two figures. The first figure lists the first two evaluation elements, which target the resources needed (see Figure 4.8). The second figure provides the last three elements, which assess the intended results of the evaluation (see Figure 4.9).
First element. A summarization of the mission and vision. Exploration of the relational issues of influences and resources, such as the economic, social, or political environment of the community, and the appropriateness and fit of the vision. These types of questions help explain some of the effects of unanticipated and external influences.
Logic model component: Input.
Sample questions. Internal: "Have expectations or opportunities in learning, research, or service been considered?" External: "Have the changes in the local, state, national, or global environment been considered?"

Second element. A review of internal business processes, such as staff resources, operations, activities, and functions. Questions that ask about the extent to which actions were executed as planned.
Logic model component: Activities.
Sample questions. Internal: "Do the internal business processes, such as budgeting or professional development, articulate the values and goals of the plan? Can they demonstrate a link to the strategic plan?" External: "Did the resource development and allocation have enough flexibility to adapt to the conditions yet meet the needs?"

Figure 4.8: The Conceptual Evaluation Model - Evaluation of Resources Needed
Third element. An analysis of the data: goals and objectives in measurable units, which include the internal and external measures. The measures should include financial, equipment, and facilities data, along with the learning, research, and service data.
Logic model component: Output.
Sample questions. If targets were set for the measures, the key question is whether those targets were met. Internal: measures that track fundamental data on the institution. External: measures that provide comparisons of fundamental data among similar institutions.

Fourth element. An assessment of the consequences, value added, outcomes, and/or effectiveness of the plan. These questions try to document the changes that occur at the institution as an impact of the plan.
Logic model component: Outcomes and Impact.
Sample questions. Internal: "To what extent is progress being made toward the desired outcomes?" "Has a culture of planning developed or evolved?" "Does the leadership support the core principles?" External: "Are the outcomes and impact of the strategic plan recognized by external agencies?"

Fifth element. Stakeholder participation in the evaluation by providing their perceptions of the effectiveness of the strategic plan.
Logic model component: Impact.
Sample questions. Internal: involvement from stakeholders (students, staff, faculty, regional and state leaders), as well as input from other organizations, asking about their perceptions of the effectiveness of the plan. External: perceptions from external sources that could provide validation, such as perceptions of university presidents around the world.

Figure 4.9: The Conceptual Evaluation Model - Evaluation of Intended Results
This conceptual evaluation model is closely integrated with the logic model. The
logic model provides a succinct framework for evaluating programs or processes. The
conceptual evaluation model uses the logic model components (inputs, activities, outputs,
outcomes, and impact) as a base that lends the model unity and comprehensiveness. Human, financial, organizational, and community resources invested into the program or process are inputs. The tools, services, or actions make
up the activities component. The results of the activities are the outputs while the specific
changes in attitudes, behaviors, knowledge, and skill arising from the process are the
outcomes. The fundamental intended or unintended change occurring in the organization,
community, or system because of the program or process is the impact component of the logic model. [...] One reason for this deficiency may be that no strategic planning evaluation model has been
specifically proposed for higher education institutions. Yet by performing an evaluation
of the strategic plan, a university may well be able to correlate its performance to
established purposes. After all, the most valuable aspects of evaluation as an integral part
of strategic planning include the clarity it brings to goals and objectives, the
encouragement it gives to taking a systematic look at issues, and the emphasis it places
on institutional improvement.
This study has aimed to expand the knowledge of how strategic plans are actually
evaluated, and in doing so, to contribute to the development of the conceptual evaluation
model. As the first study to examine evaluation methodologies for university-wide
strategic plans, this dissertation provides interested parties with a meaningful foundation
in an area vital to the current and future success of universities and colleges. But even
though the conceptual evaluation model appears to be transferable to all types of higher
education institutions, additional case study research is essential to validate or disprove
the conceptual evaluation model developed in this study. Expanding the research to
include other types of higher education institutions, for example, would most likely
strengthen the proposed model. For instance, many private liberal arts colleges are
incorporating more and more elements of assessment activities into their strategic plans.
Studying their methodologies may enhance the evaluation methodologies used by larger
universities. Another possible avenue for further investigation would be to evaluate the
conceptual model presented here in relation to the concepts of Total Quality Management
or to the Balanced Scorecard. Both of these management approaches use various
perspectives to center on performance and strategy for the institution’s long-term success.
The conceptual model could certainly be applied in reviewing these approaches.
Specifically applying the conceptual evaluation model to institutions that have an Office
of Institutional Effectiveness could serve as the basis of another related research study.
Such offices commonly review assessment and evaluation findings from the various
academic and administrative units to demonstrate accountability, evaluate and improve
the quality of services, and determine whether or not the impact of the units is consistent
with the institution’s mission and vision. It could prove very interesting to test the
conceptual model in such an organizational structure. Alternatively, the conceptual
evaluation model presented here could be compared to or contrasted with some of the
assessment elements of the accrediting agencies or to the Baldrige National Quality
Program. One other possible permutation of this line of study could be to incorporate the
conceptual model in the development of a new strategic plan. By following the lifespan
of that plan, the research could then focus on the influence that systematic evaluation
brings to bear on the effectiveness of the strategic plan itself.
The current environment emphasizing accountability in institutions of higher
learning all but dictates that such institutions demonstrate the effectiveness of their
strategic plans. It is hoped, then, that additional research will replicate and enhance the
conceptual model presented here, and thereby further advance knowledge about the
evaluation of strategic planning efforts. In the meantime, the conceptual evaluation model
may provide academic practitioners with a comprehensive tool with which to evaluate
their strategic plans, significantly integrating both theoretical and practical perspectives
into the process.
Concluding Remarks
More than anything, strategic planning provides an opportunity to strengthen an
institution by fostering a vital dialogue about its core mission, and by
helping those most interested in the well-being of the institution to achieve a vision
featuring commonly held values. To foster such a conversation, an evaluation of the
strategic plan should include both qualitative and quantitative methods, and should
recognize the essential elements throughout the process. Evaluating these elements also
informs the next conversation about future goals. Strategic planning, if correctly done,
helps an institution to develop and continue courses of action that are a “roadmap to
success.” The key is to use the right elements in the evaluation and to develop a holistic
evaluation process. For when a strategic plan becomes muddled in an excess of data, it ceases to be strategic; and conversely, when a strategic plan merely reports data, it
fails to be a plan. Above all, it should be meaningful; and as such, it is well worth
remembering Albert Einstein’s memorable admonition: “Not everything that can be
counted counts, and not everything that counts can be counted.”
BIBLIOGRAPHY
AAU (2005). http://www.aau.edu/aau/Policy.pdf. Retrieved December 22, 2005.

Anderes, T. (1996). Connecting Academic Plans to Budgeting: Key Conditions for Success. In B. P. Nedwek (Ed.), Doing Academic Planning: Effective Tools for Decision Making (pp. 129-134). Ann Arbor, MI: Society for College and University Planning. (ERIC Document Reproduction Service No. ED 451 785)

Anderes, T. (Fall 1999). Using Peer Institutions in Financial and Budgetary Analyses. New Directions for Higher Education, 107, 117-123.

Anderson, T. J. (2000). Strategic planning, autonomous actions and corporate performance. Long Range Planning, 33, 184-200.

Aguirre, F., & Hawkins, L. (1996). Why reinvent the wheel? Let's adapt our institutional assessment model. Paper presented at the New Mexico Higher Education Assessment Conference, Albuquerque, New Mexico. (ERIC Document Reproduction Service No. ED 393 393)
Alexander, F. K. (Jul/Aug 2000). The changing face of accountability. The Journal of Higher Education, 71(4), 411-431.

Angelo, T. A. (1999). Doing assessment as if learning matters most. AAHE Bulletin, 51(9), 3-6.

Atkinson, A. A., Waterhouse, J. H., & Wells, R. B. (Spring 1997). A stakeholder approach to strategic performance measurement. Sloan Management Review, 38(3), 25-37.

Baldrige National Quality Program (2007). Education criteria for performance excellence. Retrieved May 5, 2007, from http://www.quality.nist.gov/PDF_files/2007_Education_Criteria.pdf
Banta, T. W. (2004). Developing assessment methods at classroom, unit, and university-wide levels. Retrieved November 2, 2004, from http://www.enhancementhemes.ac.uk/uploads%5Cdocuments%5Cbantapaperrevised.pdf
Banta, T. W., Lund, J. P., Black, K. E., & Oblander, F. W. (1996). Assessment in practice: Putting principles to work on college campuses. San Francisco: Jossey-Bass Inc.

Bickman, L. (Ed.). (1987). Using program theory in evaluation. New Directions for Program Evaluation Series, 33. San Francisco: Jossey-Bass Inc.

Birnbaum, R. (2000). Management Fads in Higher Education: Where They Come From, What They Do, Why They Fail. San Francisco: Jossey-Bass Publishers.
Bloom, B. S. (1956). Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I: Cognitive Domain. White Plains, N.Y.: Longman.

Bogden, R. C., & Biklen, S. K. (1992). Qualitative research for education: An introduction to theory and methods. Needham Heights, MA: Simon and Schuster.

Bond, S. L., Boyd, S. E., & Rapp, K. A. (1997). Taking stock: A practical guide to evaluating your own programs. Chapel Hill, NC: Horizon Research, Inc. Retrieved May 5, 2006, from http://www.horizon-research.com

Boyd, B. K., & Reuning-Elliot, E. (1998). A measurement model of strategic planning. Strategic Management Journal, 19, 181-192.
Boyle, M., Jonas, P. M., & Weimer, D. (1997). Cyclical self-assessment: Measuring, monitoring, and managing strategic planning. In S. E. Van Kollenburg (Ed.), A Collection of Papers of Self-Study and Institutional Improvement (pp. 190-192). Chicago: North Central Association of Colleges and Schools, Commission on Institution of Higher Education. (ERIC Document Reproduction Service No. ED 408 880)
Cahoon, M. O. (1997). Planning our preferred future. In S. E. Van Kollenburg (Ed.), A Collection of Papers of Self-Study and Institutional Improvement (pp. 187-189). Chicago: North Central Association of Colleges and Schools, Commission on Institution of Higher Education. (ERIC Document Reproduction Service No. ED 408 880)
Cameron, K. S. (1985). Institutional effectiveness in higher education. An introduction. Review of Higher Education, 9(1), 1-4.
Chaffee, E. E. (1985). The concept of strategy: From business to higher education. In J. C. Smart (Ed.), Higher Education: Handbook of Theory and Research (pp. 133-172). New York: Agathon Press.
Chmielewski, T. L., Casey, J. C., & McLaughlin, G. D. (June 2001). Strategic management of academic activities: Program portfolios. Paper presented at the Annual Meeting of the Association for Institutional Research, Long Beach, CA. (ERIC Document Reproduction Service No. ED 456 782)

Cistone, P. J., & Bashford, J. (Summer 2002). Toward a Meaningful Institutional Effectiveness Plan. Planning for Higher Education, 30(4), 15-23.
Cohen, S. H. (1997). A program assessment system that really works in improving the institution. In S.E. Van Kollenburg (Ed.), A Collection of Papers of Self-Study and Institutional Improvement (pp. 104-107). Chicago: North Central Association of Colleges and Schools, Commission on Institution of Higher Education. (ERIC Document Reproduction Service No. ED 408 880)
Cope, R. G. (1981). Strategic planning, management, and decision making. ASHE-ERIC Higher Education Research Report, no. 9. Washington, D.C.: American Association for Higher Education. (ERIC Document Reproduction Service No. ED 217 825)
Cordeiro, W. P., & Vaidya, A. (Summer 2002). Lessons learned from strategic planning. Planning for Higher Education, 30(4), 24-31.
Dickmeyer, N. (2004). Tips on Integrating Planning and Decision Making. National Association of College and University Business Officers. Retrieved July 14, 2004 from http://www.nacubo.org/x3466.xml
Dooris, M. J. (Dec 2002-Feb 2003). Two Decades of Strategic Planning. Planning for Higher Education, 31(2), 26-32.

Dooris, M. J., & Lozier, G. G. (Fall 1990). Adapting Formal Planning Approaches: The Pennsylvania State University. New Directions for Institutional Research: Adapting Strategic Planning to Campus Realities, 67, 5-21.

Dooris, M. J., Kelley, J. M., & Trainer, J. F. (Fall 2004). Strategic Planning in Higher Education. In M. J. Dooris, J. M. Kelley, and J. F. Trainer (Eds.), Successful Strategic Planning (pp. 5-10). San Francisco: Jossey-Bass Inc.

Dooris, M. J., & Sandmeyer, L. (October 2006). Planning for Improvement in the Academic Department. Effective Practices for Academic Leaders, 1(10), 1-16.

Easterling, D., Johnson, B., & Wells, K. (1996). Creating the link between institutional effectiveness and assessment. In S. E. Van Kollenburg (Ed.), A Collection of Papers of Self-Study and Institutional Improvement (pp. 151-156). Chicago: North Central Association of Colleges and Schools, Commission on Institution of Higher Education. (ERIC Document Reproduction Service No. ED 394 393)
Ehrenberg, R. G., & Rizzo, M. J. (July-August 2004). Financial Forces and the Future of American Higher Education. Academe, 90(4), 29-31.
Einstein, A. ThinkExist.com. Retrieved May 10, 2007, from http://thinkexist.com/quotation/not_everything_that_counts_can_be_counted-and_not/15536.html

Epper, R. M., & Russell, A. B. (Oct 1996). Trends in State Coordination and Governance: Historical and Current Perspectives. State Higher Education Executive Officers Association. (ERIC Document Reproduction Service No. ED 409 807)
Ewell, P. T. (2002). An emerging scholarship: A brief history of assessment. In T. W. Banta (Ed.), Building a Scholarship of Assessment (pp. 3-25). New York: John Wiley & Sons, Inc.

Ewell, P. T. (2004). The examined life: Assessment and the ends of general education. Paper presented at the 2004 Assessment Institute, Indianapolis, IN.

Ewell, P. T. (2005). Can assessment serve accountability? It depends on the question. In J. C. Burke (Ed.), Achieving Accountability in Higher Education (pp. 104-124). San Francisco: Jossey-Bass Inc.

Erwin, T. D. (1991). Assessing student learning and development: A guide to the principles, goals, and methods of determining college outcomes. San Francisco: Jossey-Bass Inc.

Feagin, J., Orum, A., & Sjoberg, G. (Eds.). (1991). A case for case study. Chapel Hill, NC: University of North Carolina Press.

Freed, J. A., Klugman, M. R., & Fife, J. D. (1997). A culture for academic excellence: Implementing the quality principles in higher education. ASHE-ERIC Higher Education Report, 25(1). (ERIC Document Reproduction Service No. ED 406 963)
Glaister, K. W., & Falshaw, J. R. (1999). Strategic planning: Still going strong? Long Range Planning, 32, 107-116.

Herriott, R. E., & Firestone, W. A. (1983). Multisite qualitative policy research: Optimizing description and generalizability. Educational Researcher, 12, 14-19.

Howell, E. (2000). Strategic planning for a new century: Process over product. ERIC Clearinghouse for Community Colleges Digest. Office of Educational Research and Improvement. (ERIC Document Reproduction Service No. ED 447 842)
Jonas, S., & Zakel, L. (1997). Improving Institutional Effectiveness. In S. E. Van Kollenburg (Ed.), A Collection of Papers of Self-Study and Institutional Improvement (pp. 202-206). Chicago: North Central Association of Colleges and Schools, Commission on Institution of Higher Education. (ERIC Document Reproduction Service No. ED 408 880)

Kaplan, R. S., & Norton, D. P. (January 1992). The balanced scorecard: Measures that drive performance. Harvard Business Review, 71-79.
Kaplan, R. S., & Norton, D. P. (1996). Translating Strategy into Action: The Balanced Scorecard. Boston: Harvard Business School Press.

Kater, S., & Lucius, C. (1997). As clear as mud? The difference between assessing institutional effectiveness and student academic achievement. In S. E. Van Kollenburg (Ed.), A Collection of Papers of Self-Study and Institutional Improvement (pp. 88-90). Chicago: North Central Association of Colleges and Schools, Commission on Institution of Higher Education. (ERIC Document Reproduction Service No. ED 408 880)
Keller, G. (1983). Academic Strategy: The Management Revolution in American Higher Education. Baltimore: Johns Hopkins University Press.
Keller, G. (1997). Examining what works in strategic planning. In M.W. Peterson et al. (Eds.), Planning and management for a changing environment: A handbook on redesigning postsecondary institutions (pp. 52-60). San Francisco: Jossey-Bass, Inc.
Kotler, P., & Murphy, P. E. (1981). Strategic planning for higher education. Journal of Higher Education, 52, 470-489.
Kuh, G. D. (2005). Imagine asking the client: Using student and alumni surveys for accountability in higher education. In J. C. Burke (Ed.), Achieving Accountability in Higher Education (pp. 148-172). San Francisco: Jossey-Bass Inc.
Lang, A. (2001). In G. Lieberman (Ed.), 3,500 Good Quotes for Speakers: A treasury of pointed observations, epigrams and witticisms to add spice to your speeches (p. 62). New York, NY: Broadway Books.

Leake, D. C., & Kristovich, S. A. R. (2002). Instructional Support Units: The final frontier… The voyages of a two-year community college in instructional effectiveness. In S. E. Van Kollenburg (Ed.), A Collection of Papers of Self-Study and Institutional Improvement (pp. 251-255). Chicago: North Central Association of Colleges and Schools, Commission on Institution of Higher Education. (ERIC Document Reproduction Service No. ED 469 349)
Linn, R. L. (March 2000). Assessments and accountability. Educational Researcher, 29(2), 4-16.

McLaughlin, J. S., McLaughlin, G. W., & Muffo, J. A. (2001). Using qualitative and quantitative methods for complementary purposes: A case study. In R. D. Howard & K. W. Borland, Jr. (Eds.), Balancing qualitative and quantitative information for effective decision support (pp. 15-44). San Francisco, CA: Jossey-Bass.

McLeod, M., & Cotton, D. (Spring 1998). Essential decisions in institutional effectiveness assessment. Visions: The Journal of Applied Research for the Florida Association of Community Colleges, 39-42.

Merriam, S. B. (1988). Case study research in education: A qualitative approach. San Francisco, CA: Jossey-Bass.

Mintzberg, H. (1994). The Rise and Fall of Strategic Planning. New York: Free Press.

Nunez, W. J. (December 2003). Faculty and academic administrator support for strategic planning in the context of postsecondary reform. Unpublished doctoral dissertation, University of Louisville.

Nunez, W. J. (May 2004). Strategic planning in higher education: Assessing faculty and administrative support in a reform environment. Paper presented at the 2004 Association for Institutional Research Forum, Boston, MA.
Ohio State University. (2004). Strategic Indicators 2004. Retrieved December 11, 2004, from http://oaa.osu.edu/irp/stratind/2004SIreport.pdf

Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing and improving assessment in higher education. San Francisco: Jossey-Bass Inc.

Park, J. E. (1997). A case study analysis of strategic planning in a continuing higher education organization. Unpublished doctoral dissertation, Pennsylvania State University, Pennsylvania.
Parkland College Office of Institutional Research and Evaluation (Fall 2004). Parkland College Planning Handbook. Retrieved December 30, 2004 from http://www.parkland.edu/ie
Paris, K. (2004). Moving the strategic plan off the shelf and into action at the University of Wisconsin-Madison. In M. J. Dooris, J. M. Kelley, & J. F. Trainer (Eds.), Successful Strategic Planning (pp. 121-127). San Francisco: Jossey-Bass Inc.
Paris, K. (2005). Does it Pay to Plan? What We Learned about Strategic Planning in a Big Ten University. Retrieved February 23, 2006, from http://www.ncci-cu.org

Paris, K. A., Ronca, J. M., & Stransky, E. N. (2005). A Study of Strategic Planning on the University of Wisconsin-Madison Campus. Paper presented to the Wisconsin Center for the Advancement of Postsecondary Education, University of Wisconsin System Board of Regents.

Patton, M. Q. (1987). How to use qualitative evaluation and research methods. Newbury Park, CA: Sage Publications.

Patton, M. Q. (1990). Qualitative evaluation and research methods. Newbury Park, CA: Sage Publications.

Prager, McCarthy & Sealy. (2002). Ratio Analysis in Higher Education: New Insights for Leaders of Public Higher Education. Retrieved May 24, 2004, from http://www.kpmg.org/
Ray, E. (1998). University performance indicators and the benchmarking process. In J. Carpenter-Hubin (Ed.), Strategic Indicators and Benchmarking. Retrieved September 7, 2004, from http://www.pb.uillinois.edu/AAUDE/specialtopics.cfm
Rieley, J. B. (April 1997a). A comprehensive planning model. The Center for Continuous Quality Improvement, Milwaukee Area Technical College. (ERIC Document Reproduction Service No. ED 409 956)
Rieley, J. B. (1997b). Doing effective strategic planning in a higher education environment. In S. E. Van Kollenburg (Ed.), A Collection of Papers on Self-Study and Institutional Improvement (pp. 175-181). Chicago: North Central Association of Colleges and Schools, Commission on Institutions of Higher Education. (ERIC Document Reproduction Service No. ED 408 880)

Rowley, D. J., Lujan, H. D., & Dolence, M. G. (1997). Strategic change in colleges and universities. San Francisco: Jossey-Bass Inc.
Rue, L. W. (1973). The how and who of long-range planning. Business Horizons, 16, 23-30.

Schmidtlein, F. A. (1989-1990). Why linking budgets to plans has proven difficult in higher education. Planning for Higher Education, 18(2), 9-23.

Schmidtlein, F. A., & Milton, T. H. (1988-1989). College and university planning: Perspectives from a nation-wide study. Planning for Higher Education, 17(3), 1-19.
Schmidtlein, F. A., & Milton, T. H. (Fall 1990). Adapting strategic planning to campus realities. New Directions for Institutional Research, 67, 1-2.
Schwarzmueller, E. B., & Dearing, B. (1997). A model for non-instructional program review. In S. E. Van Kollenburg (Ed.), A Collection of Papers on Self-Study and Institutional Improvement (pp. 116-119). Chicago: North Central Association of Colleges and Schools, Commission on Institutions of Higher Education. (ERIC Document Reproduction Service No. ED 408 880)
Shirley, R. C. (1988). Strategic planning: An overview. New Directions for Higher Education, 16(4), 5-14.
Smith, M. F. (July-August 2004). Growing expenses, shrinking resources: The states and higher education. Academe, 90(4), 33-35.

Stewart, A. C., & Carpenter-Hubin, J. (Winter 2000-2001). The balanced scorecard: Beyond reports and rankings. Planning for Higher Education, 29(2), 37-42.

Taylor, A. L. (1989). Institutional effectiveness and academic quality. In C. Fincher (Ed.), Assessing institutional effectiveness: Issues, methods, and management (pp. 11-18). Athens, GA: Georgia University Institute of Higher Education.
Taylor, A. L., & Karr, S. (1999). Strategic planning approaches used to respond to issues confronting research universities. Innovative Higher Education, 23(3), 221-234.
Tellis, W. (1997). Application of a case study methodology. The Qualitative Report, 3. Retrieved June 22, 2005, from http://www.nova.edu/ssss/QR/QR3-3/tellis2.html

Terenzini, P. T., & Upcraft, M. L. (1996). Assessing program and service outcomes. In M. L. Upcraft & J. H. Schuh (Eds.), Assessment in student affairs: A guide for practitioners (pp. 217-239). San Francisco: Jossey-Bass Inc.
Torres, C. A. R. (2001). An assessment process for strategic planning in a higher education institution. Unpublished doctoral dissertation, Dowling College.

Upcraft, M. L., & Schuh, J. H. (1996). Assessment in student affairs: A guide for practitioners. San Francisco: Jossey-Bass Inc.

W. K. Kellogg Foundation. (January 2004). Logic Model Development Guide: Using Logic Models to Bring Together Planning, Evaluation, and Action. Retrieved May 2, 2006, from http://www.wkkf.org
Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (Eds.). (1994). Handbook of Practical Program Evaluation. San Francisco: Jossey-Bass Inc.

Winstead, P. C., & Ruff, D. G. (1986). The evolution of institutional planning models in higher education. Orlando, FL: Association for Institutional Research. (ERIC Document Reproduction Service No. ED 280 412)
Yin, R. K. (1994). Case Study Research: Design and Methods. Thousand Oaks, CA: Sage Publications.
APPENDICES
APPENDIX A
FLOWCHART OF THE CASE STUDY METHODOLOGY
1. Proposal approval from committee
2. IRB review and approval
3. Preview AAU public institutions and select peers
4. Develop interview questions
5. Expert panel review
6. Case study procedure (Universities A, B, C & D): survey (administer; analyze results), document review, and semi-structured interviews (upper-level administrators)
7. Interviewees check transcript notes
8. Review data and determine whether there is enough data to triangulate (if no, collect additional data; if yes, continue)
9. Data analysis
10. Review data and themes
11. Final analysis
12. Write dissertation
APPENDIX B
LIST OF DOCUMENTS FOR ANALYSIS
1. List of documents on the institution’s website or on the institution’s dedicated strategic plan website.
2. The Strategic Plan.
3. List of offices or individuals that are main participants, such as those listed on committees or task forces.
4. The document that specifies the measures (benchmarks, metrics, targets).
5. The document that specifies the timeline of the plan.
6. The document that specifies the strategic plan process at that institution.
7. Documents on the implementation of the plan, which may include evaluation strategy.
8. Reports of progress, or annual reports.
9. Strategic plan budgetary or funding reports.
10. Any public documents that are related to the strategic plan evaluation, such as governance committee minutes or task force committee minutes.
APPENDIX C
STRATEGIC PLANNING EVALUATION SURVEY
Response scale: Strongly Agree, Somewhat Agree, Neither Agree nor Disagree, Somewhat Disagree, Strongly Disagree, Don't Know / No Opinion
1 I have evaluated and offered advice about the strategic plan at my university.
2 Procedures for assessing goal attainment are clearly stated at my institution.
3 Institutional research (data collection and analysis) is an integral part of my institution's strategic planning process.
4 My institution dedicates resources to strategic planning activities.
5 My institution has a systematic process of strategic planning.
6 I serve (or have served) on a strategic planning committee or workgroup at my institution.
7 I am (or have been) involved in strategic planning activities at my institution.
8 Specific changes at my institution have occurred through systematic evaluation of our strategic planning results.
9 The benefits of strategic planning are not worth the resources we invest in our process.
10 Our strategic planning process has been developed to "fit" the institution.
11 My institution is committed to allocating resources to remedy areas of weakness found through the strategic planning process.
12 I use the results of strategic planning activities to evaluate the distribution of resources.
13 Are you involved in the collection of data for the strategic plan? (Yes / No)
14 If yes, what aspects of the data collection do you provide? (Check all that apply: Provide Data, Data Collection, Data Analysis, Interpret Results, Final Evaluation)
15 In what office, area, or unit are you?

16 What offices, areas, or units are the main participants in the evaluation of the strategic plan?

17 Briefly describe the method or steps your institution employs to evaluate the strategic plan.
Thank you for your participation!
APPENDIX D
SURVEY INVITATION
February 2007

<<Participant Name>>
<<Participant Address>>
<<City>> <<State>> <<Zip Code>>

Dear <<Participant Name>>:

All too often, strategic planning in academia ultimately proves futile due to the lack of deliberate assessment of progress made on the plan’s goals, even though this critical phase can help the institution evaluate the effectiveness of its plan, reassess the plan’s evolution, and continue to build upon its successes. I am therefore conducting dissertation research on the evaluation of strategic plan initiatives. Specifically, my research project will take the form of a case study focusing on a handful of AAU institutions with campus-wide strategic plans. It will identify methodological strengths and challenges from which new strategies may be derived, as well as ways in which institutions may improve their own strategic planning processes.

You have been selected to participate in this research study because of your experience with your institution’s strategic plan initiative. Your perceptions will directly inform this study and, in turn, contribute to the growing literature on strategic planning. They may also prove useful to a number of planning officers in the Committee on Institutional Cooperation (CIC) who have expressed strong interest in the results.

I hope you will be willing to complete a short survey asking for your opinions regarding the various components involved in evaluating your institution’s strategic plan. The time required to participate is minimal: approximately 10 minutes or less. You may complete the survey through the following website:

URL: http://web.ics.purdue.edu/~mld/planevaluation.htm

Due to the relatively small number of people with experience in strategic planning, a high response rate is critical to the success of my research, and I sincerely hope you will choose to participate. If, however, you do not wish to participate in this study, you can avoid future correspondence by contacting me via e-mail at [email protected] or by phone at 765-494-7108. Please note that your responses will be completely anonymous.
Whether you agree to participate or not, I will not identify you or your institution in any paper or presentation.

Thank you in advance for your assistance with this research.

Sincerely,

Margaret Dalrymple
Ph.D. Candidate in Higher Education Administration
Department of Educational Studies
Purdue University
APPENDIX E
INTERVIEW QUESTIONS
1. How would you describe the methods or steps your institution employs in evaluating the strategic plan?
2. Are the originally formulated strategic plan measures still the primary determining factors in the evaluation of the strategic plan?
3. Are there other factors that figure into the evaluation of the strategic plan?
4. Are the procedures for assessing goal attainment well-understood by the main participants?
5. Who is actively involved in the evaluation of the strategic plan?
6. Have the results of the strategic planning activities been used to evaluate the distribution of resources?
7. Did the strategic plan funds received match the original plan of the specific resources identified in the strategic plan?
8. Did the allocation of the strategic plan funds received match the originally planned strategic plan goals?
9. From your perspective, is your institution committed to allocating resources to remedy areas of weakness identified through the evaluation of the strategic plan?
10. Having now systematically evaluated your strategic plan results, what specific changes have occurred at your institution as a result?
11. Now in hindsight, is there anything in particular that you would have done differently?
APPENDIX F
INTERVIEW INVITATION
February 2007

<<Participant Name>>
<<Participant Address>>
<<City>> <<State>> <<Zip Code>>

Dear <<Participant Name>>:

As a doctoral candidate in the Department of Educational Studies at Purdue University, I am currently conducting dissertation research on the evaluation of strategic plan initiatives. In reviewing the methodologies used to evaluate campus-wide strategic plans at a handful of AAU institutions, my study will identify relative strengths and challenges in the process and suggest ways in which similar institutions might improve their own strategic planning processes. I believe this study will prove widely beneficial by contributing to the growing literature on strategic planning, detailing various evaluation methods currently in use, and offering a potential template for future evaluation methods. Indeed, several planning officers in the Committee on Institutional Cooperation (CIC) have already expressed interest in the results.

I am therefore writing to ask whether you would be willing to share some of the insights you have gained in evaluating your campus’ strategic plan. Your perceptions would directly inform this study and, in turn, contribute to the growing literature on strategic planning. Specifically, I would like to conduct a confidential semi-formal interview with you, or a designee, to discuss your opinions on the various components of the evaluation of your institution’s strategic plan.

Thank you very much for considering my request. I will be in touch with you in the near future to discuss your thoughts on this matter and, should you be amenable, to schedule an interview time.

Sincerely,

Margaret Dalrymple
Ph.D. Candidate in Higher Education Administration
Department of Educational Studies
Purdue University
APPENDIX G
EXPERT PANEL EVALUATION FORM
Please review the interview questions for their appropriateness and clarity, and provide your comments, suggestions, and overall evaluation. Your feedback is an important contribution to the validity of this study.

Scale for the evaluation: 5 = Excellent, 4 = Good, 3 = Average, 2 = Below Average, 1 = Poor

1. How would you describe the methods or steps your institution employs in evaluating the strategic plan?
2. Are the originally formulated strategic plan measures still the primary determining factors in the evaluation of the strategic plan?
3. Are there other factors that figure into the evaluation of the strategic plan?
4. What are the procedures for assessing goal attainment?
5. Is the evaluation of the strategic plan, including the data collection and analysis, an integral part of the institution’s planning process?
6. Who is actively involved in the evaluation of the strategic plan?
7. Have the results of the strategic planning activities been used to evaluate the distribution of resources?
8. Did the strategic plan funds received match the original plan of the specific resources identified in the strategic plan?
9. Did the allocation of the strategic plan funds received match the originally planned strategic plan goals?
10. From your perspective, is your institution committed to allocating resources to remedy areas of weakness identified through the evaluation of the strategic plan?
11. Having now systematically evaluated your strategic plan results, what specific changes have occurred at your institution as a result?
12. Now in hindsight, is there anything in particular that you would have done differently?
Please provide any comments or suggestions.
APPENDIX H
DOCUMENTS REVIEWED
University A
The original plan
Annual reports (five years)
Thirteen news releases
Websites:
  The strategic plan
  Board of Trustees
  Office of the President
  Office of the Provost
  Office of Institutional Research

University B
The original plan
Annual reports (six years)
One journal article
Three Board of Trustees speeches
Three university-wide memorandums
Ten news releases
Minutes from a Board of Trustees meeting
Minutes from a faculty governance meeting
Minutes from a question-and-answer meeting with the President
Presentation to the Board of Trustees
Websites:
  The Board of Trustees
  Office of the President
  Office of Academic Affairs
  Office of Business and Finance
  Office of Institutional Research and Planning
University C
The original plan
Annual reports (four years)
A university-wide memorandum
Minutes from a Board of Trustees meeting
Two journal articles
Two Board of Trustees speeches
Two news releases
Three newsletters
Three brochures
Six presentations to Board of Trustees meetings
Websites:
  The Board of Trustees
  Office of the President
  Office of the Executive Vice President and Provost
  Office of Planning and Institutional Assessment
  Office of Finance and Budget

University D
The original plan
Draft of the original plan
Annual reports (three years)
Two data reports
Two presentations to Board of Trustees meetings
Minutes from four Board of Trustees meetings
Seven presidential speeches
Twelve news releases
Websites:
  The Board of Trustees
  Office of the Chancellor
  Office of the Executive Vice Chancellor and Provost
  Office of Institutional Research and Assessment
  Office of Financial Planning and Budget
  The accreditation re-affirmation process
APPENDIX I
INTERVIEW SUBJECTS
University A
  Provost
  Director of Strategic Planning and Assessment
  Vice Provost for Research
  Vice Provost for Engagement
  Director of Budget and Fiscal Planning
  Director of Institutional Research

University B
  Executive Vice President of Business and Treasurer
  Executive Vice Provost for Academic Affairs
  Director of Institutional Research

University C
  Executive Vice President of Business and Treasurer
  Executive Director of the Office of Planning and Institutional Assessment
  Director of Planning Research and Assessment

University D
  Executive Associate Provost
  Interim Director of Financial Planning and Budget
APPENDIX J
INTERVIEW QUESTIONS INCORPORATED INTO LOGIC MODEL
Interview Questions and Corresponding Logic Model Components

1. How would you describe the methods or steps your institution employs in evaluating the strategic plan?
   Output: an analysis of goals and objectives in measurable units
2. Are the originally formulated strategic plan measures still the primary determining factors in the evaluation of the strategic plan?
   Input: a reflection of the progress made toward the goals
3. Are there other factors that figure into the evaluation of the strategic plan?
   Input: influences and resources
4. What are the procedures for assessing goal attainment?
   Output: an analysis of goals and objectives in measurable units
5. Is the evaluation of the strategic plan, including the data collection and analysis, an integral part of the institution’s planning process?
   Outcome: short-term outcomes that include specific changes in attitudes, behaviors, knowledge, and skills
6. Have the results of the strategic planning activities been used to evaluate the distribution of resources?
   Outcome: long-term outcomes
7. Who is actively involved in the evaluation of the strategic plan?
   Outcome: short-term outcomes that include specific changes in attitudes, behaviors, knowledge, and skills
8. Did the strategic plan funds received match the original plan of the specific resources identified in the strategic plan?
   Activities: the resources needed to implement the program and the intentions of the program
9. Did the allocation of the strategic plan funds received match the originally planned strategic plan goals?
   Activities: the resources needed to implement the program and the intentions of the program
10. From your perspective, is your institution committed to allocating resources to remedy areas of weakness identified through the evaluation of the strategic plan?
    Outcome: long-term outcomes
11. Having now systematically evaluated your strategic plan results, what specific changes have occurred at your institution as a result?
    Impact: reflection on the effectiveness and magnitude of the strategic plan
12. Now in hindsight, is there anything in particular that you would have done differently?
    Impact: reflection on the effectiveness and satisfaction of the strategic plan
APPENDIX K
UNIVERSITY A SURVEY RESULTS
Each item is followed by the percent that responded with “Somewhat Agree” or “Strongly Agree” and the number of respondents.
Input Questions Our strategic planning process has been developed to "fit" the institution.
72% 13
The vision of my institution is clearly defined.
100% 18
The institutional mission is central to the strategic planning process.
88% 16
Faculty, staff, and administrators are encouraged to participate in the long-range planning process of the institution.
88% 16
Activities Questions Procedures for assessing goal attainment are clearly stated at my institution.
94% 17
I serve (or have served) on a strategic planning committee or workgroup at my institution.
94% 17
I am (or have been) involved in strategic planning activities at my institution.
88% 16
Faculty, staff, and administrators are actively involved in the strategic planning process.
77% 14
I have engaged in specific planning exercises to aid in my institution's strategic planning activities.
77% 14
I have helped formulate outcome measures to measure progress.
77% 14
Output Questions My institution has a systematic process of strategic planning.
94% 17
Day-to-day decisions made by institutional decision-makers are consistent with long-range goals and objectives.
83% 15
My institution has a formal process for gathering and sharing internal and external data.
94% 17
Outcome Questions Institutional research (data collection and analysis) is an integral part of my institution's strategic planning process.
94% 17
My institution dedicates resources to strategic planning activities.
94% 17
My institution is committed to allocating resources to remedy areas of weakness found through the strategic planning process.
94% 17
The results of strategic planning activities are used to evaluate the distribution of resources.
83% 15
Impact Questions I have evaluated and offered advice about the strategic plan at my university.
88% 16
Specific changes at my institution have occurred through systematic evaluation of our strategic planning results.
88% 16
The benefits of strategic planning are not worth the resources we invest in our process. (Item reverse coded)
94% 17
Strategic planning plays an important role in the success of my institution.
94% 17
Efforts to evaluate the strengths, weaknesses, opportunities, and threats of my institution are worthwhile.
94% 17
Resources dedicated to strategic planning activities are investments in the long-term health of my institution.
83% 15
Strategic planning activities do little to help our institution to fulfill its mission. (Item reverse coded)
88% 16
APPENDIX L
UNIVERSITY B SURVEY RESULTS
Each item is followed by the percent that responded with “Somewhat Agree” or “Strongly Agree” and the number of respondents.
Input Questions Our strategic planning process has been developed to "fit" the institution.
89% 8
The vision of my institution is clearly defined.
89% 8
The institutional mission is central to the strategic planning process.
100% 9
Faculty, staff, and administrators are encouraged to participate in the long-range planning process of the institution.
67% 6
Activities Questions Procedures for assessing goal attainment are clearly stated at my institution.
100% 9
I serve (or have served) on a strategic planning committee or workgroup at my institution.
55% 5
I am (or have been) involved in strategic planning activities at my institution.
89% 8
Faculty, staff, and administrators are actively involved in the strategic planning process.
89% 8
I have engaged in specific planning exercises to aid in my institution's strategic planning activities.
89% 8
I have helped formulate outcome measures to measure progress.
78% 7
Output Questions My institution has a systematic process of strategic planning.
78% 7
Day-to-day decisions made by institutional decision-makers are consistent with long-range goals and objectives.
89% 8
My institution has a formal process for gathering and sharing internal and external data.
89% 8
Outcome Questions Institutional research (data collection and analysis) is an integral part of my institution's strategic planning process.
100% 9
My institution dedicates resources to strategic planning activities.
89% 8
My institution is committed to allocating resources to remedy areas of weakness found through the strategic planning process.
78% 7
The results of strategic planning activities are used to evaluate the distribution of resources.
89% 8
Impact Questions I have evaluated and offered advice about the strategic plan at my university.
89% 8
Specific changes at my institution have occurred through systematic evaluation of our strategic planning results.
89% 8
The benefits of strategic planning are not worth the resources we invest in our process. (Item reverse coded)
100% 9
Strategic planning plays an important role in the success of my institution.
89% 8
Efforts to evaluate the strengths, weaknesses, opportunities, and threats of my institution are worthwhile.
100% 9
Resources dedicated to strategic planning activities are investments in the long-term health of my institution.
89% 8
Strategic planning activities do little to help our institution to fulfill its mission. (Item reverse coded)
100% 9
APPENDIX M
UNIVERSITY C SURVEY RESULTS
Each item is followed by the percent that responded with “Somewhat Agree” or “Strongly Agree” and the number of respondents.
Input Questions Our strategic planning process has been developed to "fit" the institution.
89% 8
The vision of my institution is clearly defined.
78% 7
The institutional mission is central to the strategic planning process.
89% 8
Faculty, staff, and administrators are encouraged to participate in the long-range planning process of the institution.
100% 9
Activities Questions Procedures for assessing goal attainment are clearly stated at my institution.
78% 7
I serve (or have served) on a strategic planning committee or workgroup at my institution.
100% 9
I am (or have been) involved in strategic planning activities at my institution.
89% 8
Faculty, staff, and administrators are actively involved in the strategic planning process.
100% 9
I have engaged in specific planning exercises to aid in my institution's strategic planning activities.
100% 9
I have helped formulate outcome measures to measure progress.
78% 7
Output Questions My institution has a systematic process of strategic planning.
100% 9
Day-to-day decisions made by institutional decision-makers are consistent with long-range goals and objectives.
89% 8
My institution has a formal process for gathering and sharing internal and external data.
100% 9
Outcome Questions Institutional research (data collection and analysis) is an integral part of my institution's strategic planning process.
100% 9
My institution dedicates resources to strategic planning activities.
100% 9
My institution is committed to allocating resources to remedy areas of weakness found through the strategic planning process.
78% 7
The results of strategic planning activities are used to evaluate the distribution of resources.
78% 7
Impact Questions I have evaluated and offered advice about the strategic plan at my university.
100% 9
Specific changes at my institution have occurred through systematic evaluation of our strategic planning results.
78% 7
The benefits of strategic planning are not worth the resources we invest in our process. (Item reverse coded)
89% 8
Strategic planning plays an important role in the success of my institution.
89% 8
Efforts to evaluate the strengths, weaknesses, opportunities, and threats of my institution are worthwhile.
89% 8
Resources dedicated to strategic planning activities are investments in the long-term health of my institution.
89% 8
Strategic planning activities do little to help our institution to fulfill its mission. (Item reverse coded)
89% 8
APPENDIX N
UNIVERSITY D SURVEY RESULTS
Each item is followed by the percent that responded with “Somewhat Agree” or “Strongly Agree” and the number of respondents.
Input Questions Our strategic planning process has been developed to "fit" the institution.
78% 7
The vision of my institution is clearly defined.
89% 8
The institutional mission is central to the strategic planning process.
89% 8
Faculty, staff, and administrators are encouraged to participate in the long-range planning process of the institution.
78% 7
Activities Questions Procedures for assessing goal attainment are clearly stated at my institution.
89% 8
I serve (or have served) on a strategic planning committee or workgroup at my institution.
89% 8
I am (or have been) involved in strategic planning activities at my institution.
100% 9
Faculty, staff, and administrators are actively involved in the strategic planning process.
55% 5
I have engaged in specific planning exercises to aid in my institution's strategic planning activities.
89% 8
I have helped formulate outcome measures to measure progress.
78% 7
Output Questions My institution has a systematic process of strategic planning.
44% 4
Day-to-day decisions made by institutional decision-makers are consistent with long-range goals and objectives.
56% 5
My institution has a formal process for gathering and sharing internal and external data.
100% 9
Outcome Questions Institutional research (data collection and analysis) is an integral part of my institution's strategic planning process.
89% 8
My institution dedicates resources to strategic planning activities.
100% 9
My institution is committed to allocating resources to remedy areas of weakness found through the strategic planning process.
56% 5
The results of strategic planning activities are used to evaluate the distribution of resources.
78% 7
Impact Questions I have evaluated and offered advice about the strategic plan at my university.
89% 8
Specific changes at my institution have occurred through systematic evaluation of our strategic planning results.
78% 7
The benefits of strategic planning are not worth the resources we invest in our process. (Item reverse coded)
78% 7
Strategic planning plays an important role in the success of my institution.
44% 4
Efforts to evaluate the strengths, weaknesses, opportunities, and threats of my institution are worthwhile.
100% 9
Resources dedicated to strategic planning activities are investments in the long-term health of my institution.
78% 7
Strategic planning activities do little to help our institution to fulfill its mission. (Item reverse coded)
89% 8
VITA
MARGARET L. DALRYMPLE
3009 Georgeton Road
West Lafayette, IN 47906

EDUCATION
PH.D. IN HIGHER EDUCATION ADMINISTRATION, 2007
Purdue University, West Lafayette, Indiana
M.A. IN SOCIOLOGY, 1992
University of Colorado, Colorado Springs, Colorado
B.A. IN SOCIOLOGY AND PSYCHOLOGY, 1990
Augustana College, Sioux Falls, South Dakota
EXPERIENCE
ASSISTANT DIRECTOR OF INSTITUTIONAL RESEARCH, 2005 to Present
Office of Institutional Research, Purdue University, West Lafayette, Indiana
SENIOR INSTITUTIONAL RESEARCH ANALYST, 2003 to 2005
Office of Institutional Research, Purdue University, West Lafayette, Indiana
ASSOCIATE REGISTRAR FOR RESEARCH, 2000 to 2003
Office of the Registrar, Purdue University, West Lafayette, Indiana
RESEARCH ANALYST, 1995 to 2000
Office of the Registrar, Purdue University, West Lafayette, Indiana
RESEARCH ASSISTANT, 1992
Center for Social Science Research, University of Colorado, Colorado Springs, Colorado
EXPERTISE: SPECIAL PROJECTS
Investment Return Analysis: Designer and coordinator of data collection for Purdue University West Lafayette Campus
Strategic Plan Metrics and Benchmarks Progress Report: Coordinator of data collection for Purdue University System
President Forums: Coordinator of data collection and designer for Purdue University West Lafayette Campus
Board of Trustees Governance Report: Coordinator of data collection and designer for Purdue University System
Graduating Students Learning Outcomes Survey: Coordinator and administrator of the Purdue University survey (West Lafayette Campus)
Higher Education Research Institute (HERI) College Student Survey (CSS): Administrator for Purdue University West Lafayette Campus
Higher Education Research Institute (HERI) Cooperative Institutional Research Program (CIRP) Survey: Analyst for Purdue University West Lafayette Campus
National Study of Faculty and Students (NSoFaS): Coordinator of data collection for Purdue University West Lafayette Campus
Enrollment Projections: Generated fall and spring enrollment projections for Purdue University West Lafayette Campus
Integrated Postsecondary Education Data System (IPEDS) Enrollment Survey, Completions Survey, and Graduation Rates Survey: Completed for Purdue University System
Indiana Commission for Higher Education (ICHE) Legislature Request for institutional data: Completed for Purdue University System
PROFESSIONAL ACTIVITIES
Member-at-Large: Membership Committee Chair, Indiana Association for Institutional Research, 2006-07
Presentation Proposals Reviewer for Track Four: Informing Institutional Management and Planning
Association for Institutional Research National Conference, Seattle, Washington, May 2008
Presentation Proposals Reviewer for Track Five: Higher Education Collaborations, Policy Issues, and Accountability
Association for Institutional Research National Conference, Chicago, Illinois, June 2006

Presentation Proposals Reviewer for Track Five: The Practice of Institutional Research
Association for Institutional Research National Conference, Tampa, Florida, June 2003

Newcomers’ Orientation Workshop Session Presenter
Indiana Association for Institutional Research Conference, Nashville, Indiana, April 2003
Indiana Association for Institutional Research Conference, Nashville, Indiana, March 2001
Association for Institutional Research National Conference, Cincinnati, Ohio, May 2000
Indiana Association for Institutional Research Conference, Nashville, Indiana, March 2000

Newcomers’ Workshop Committee Member
Association for Institutional Research National Conference, Toronto, Ontario, Canada, June 2002

Session Facilitator
Association for Institutional Research National Conference, 2005, 2003, 2001, 2000, 1999, & 1998
MEMBERSHIPS IN PROFESSIONAL ASSOCIATIONS
Association for Institutional Research (AIR)
Association for the Study of Higher Education (ASHE)
Association of American University Data Exchange (AAUDE)
Indiana Association for Institutional Research (INAIR)

PRESENTATIONS
“INAIR Best Presentation: Communicating the Strategic Plan”
Association for Institutional Research National Conference, Kansas City, Missouri, June 2007
“Lessons Learned about Communicating the Strategic Plan”
Association for Institutional Research National Conference, Chicago, Illinois, May 2006

“The Missing Link: Evaluating a Strategic Plan Initiative”
Association for Institutional Research National Conference, San Diego, California, May 2005

“Getting the Word Out: Communicating the Strategic Plan”
Indiana Association for Institutional Research Conference, Greencastle, Indiana, March 2005

"The Perceptions of Indiana Institutional Researchers on the Role of Institutional Research in the Strategic Planning Process"
Indiana Association for Institutional Research Conference, Terre Haute, Indiana, April 2004

“Beyond Guesswork: One University’s Example of Projecting Undergraduate Enrollment”
Association for Institutional Research National Conference, Tampa, Florida, May 2003

"The Adventures of Enrollment Projections"
Indiana Association for Institutional Research Conference, Nashville, Indiana, April 2003

"Ingredients for Enrollment Projections: Lots of Statistics, a Bit of Gut-feeling, Some Guesswork, and a Little Luck"
Indiana Association for Institutional Research Conference, Nashville, Indiana, April 2003

"Effective Report Preparation: Streamlining the Reporting Process"
Indiana Association for Collegiate Registrar and Admission Officers Conference, Indianapolis, Indiana, October 1999

"Effective Report Preparation: Streamlining the Reporting Process"
Association for Institutional Research National Conference, Seattle, Washington, May 1999

"Effective Report Preparation: Streamlining the Reporting Process"
Indiana Association for Institutional Research Conference, Nashville, Indiana, March 1999
"A New Focus for Institutional Researchers: Developing and Using a Student Decision Support System"
Association for Institutional Research National Conference, Minneapolis, Minnesota, May 1998

"A New Focus for Institutional Researchers: Developing and Using a Student Decision Support System"
Indiana Association for Institutional Research Conference, Nashville, Indiana, March 1998

"A Study of the Delimitation of the Dissertation Process"
Western Social Science Association Conference, Denver, Colorado, April 1991

HONORS AND AWARDS
Years of Service Recognition, 2006
Office of the Provost, Purdue University
Best Presentation, 2005
Indiana Association for Institutional Research
Individual Professional Development Grant, 1999
Administrative and Professional Staff Advisory Committee, Purdue University
Student Services New Professional Award Nominee, 1997
Office of the Registrar, Purdue University

PUBLICATIONS
"A New Focus for Institutional Researchers: Developing and Using a Student Decision Support System," Association for Institutional Research Professional File (Winter 1999)
Book review of Ambassadors of U.S. Higher Education: Quality Credit-Bearing Programs Abroad, Eds. J. Dupree and M. P. Lenn, in College & University 74, 2