The CIPP Organizational Process Model - An Application to Military Test Range Management
Joe Stufflebeam, Ph.D., White Sands Missile Range, NM
Daniel Stufflebeam, Ph.D., Western Michigan University
(Distinguished University Professor - Retired)
International Test & Evaluation Association, 33rd Annual International Test and Evaluation Symposium
Reston, VA, 05 October 2016
DISTRIBUTION A: Approved for public release; distribution is unlimited.
OPSEC review conducted on 27 SEP 2016
Need Statement
• Define and develop an efficient and effective modern implementation of the “Mission Execution Process”
The Universal Documentation System (UDS) Mission Execution Process
The current mission execution process is driven by the RCC-defined Universal Documentation System (UDS)
Goal: Maintain the functionality of the UDS process, but implement it in a more efficient and effective way
Universal Documentation System (defines documentation for requirements & resource commitments)
Program Introduction (PI): The initial planning document submitted by a potential user to the support range immediately upon identification of general program requirements and schedules.
Statement of Capability (SC): The range’s response to the PI. It provides the user with a preliminary cost estimate, acceptance of the program and/or prerequisites for support.
(Customer ↔ Range document exchange)
Operations Requirements (OR): The OR is a mission-oriented document that describes in detail the specific requirements for each mission, special test, or series of tests. It is prepared by the user.
Program Requirements Document (PRD): The PRD, normally used for complex or long lead-time programs, contains detailed program support requirements identified by the user.
Operations Directive (OD): The OD is the range’s response to the OR and is a detailed plan for implementation of support functions for a specific test or series of tests.
Program Support Plan (PSP): The PSP is the range’s response to the PRD and contains information relating to support commitments, including any alternatives.
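The three document pairs above form a simple request/response protocol: each customer submission obligates a specific range response (PI → SC, OR → OD, PRD → PSP). A minimal sketch of that pairing follows; the document abbreviations are from the UDS as described above, but the function and data structure are illustrative assumptions, not an actual range system:

```python
# Each customer-submitted UDS document and the range response it obligates.
# Mapping per the RCC Universal Documentation System described above.
RESPONSE_FOR = {
    "PI": "SC",    # Program Introduction -> Statement of Capability
    "OR": "OD",    # Operations Requirements -> Operations Directive
    "PRD": "PSP",  # Program Requirements Document -> Program Support Plan
}

def outstanding_responses(submitted, issued):
    """Return the range responses still owed, given the customer documents
    submitted and the range documents already issued (hypothetical helper)."""
    owed = [RESPONSE_FOR[doc] for doc in submitted if doc in RESPONSE_FOR]
    return [doc for doc in owed if doc not in issued]
```

For example, a program that has submitted a PI and an OR, and received only the SC, is still owed the OD.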
Requirements for an Enhanced Process
• Efficiency (Faster, Cheaper)
– Shorter time to test while maintaining information exchange rigor
– Improve mission throughput
– Quicker turn-around
– More timely delivery of data products
– Reduce operational costs and institutional costs
– Eliminate efforts and data products that have little or no value
• Effectiveness (Better)
– Maximize the quality of data products
– Create new data products that add value to the customer
– Ensure data products are tied to customer requirements
– Ensure customer understands their requirements vs. expected data products
How to Meet the Need?
• Implement a mission execution process and toolset that facilitates convergence to, and sustainment of, quality and efficiency
• Implement a toolset that is designed to:
– Meet stated requirements
– Improve management insight into details of the overall process
– Improve communication across mission execution components
– Provide meaningful and consistent feedback through integrated evaluation & assessment
– Improve the transition and integration of development efforts and other investments
– Increase standardization and automation; eliminate developmental stovepipes
Generalized Range Mission Execution Process
[Diagram: Customer Request → business process (Capture, Validate & Prioritize Requirements → Planning, Budgeting, Scheduling & Design → Mission Execution → Analysis, Outcomes, Results & Data Products) → Delivered Products; governed by Operational SOPs & Work Instructions]
• How do we support this process?
• How do we evaluate the performance of this process?
• How do we manage this process?
• How do we improve this process?
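One way to make the "how do we evaluate the performance of this process?" question concrete is to instrument each stage of the generalized pipeline with timing capture, so turn-around and throughput can be trended. The sketch below is an illustrative assumption, not an existing range tool; the stage names are taken from the process diagram:

```python
import time

# Stages of the generalized range mission execution process, in order.
STAGES = [
    "Capture, Validate & Prioritize Requirements",
    "Planning, Budgeting, Scheduling & Design",
    "Mission Execution",
    "Analysis, Outcomes, Results & Data Products",
]

def run_mission(stage_handlers):
    """Run each stage handler in order, capturing wall-clock duration per
    stage so turn-around metrics can feed a performance dashboard."""
    metrics = {}
    for stage in STAGES:
        start = time.perf_counter()
        stage_handlers[stage]()          # do the stage's work
        metrics[stage] = time.perf_counter() - start
    return metrics
```

Per-stage durations captured this way are the kind of raw data a metrics dashboard needs to show where "quicker turn-around" gains are actually available.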
Leveraging Well Established Process Models
Lean Six Sigma - Origins in Manufacturing
• Tailored for building large numbers of the same thing
• Focused on eliminating waste and variation
Context-Input-Process-Product (CIPP) - Origins in Government & Education
• The CIPP Evaluation Model is a systematic approach to evaluation
• Designed to assist in the planning and execution of programs
• Facilitates accountability within an organization
[Toolset diagram: Process Improvement Documentation Repository, Performance Metrics Dashboard, QA Analysis]
• Visualization with Pre-Mission Plug-ins: Pre-Mission - Mission Planning Tool
• Visualization with Real-Time Plug-ins: Real-Time Visualization Tool
• Visualization with Data Reduction Plug-ins: Post-Test - Target Miss Distance Tool
Moving Forward
• Develop a mission process evaluation manual and checklists, built on the CIPP model
• Build automation capabilities into software systems
• Develop a database and metrics dashboard to house performance information
• Develop a management strategy and an implementation plan for defining, executing and monitoring process improvement efforts
Final Thoughts
• The UDS process:
– Requires a significant amount of time and resources in its current implementation.
– Current tools are inefficient and don’t facilitate automation. As a result, the process has room for improvement.
• The current state of affairs at the range dictates a more efficient and effective process.
• The CIPP process model
– Provides the required avenue for an efficient and effective realization of the mission execution process.
– Assists in the planning and execution of programs
– Facilitates accountability within an organization
Questions?
Joe Stufflebeam, Ph.D., TRAX International
White Sands Missile Range, NM
CIPP Evaluation
• Context evaluations assess needs, problems, assets, and opportunities, plus relevant contextual conditions and dynamics.
• Input evaluations assess a program’s strategy, action plan, staffing arrangement, and budget for feasibility and potential cost-effectiveness to meet targeted needs and achieve goals. An input evaluation may be comparative as in identifying and assessing optional ways to achieve a program’s goals, or non-comparative in assessing a single plan and its components.
• Process evaluations monitor, document, assess, and report on the implementation of program plans. Such evaluations provide feedback throughout a program’s implementation and later report on the extent to which the program was carried out as intended and required.
• Product evaluations identify and assess a program’s costs and outcomes, intended and unintended, short term and long term. These evaluations provide feedback during a program’s implementation on the extent to which program goals are being addressed and achieved; at the program’s end, impact evaluations identify and assess the program’s full range of accomplishments. The key questions are: Did the program achieve its goals? Did it successfully address the targeted needs and problems? What were any unexpected outcomes, both positive and negative? Were the program’s outcomes worth their cost?
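The four evaluation types can be carried as a simple taxonomy when logging evaluation findings for a program. A minimal sketch, where the enum values summarize the definitions above but the class and function names are hypothetical, not part of the CIPP literature:

```python
from enum import Enum
from collections import defaultdict

class CIPP(Enum):
    """The four CIPP evaluation types, per the definitions above."""
    CONTEXT = "needs, problems, assets, opportunities"
    INPUT = "strategy, action plan, staffing, budget"
    PROCESS = "implementation monitoring and feedback"
    PRODUCT = "costs and outcomes, intended and unintended"

def group_findings(findings):
    """Group (evaluation type, finding text) pairs by CIPP type so each
    of the four evaluations can be reported on separately."""
    grouped = defaultdict(list)
    for kind, text in findings:
        grouped[kind].append(text)
    return grouped
```

Keeping findings keyed by evaluation type makes it straightforward to check that all four CIPP perspectives were actually covered during a program review.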
CIPP Evaluation
• Formative Evaluation: An evaluation that proactively assesses a program from start to finish. It regularly issues feedback to assist the formulation of goals and priorities, provide direction for planning by assessing alternative courses of action and draft plans, guide program management by assessing and reporting on implementation of plans and interim results, and supply a record of collected formative information and how it was used.
• Summative Evaluation: A comprehensive evaluation of a program after it has been completed. It draws together and supplements previous evaluative information to provide an overall judgment of the program’s value. Such evaluations help interested audiences decide whether a program—refined through development and formative evaluation—achieved its goals, met targeted needs, constitutes a significant contribution in the program’s substantive area, and is worth what it cost.
CIPP Evaluation: Evaluation Roles vs. Types of Evaluation (Context, Input, Process, Product)

Formative: Proactive application of descriptive and judgmental information to assist decision making, program implementation, quality assurance, and accountability
– Context: Guidance for identifying needed interventions, choosing goals, and setting priorities by assessing and reporting on needs, problems, risks, assets, and opportunities
– Input: Guidance for choosing a program strategy (and possibly an outside contractor) and settling on a sound implementation plan and budget by assessing and reporting on alternative strategies and resource allocation plans, and subsequently closely examining and judging the operational plan and budget
– Process: Guidance for executing the operational plan by monitoring, documenting, judging, and repeatedly reporting on program activities and expenditures
– Product: Guidance for continuing, modifying, certifying, or terminating the program by identifying, assessing, and reporting on intermediate and longer-term outcomes, including side effects

Summative: Retroactive use of descriptive and judgmental information to sum up the program’s value, e.g., its quality, efficiency, cost, practicality, safety, impact, and significance
– Context: Judging goals and priorities by comparing them to assessed needs, problems, risks, assets, and opportunities
– Input: Judging the implementation plan and budget by comparing them to targeted needs, problems, and risks; contrasting the plan and budget with critical competitors; and assessing their compatibility with the implementation environment and compliance with relevant codes, regulations, and laws
– Process: Judging program execution by fully describing and assessing the actual process and costs, comparing the planned and actual processes and costs, and assessing compliance with relevant codes, regulations, and laws
– Product: Judging the program’s success by comparing its outcomes and side effects to targeted goals, needs, problems, and risks; examining its cost-effectiveness; and, as feasible, contrasting its costs and outcomes with competitive programs; also interpreting results against the effort’s outlay of resources and the extent to which the operational plan was both sound and effectively executed
A Simplified View of a Process Model for Mission Execution (WSMR)
[Diagram: Customer Request → Mission Execution Process → Delivered Products; Process Support Elements (Personnel, Operational SOPs & Instrumentation; Tools & Infrastructure) underpin the process; Performance Evaluation & Metrics Capture feeds a Process Analysis & Performance Information System, which informs Management; Management directs Improvement and Improved Capabilities back into the process]
A Quick Digression to Look at the Effectiveness of the Feedforward-Feedback Process Model
Precision Control of a Defined Process “G(s)” (machine tool, missile tracker, etc.)
Confirmation of the Feedforward-Feedback Control Architecture
Error definition:

$$E_0(s) = R(s) - Y(s)$$

Closed-loop output with acceleration and velocity feedforward, PI control on the error, and rate feedback:

$$Y(s) = G(s)\left[\left(K_{aff}\,s^2 + K_{vff}\,s\right)R(s) + \left(K_p + \frac{K_i}{s}\right)E_0(s) - K_d\,s\,Y(s)\right]$$

Substituting into the error definition and using $Y(s) = R(s) - E_0(s)$:

$$E_0(s) = R(s) - \left(K_{aff}\,s^2 + K_{vff}\,s - K_d\,s\right)G(s)R(s) - \left(K_p + \frac{K_i}{s} + K_d\,s\right)G(s)E_0(s)$$

Solving for the error:

$$E_0(s) = \frac{1 - \left(K_{aff}\,s^2 + (K_{vff} - K_d)\,s\right)G(s)}{1 + \left(K_p + \dfrac{K_i}{s} + K_d\,s\right)G(s)}\,R(s)$$

For the plant $G(s) = \dfrac{K_1}{s(s + K_2)}$:

$$E_0(s) = \frac{(1 - K_1 K_{aff})\,s^3 + (K_2 + K_1 K_d - K_1 K_{vff})\,s^2}{s^3 + (K_2 + K_1 K_d)\,s^2 + K_1 K_p\,s + K_1 K_i}\,R(s)$$

Choosing the feedforward gains to cancel the numerator:

$$K_{aff} = \frac{1}{K_1}, \qquad K_{vff} = \frac{K_2 + K_1 K_d}{K_1}$$

Steady-State Errors:

$$e_{ss} = \lim_{s \to 0} s\,E_0(s) = \lim_{s \to 0} \frac{(1 - K_1 K_{aff})\,s^3 + (K_2 + K_1 K_d - K_1 K_{vff})\,s^2}{s^3 + (K_2 + K_1 K_d)\,s^2 + K_1 K_p\,s + K_1 K_i}\,s\,R(s)$$

With the gains above, the numerator vanishes identically, so the steady-state error is zero for step, ramp, and parabolic references.
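The zero-error claim can be checked numerically. The sketch below assumes the second-order plant $G(s) = K_1/(s(s+K_2))$, PI feedback with rate feedback $K_d$, and the feedforward gains $K_{aff} = 1/K_1$, $K_{vff} = (K_2 + K_1 K_d)/K_1$; all numeric gain values are illustrative choices, not range parameters. A simple Euler integration tracks a parabolic reference $r(t) = t^2$ and compares the full feedforward-feedback loop against feedback alone:

```python
def track(Kaff, Kvff, K1=2.0, K2=3.0, Kp=10.0, Ki=5.0, Kd=1.0,
          dt=1e-4, T=5.0):
    """Simulate the feedforward-feedback loop for r(t) = t^2 and return
    the final tracking error e(T). Plant: y'' = -K2*y' + K1*u."""
    y = v = ie = 0.0          # position, rate, integral of error
    t = 0.0
    for _ in range(int(T / dt)):
        r, rd, rdd = t * t, 2.0 * t, 2.0   # reference and its derivatives
        e = r - y
        ie += e * dt
        # accel/vel feedforward + PI on the error + rate feedback on y'
        u = Kaff * rdd + Kvff * rd + Kp * e + Ki * ie - Kd * v
        v += dt * (-K2 * v + K1 * u)       # Euler step of the plant
        y += dt * v
        t += dt
    return e

# Feedforward gains from the derivation: numerator cancels, error -> 0.
e_ff = track(Kaff=1.0 / 2.0, Kvff=(3.0 + 2.0 * 1.0) / 2.0)
# Feedback only: the type-2 loop leaves a constant error for a parabola.
e_fb = track(Kaff=0.0, Kvff=0.0)
```

With the derived gains the tracking error stays near zero, while the feedback-only loop settles toward the constant error $2(K_2 + K_1 K_d)/(K_1 K_i)$ predicted by the final value theorem (1.0 for these illustrative values), which is the behavior the feedforward terms exist to remove.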
Confirmation of a Generalized Feedforward-Feedback System Process Configuration