Black Belt Control Tollgate Briefing
UNCLASSIFIED / FOUO
Project Name | DEPMS Project Number
Name of Belt
Organization | Date: Jan 20, 2015
Tollgate Requirements - Control

PROJECT DELIVERABLES | REQUIREMENT | NGB COMMENTS
Updated Financial / Operational Benefits | Mandatory |
Standardized Process / SOPs / Training Plans | Mandatory | Varies by project
Process Owner Accountability | Mandatory | Includes transition plan
Achievement of Results / New Baseline | Mandatory | % goal achievement
Process Control Plans | Mandatory | Dashboard
Replication Opportunities | Recommended | Leverage to other areas
Visual Process Controls | Recommended | Varies by project
Mistake Proofing Tools | Recommended | Varies by project
Storyboard / A3 | Mandatory | 1-page project summary
Barriers / Issues / Risks | Mandatory |
Quick Wins | Recommended |
Lessons Learned | Optional |
NG CPI BLACK BELT TOLLGATE REQUIREMENTS
NOTE: THIS IS A TEMPLATE FOR ALL NG CPI BELTS
NG has developed this template as a basic format with standard deliverables to
help guide NG CPI belts through the NG tollgate requirements for certification.
It is recognized that each project is unique, with unique deliverables and
unique flows. Therefore, this format does not have to be followed to the
letter for your project.
(DELETE THIS SLIDE FOR YOUR PROJECT)
Control Tollgate Templates
Project Charter
Problem Statement:
Business Case:
Goal Statement:
Unit:
Defect:
Customer Specifications:
Process Start:
Process Stop:
Scope:
(Take-away message goes here: impact of the problem)

Project Timeline
Phase    Start     Stop      Status
Define   mm/dd/yy  mm/dd/yy
Measure  mm/dd/yy  mm/dd/yy
Analyze  mm/dd/yy  mm/dd/yy
Improve  mm/dd/yy  mm/dd/yy
Control  mm/dd/yy  mm/dd/yy

Team Members
Name | Role | Affiliation | DACI
Black Belt Driver
Master Black Belt Driver
Project Sponsor Approver
Process Owner Approver
Required Deliverable
Define Charter and Timeline
Measure Overview
VOC / VOB
Unit (d) or Mean (c)
Defect (d) or St. Dev. (c)
DPMO (d)
PCE: (Cycle Time Only)
PLT: (Cycle Time Only)
Sigma Quality Level
MSA Results: show the percentage result of the GR&R or other measurement systems analysis carried out in the project
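The DPMO and Sigma Quality Level figures called for above follow standard arithmetic; here is a hedged sketch using assumed counts (171 defects over 1,000 units with 5 opportunities each), not data from any actual project:

```python
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit):
    """Defects Per Million Opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_quality_level(dpmo_value, shift=1.5):
    """Long-term Sigma Quality Level using the conventional 1.5-sigma shift."""
    process_yield = 1 - dpmo_value / 1_000_000
    return NormalDist().inv_cdf(process_yield) + shift

# Assumed example counts (not from this project):
d = dpmo(defects=171, units=1000, opportunities_per_unit=5)  # 34200.0
print(round(d), round(sigma_quality_level(d), 2))
```

The 1.5-sigma shift is the usual Six Sigma convention for reporting long-term sigma levels; drop the `shift` argument to report short-term capability instead.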
Baseline Statistics
Tools Used
Process Capability
Baseline “As Is” Performance
Detailed process mapping
Measurement Systems Analysis
Value Stream Mapping
Data Collection Planning
Basic Statistics
Process Capability
Histograms
Time Series Plot
Probability Plot
Pareto Analysis
Operational Definitions
5s
Generic Pull
Control Charts
[Minitab graphical summary for Delivery Time]
Anderson-Darling Normality Test: A-Squared = 1.95, P-Value < 0.005
N = 266; Mean = 29.128; StDev = 2.677; Variance = 7.169; Skewness = 0.201; Kurtosis = -0.472
Minimum = 24.000; 1st Quartile = 27.000; Median = 29.000; 3rd Quartile = 31.000; Maximum = 35.000
95% CI for Mean: (28.805, 29.451); for Median: (29.000, 29.000); for StDev: (2.468, 2.927)
[Minitab Process Capability of Delivery Time]
Process Data: LSL = 10, Target = 20, USL = 30; Sample Mean = 29.1203, Sample N = 266
StDev (Within) = 2.87033; StDev (Overall) = 2.69154
Potential (Within) Capability: Cp = 1.16, CPL = 2.22, CPU = 0.10, Cpk = 0.10, CCpk = 1.16
Overall Capability: Pp = 1.24, PPL = 2.37, PPU = 0.11, Ppk = 0.11, Cpm = 0.35
Observed Performance: PPM < LSL = 0.00; PPM > USL = 281,954.89; PPM Total = 281,954.89
Expected Within Performance: PPM Total = 379,619.67; Expected Overall Performance: PPM Total = 371,895.18
- Example -
Required Deliverable
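As a quick cross-check, the within-subgroup capability indices in the example can be recomputed from the spec limits, sample mean, and StDev(Within) shown; a minimal sketch:

```python
# Recompute the within-subgroup capability indices from the example
# (LSL=10, USL=30, mean=29.1203, StDev(Within)=2.87033).
def capability(lsl, usl, mean, sigma):
    cp = (usl - lsl) / (6 * sigma)    # potential capability (spread only)
    cpu = (usl - mean) / (3 * sigma)  # upper one-sided index
    cpl = (mean - lsl) / (3 * sigma)  # lower one-sided index
    cpk = min(cpu, cpl)               # capability accounting for centering
    return cp, cpl, cpu, cpk

cp, cpl, cpu, cpk = capability(10, 30, 29.1203, 2.87033)
print(round(cp, 2), round(cpl, 2), round(cpu, 2), round(cpk, 2))
```

The low Cpk relative to Cp signals the process is off-center (mean near the USL), which matches the high PPM > USL figure in the example.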
Analyze Overview
Cause & Effect (XY) Matrix

[Matrix layout: columns 1-15 hold customer Requirements, each with a Rating of Importance to Customer and Lower Spec / Target / Upper Spec rows; rows 1-20 hold Process Steps and Process Inputs; each cell rates the input-to-requirement relationship; a Total row and column sum the scores (all zeros in the blank template).]

This table provides the initial input to the FMEA. When an output variable (requirement) is not correct, that represents a potential "Effect". When an input variable is not correct, that represents a "Failure Mode".
1. List the Key Process Output Variables.
2. Rate each variable on a 1-to-10 scale for importance to the customer.
3. List the Key Process Input Variables.
4. Rate each variable's relationship to each output variable on a 1-to-10 scale.
5. Select the top input variables to start the FMEA process; determine how each selected input variable can "go wrong" and place that in the Failure Mode column of the FMEA.
Fishbone Diagram
Root Cause/Effect
Cause and Effect Matrix
Critical X/Root Causes Analysis
[Fishbone diagram thumbnail. (Y) Effect: PLT = 5 days (too long). Branch categories: People/Manpower, Facilities & Equipment, Materials, Methods, Measurements, Mother Nature. Causes include: lack of seats; lack of funds; delays in elevating impasse issues (type of space); no standardization of seats; getting seats takes time; lack of controls; multiple paths; inequality in seats; lack of database; collocation; unplanned programs; senior leadership; mold/HVAC crashes; competency vs. PMA; CAO/IPT; too long (time); lack of knowledge; "dedicated" to task; approvals; new codes; old buildings; wrong location / not suited for current mission; no suitable space to assign; time available to wait; vague reqmts; funding decision (competing for same space).]
Root cause -> Effect (repeated for each critical X)

[Pareto Chart thumbnail: defect counts by region - South 100 (58.5%), North 50 (29.2%), East 15 (8.8%), Others 6 (3.5%); cumulative 58.5 / 87.7 / 96.5 / 100.0]
- Example -
Required Deliverable
Improve Overview
FMEA
New Process Measurements
Solution Matrix
Implementation Plan
New Process PLT: (If applicable)
New Process SQL:
New DPMO:
FMEA columns and guiding questions:
- Process Step / Input: What is the process step and input under investigation?
- Potential Failure Mode (X): In what ways does the Key Input go wrong?
- Potential Failure Effects (Y): What is the impact on the Key Output Variables (Customer Requirements)? (SEV = Severity)
- Potential Root Causes: What causes the Key Input to go wrong? (OCC = Occurrence)
- Current Controls: What are the existing controls and procedures (inspection and test) that prevent either the cause or the Failure Mode? (DET = Detection; RPN = SEV x OCC x DET)
- Actions Recommended / Resp.: What are the actions for reducing the occurrence of the cause, or improving detection?
- Actions Taken: What are the completed actions taken, with the recalculated RPN?

Example rows (SEV | cause | OCC | controls | DET | RPN, then action and recalculated SEV/OCC/DET/RPN):
Updating Tollgates | Ineffective templates | Ineffective reviews | 5 | Discrepancies: POI vs. templates | 4 | None | 5 | 100 | Adjust templates to match POI (PMO) | 5/2/2 = 20
Updating Tollgates | Ineffective templates | Users and leaders don't buy in to LSS | 5 | Slide purposes not clear | 4 | None | 4 | 80 | Adjust slide titles and notes (PMO) | 5/1/2 = 10
Updating Tollgates | Ineffective templates | Users and leaders don't buy in to LSS | 5 | Redundant and NVA slides | 3 | None | 4 | 60 | Eliminate NVA slides (PMO) | 5/1/2 = 10
Updating Tollgates | Ineffective templates | Users and leaders don't buy in to LSS | 5 | Incomplete SOP or "Help" within PS | 3 | None | 5 | 75 | Develop "read me" slides (PMO) | 5/1/2 = 10
Updating Tollgates | Too many steps to build/update | Inefficient updating | 3 | NVA steps | 5 | None | 4 | 60 | Link templates to PS Phase (PMO) | 3/3/2 = 18
Updating Tollgates | Too many steps to build/update | Users get frustrated and delay projects | 3 | Too many choices between templates | 4 | None | 4 | 48 | Eliminate NVA; group in folders (PMO) | 3/2/2 = 12
Updating Tollgates | Too many steps to build/update | Users get frustrated and delay projects | 3 | Inconsistent file names and locations | 3 | None | 4 | 36 | Simple names; group in folders (PMO) | 3/1/2 = 6
LSS Tool Access | Not all LSS tools & refs in PS | User cannot find tools & references | 4 | Not all tools available in PS | 5 | None | 4 | 80 | Revise list of tools and joggers (PMO) | 4/2/2 = 16
LSS Tool Access | Not all LSS tools & refs in PS | Project completion is delayed | 4 | Poor explanation in some references | 3 | None | 3 | 36 | Develop direct-access PDF file (PMO) | 4/2/2 = 16
LSS Tool Access | Too many steps to retrieve tools | Inefficient retrieval of LSS tools/refs | 2 | Multiple means for accessing tools | 3 | None | 5 | 30 | "Read me" file; one folder (PMO) | 2/2/2 = 8
LSS Tool Access | Too many steps to retrieve tools | Inefficient retrieval of LSS tools/refs | 2 | NVA steps | 3 | None | 5 | 30 | Eliminate NVA steps (PMO) | 2/2/2 = 8
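The FMEA scoring above follows the standard Risk Priority Number arithmetic; a minimal sketch, with values mirroring the first example row:

```python
def rpn(severity, occurrence, detection):
    # Risk Priority Number: product of the three 1-10 ratings
    return severity * occurrence * detection

# Values mirror the first FMEA example row, before and after the action
before = rpn(severity=5, occurrence=4, detection=5)
after = rpn(severity=5, occurrence=2, detection=2)
print(before, after)  # 100 20
```

Severity usually stays fixed after an action; the RPN drop comes from lowering occurrence (better prevention) or detection (better controls).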
Solution Selection Matrix (thumbnail)
Root Causes (Xs): Time to Issue Invoice | Complete | Accuracy | Presentation | Per Commercial Terms | Level of Effort
Root Cause Significance Rating: 10 | 10 | 10 | 10 | 10 | 10

Potential Improvements (impact ratings per root cause -> Overall Impact Rating, Risk Rating):
Offshore Costs: 7, 1, 5, 8, 1, 10 -> 320, risk 8
Commercial Terms: 8, 4, 2, 5, 10, 7 -> 360, risk 7
Quantity of Source Data: 8, 5, 7, 7, 1, 10 -> 380, risk 6
Reconciliation: 10, 10, 10, 1, 10, 10 -> 510, risk 6
Quality of Source Data: 7, 7, 7, 7, 7, 7 -> 420, risk 7
Training: 8, 7, 10, 5, 8, 10 -> 480, risk 6
Client (e.g., RCTI): 6, 2, 2, 10, 8, 8 -> 360, risk 5
Job Setup: 10, 4, 10, 6, 10, 10 -> 500, risk 5
Payroll Close Date: 8, 10, 1, 1, 1, 8 -> 290, risk 8
Pre-Billing: 10, 8, 6, 8, 2, 10 -> 440, risk 5
Client Reporting Requirements: 7, 9, 4, 10, 10, 10 -> 500, risk 4
Delivery Method: 9, 1, 1, 5, 5, 7 -> 280, risk 9
Job Completion: 7, 10, 1, 7, 5, 10 -> 400, risk 3
Job Manager Requirements: 9, 7, 5, 4, 8, 9 -> 420, risk 2
- Example -
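The matrix scoring is a significance-weighted sum of each improvement's impact ratings; a hedged sketch, checked against the first example row:

```python
def overall_impact(impact_ratings, significance_ratings):
    # Weighted sum: each 1-10 impact rating against a root cause, weighted
    # by that root cause's significance rating
    return sum(i * s for i, s in zip(impact_ratings, significance_ratings))

significance = [10] * 6  # all six root causes rated 10 in the example
print(overall_impact([7, 1, 5, 8, 1, 10], significance))  # Offshore Costs: 320
```

Risk Rating is scored separately and is not derived from this sum; improvements are typically chosen by balancing high impact against low risk/effort.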
PROJECT NAME DIVISION GREEN BELT DATE
PROJECT SPONSOR SERVICE AREA / FUNCTION / SERVICE BLACK BELT
Implementation Plan columns:
Implementation Number | Solution | Control Action Number | Improvement Action | Responsible Individual / Solution Owner | Issues/Barriers | Risk Mitigation | Target/Actual Complete Date | Current Status/Comments
1
2
3
4
5
6
7
8
Required Deliverable
Process Control/Reaction Plan
PROJECT NAME | DIVISION | DIVISION CHIEF | DATE: 12/18/2007
BRANCH CHIEF | PROCESS OWNERS | RESPONSIBLE CONTROL MANAGER: LTC Garrett Heath

Control Action Number | Control Action | Responsible Individual
LD09014 | Improve Tollgate Templates | LSS PMO

Applicable Control Charts and Metrics:
# | Control Action | Responsible | Freq. | Process Step | Target Value | Upper Control Limit | Lower Control Limit | Current | Reaction Plan
1 | Process Cycle Time | Green Belt | Quarterly | Updating Tollgates | 4 Min | 5 Min | N/A | 4 Min, 40 Sec | If process cycle time for updating templates rises above 8 minutes, PMO will re-examine the process to identify constraints.
2 | Conduct VOC Survey to address issues in FMEA (Improve Phase) | Green Belt | Quarterly | Tollgate Templates | 100% Satisfaction | 100% | 85% | N/A | PMO will adjust tollgate templates to address issues raised by VOC.
3 | Conduct VOC Survey to address issues in FMEA (Improve Phase) | Green Belt | Quarterly | Access to Tools and References | 100% Satisfaction | 100% | 85% | N/A | PMO will adjust tools and references access to address issues raised by VOC.
- Example -
Required Deliverable
Standard Operating Procedures (SOPs) and Training Plans
SOPs Requiring Revision | Responsible | Status
FY09 SOP Update (1 Oct 08 - 28 Feb 09) | Lead: Branch Chief; AO: Team Lead | Completed
Next SOP Update (1 Sep 09 - 30 Sep 10) | Lead: TSC Chief; AO: Team Lead | Complete before FY10 NSPS Cycle
Next SOP Update (1 Sep 09 - 30 Sep 10) | Lead: TSC Chief; AO: Team Lead | Complete before FY11 NSPS Cycle

Required Training | Responsible | Status
Office Professional Development | Each team member is encouraged to provide OPD | OPD training schedule to be updated and distributed to team members
Individual Professional Development Plan | Branch Chief and team members | Susp Date: Branch Chief and team members discuss and finalize by 1 May 09
- Example -
Required Deliverable
Visual Process Control Tools
• SOP is posted on the computer desktop for easy reference and access.
• Mistake-proofing system in place: a copy of the DD214 with checklist is
posted on the counselor’s desk.
Sample DD 214
w/ checklist
- Example -
Recommended
Mistake Proofing (Poka-Yoke) Tools
Solution: Make use of existing bar codes on shipping paperwork to mistake-proof the receiving function.
• Validate with suppliers (shipping company) that all shipping containers will have a barcode label ($0)
• Purchase RF (wireless) bar coding equipment and software ($100/gun x 20 guns)
• Train all longshoremen on use and care of equipment ($50/hr x 20 hrs)
• Install error metrics for error tracking ($100)
- Example -
Recommended
Updated Benefits Estimate
Additional Benefits/Comments:
The solution was approved 15 May 09 and began full implementation on 22 May 09. It is presently being executed by the Process owner with no major issues or degradation of service to our customers.
Morale has improved significantly with the new layout of the office, as indicated by the recent office survey.
- Example -
Required Deliverable
Metric Baseline Objective Achieved
Cycle Time 30 days 7 days 5 days
Cost Savings $1.054M $850k $830k
Defect Rate
Future Projects Identified
Based on the outcome of this project, the following "possible areas of concern" have been identified and may result in future LSS projects.
PQDR Process
QAR Surveillance Process
QAR Training Process
MRB Corrective Action Process
NDT Process
- Example -
Optional Deliverable
Project Replication
Replication Project/Best Practice Candidate: This completed project is being submitted as a candidate for replication. The project has achieved its benefits and contains the level of documentation needed for a practitioner in another DoD organization to replicate part or all of it with minimal effort relative to the benefit, providing an additional return on investment to DoD.
Required documentation & benefits before submitting as a replication candidate:
Project Charter
High level Process Map or SIPOC
FMEA or Fishbone Diagram
Implementation Plan
Control Plan and Training Plan
Demonstrated Quality, Cost, or Speed improvements (i.e., SQL, ROI/Savings, PLT, PCE….)
This project may have replication potential within these DoD organizations:
Organization 1
Organization 2
…
Recommended
Project Barriers/Issues/Risks
Barriers
Issues
Risks
Required Deliverable
Completed Project "Storyboard"

Define - Project Charter:
BUS CASE: Be #2 Fin Service Provider
GOAL: Reduce Loan/Lease CT from 9.2 to 8.0 days by July 1
FIN IMPACT: $2.7M per year

Measure: Sigma Performance Level of 1.3; 1.2-day CCR gap

Analyze: Officer Work & Turnover, Waiting, and Automation affect CT; Job Aids affect variation in CT. "Officer performs both" and "Officers changed" were eliminated as contributors to high cycle time.

Improve - Pilot Plan: Loan or Lease; Screen Entry; Color Printouts; Rewards & Recog

Control
- Example -
Required Deliverable
Company: Department: Date: Prepared by:
1. Define the problem situation
2. Problem Analysis
3. Action plans to correct problems
4. Results of Activity
5. Future Steps
8-Step A3 Project Summary Report
NG CPI Tollgate Tool

Define: Project Charter (Problem Stmt, Defect Definition, Goal Statement, Project Scope, Business Impact, Strat Alignment); Measurable Y; Voice of Customer; Customer Specs; Voice of Business; High Level Process Map (SIPOC); Sponsor & Team; Replication Check; Project Timeline; Communication Plan / Stakeholder Analysis; Est Financial Benefits

Measure: Detailed "As Is" Process Map; Value Stream Map; Key Process Metrics; Data Collection Plan; Measurement Systems Analysis; Data Collected; Baseline Data Analysis (Descriptive Stats, Graphs, Pareto Charts); Process Capability; Control Charts

Analyze: Potential Xs (Brainstorming, 5 Whys, Fishbone, Affinity Diagram); Critical Xs (Cause & Effect Matrix, Hypothesis Testing, Regression, Time Studies, Theory of Constraints); FMEA (Risk Analysis, Risk Mitigation Plan); Process Capability

Improve: "To Be" Process Map; Solution Generation / Prioritization; Improvement Strategy (Improvement Model, Implementation Plan, Pilot, "X" Improvement Target); FMEA (Risk Analysis, Risk Mitigation Plan); Control Charts (as needed)

Control: Process Control Plan; Process Owner Accountability; Updated Financial Benefits; Replication Opportunities; Mistake Proofing; Documentation of revised policies, SOPs, procedures, and training; Visual Process Control Tools (Optional)

Sign-offs at each phase ("I accept the Define / Measure / Analyze / Improve / Control Tollgate"): Sponsor, MBB, Process Owner, Finance Owner.

8-STEP PROCESS (mapped across Define-Measure-Analyze-Improve-Control):
1. Validate Problem
2. Identify Gaps
3. Set Targets
4. Find Root Cause
5. Develop C-Ms
6. See C-Ms Through
7. Confirm Results
8. Standardize
Appendix
Team Members
Name | Role | Affiliation | DACI
Black Belt Driver
Master Black Belt Driver
Project Sponsor Approver
Process Owner Approver
Contributor
Contributor
Contributor
Contributor
Inform
Inform
Inform
Inform
Required Deliverable
Cross Functional Team
I confirm that:
DEPMS (DoD Enterprise Performance Management System) has been searched for similar projects: Yes / No
Replication: List relevant initiatives / potential replication projects found (if any):
• Project 1: (DEPMS # or other tracking tool project number)
• Project 2:
Collaboration: Identify organizations that can/should be considered for working this project collaboratively:
• Organization 1:
• Organization 2:
Required Deliverable
Replication Check
Strategic Alignment
The Define Tollgate requires a linkage to organizational strategy.
Include the organizational metric(s) that your project will help improve
Refer to your organization’s Strategic Plan and/or other referenced documents
Required Deliverable
Business Impact
Insert as much information as possible about the potential operational and/or financial benefits
Include any assumptions upon which these estimates are based
Example: Operational benefits – This project is expected to reduce PLT by 35%, improve SQL from 1.2 to 3.0, save 20 man hours per shift
Example: Financial benefits – This project is expected to save $xx in FYxx
Required Deliverable
Measurable Y:
Required Deliverable
Suppliers Inputs Process Outputs Customers
High-Level Process Map (SIPOC)
Voice of Customer / Voice of Business

Voice of the Customer / Key Customer Issue(s) -> Critical Customer Requirement
What does the customer want from us? We need to identify the issue(s) that prevent us from satisfying our customers, then summarize the key issues and translate them into specific and measurable requirements.

Voice of the Business / Key Process Issue(s) -> Critical Business Requirement
What does the business want/need from us? We need to identify the issue(s) that prevent us from meeting strategic goals/missions, then summarize the key issues and translate them into specific and measurable requirements.

Required Deliverable
Stakeholder Analysis columns:
Stakeholder Name/Group | Project Impact on Stakeholder (H, M, L) | Stakeholder Level of Influence on Success of Project (H, M, L) | Stakeholder's Current Attitude Toward Project (+, 0, -) | Explanation of Current Stakeholder Attitude (list) | Stakeholder Score (H=3, M=2, L=1; + = 3, 0 = 1, - = -3) | Action Plan for Stakeholder
Recommended Deliverable
Stakeholder Analysis
Audience | Media | Purpose | Topics of Discussion / Key Messages | Owner | Frequency | Notes/Status
Communication Plan
Required Deliverable
- Example -
Detailed “As Is” Process Map
Required Deliverable - VSM or Process Map or Both
[Value Stream Map - Example]
CUSTOMERS (phone call orders): Large Business (6 customers), Home (5 customers), Small Business (3 customers). Call handling: P/T = 3 min, lost calls = 10%, volume = 1200.
Order Mgmt steps (capture Customer Info, Product Need, Pricing, Shipping Info): P/T = 2 min, error rate 2%, volume 800; P/T = 6 min, error rate 0%, volume 800; P/T = 6 min, error rate 2%, volume 800; P/T = 2 min, error rate 1%, volume 800. Screen for Acct Mgr (Order Mgmt Supervisor).
DIST - Pick, Pack & Ship: P/T = 120 min, error rate 1%, volume 1200; queue of 20 orders (240 min).
SUPPLIERS: manual update; weekly update; phone call.
Timeline: 2 + 6 + 6 + 2 + 120 min processing, with 5 min and 240 min waits.
Customer call time = 24 min; Service lead time = 384 min.
Trigger:
Completion Criteria:
Cycle Time:
Takt Time:
Number of People:
Number of Approvals:
Items in Inbox:
% Rework:
# of Iterations (cycles):
# of Databases:
Top 3 Rework Issues:
1.
2.
3.
- Example -
Required Deliverable - VSM or Process Map or Both
Value Stream Map
Key Input, Process, and Output Metrics
Suppliers Inputs Process Outputs Customer
Start -> Step 1 -> Step 2 -> Step 3 -> Step 4 -> Step 5
Input Metrics Process Metrics Output Metrics
Quality
Speed
Cost
VOC/
VOB
Required Deliverable
Operational Definitions
Define each of the Key Input, Process, and Output Metrics from your SIPOC on which you will collect data (via the Data Collection Plan), as well as any other terms that need clarification for the data collectors and the rest of the team.
Examples:
Award Process PLT: The time from when a Director submits the Award recommendation to the time when the employee is presented the Award in a ceremony.
Number of Claims Processed: The number of Claims processed per weekday (M-F).
Total Hours Worked: The total number of hours worked in the facility including weekends and holidays.
Number of Personnel: The total number of military and civilian personnel working (not including contractors).
Include other unique terms that apply to your project that require clear operational definitions for those collecting the data and for those interpreting the data.
Required
Data Collection Plan

Performance Measure | Operational Definition | Data Source and Location | How Will Data Be Collected | Who Will Collect Data | When Will Data Be Collected | Sample Size | Stratification Factors | How will data be used?
Ability to update projects and build tollgate reviews | X1 - Steps to update projects | In DEPMS | By counting steps | Name | ASAP | 1 | None | To find VA, BNVA, NVA
Ability to update projects and build tollgate reviews | X2 - Tollgate template slides that match POI | In DEPMS | By determining the % of activity steps identified in "Introduction to _____" modules in the POI that are adequately addressed in templates | Name | ASAP | 40 | None | To determine consistency with POI
Easy access to LSS tools and references | X3 - Availability of LSS tools and references | In DEPMS | By determining the percentage of tools, with their references, listed on DMAIC Road Map slides that can be found in PS | Name | ASAP | 63 | None | To determine availability of tools and references
Easy access to LSS tools and references | X4 - Steps required to find tools and references | In DEPMS | By counting the # of steps required to find the tools and their references | Name | ASAP | 37 | None | To find VA, BNVA, NVA
Required Deliverable
- Example -
Measurement Systems Analysis
The measurement system used to collect data has been calibrated and is considered to have no potential for significant errors. The data collection tool is reliable, has good resolution, shows no signs of bias, and is stable.

Type of Measurement Error | Description | Considerations for this Project
Discrimination (resolution) | The ability of the measurement system to divide measurements into "data categories" | Work hours can be measured to <0.25 hours; radar usage measured to +/- 2 minutes.
Bias | The difference between an observed average measurement result and a reference value | No bias: work hours and radar start-stop times consistent through the population.
Stability | The change in bias over time | No bias of work hours and radar usage data.
Repeatability | The extent to which variability is consistent | Not an issue: labor and radar usage data are historical and considered accurate enough for insight and analysis.
Reproducibility | Different appraisers produce consistent results | Remarks in usage data deemed not reproducible, and therefore were not considered in determining which radars were used in each op.
Variation | The difference between parts | N/A to this process.

Required Deliverable
- Example -
Measurement Systems Analysis
The measurement system is acceptable, with Total Gage R&R % Contribution < 10%.

[Minitab Gage R&R (ANOVA) for Response - panels: Components of Variation (% Contribution, % Study Var for Gage R&R / Repeat / Reprod / Part-to-Part); R Chart by Operator (R-bar = 0.0417, UCL = 0.1073, LCL = 0); Xbar Chart by Operator (X-double-bar = 9.7996, UCL = 9.8422, LCL = 9.7569); Response by Part; Response by Operator; Operator * Part Interaction]

Source | VarComp | %Contribution (of VarComp)
Total Gage R&R | 0.0015896 | 3.70
  Repeatability | 0.0005567 | 1.29
  Reproducibility | 0.0010330 | 2.40
    Operator | 0.0003418 | 0.79
    Operator*Part | 0.0006912 | 1.61
Part-To-Part | 0.0414247 | 96.30
Total Variation | 0.0430143 | 100.00

Source | StdDev (SD) | Study Var (6 * SD) | %Study Var (%SV)
Total Gage R&R | 0.039870 | 0.23922 | 19.22
  Repeatability | 0.023594 | 0.14156 | 11.38
  Reproducibility | 0.032140 | 0.19284 | 15.50
    Operator | 0.018488 | 0.11093 | 8.91
    Operator*Part | 0.026290 | 0.15774 | 12.68
Part-To-Part | 0.203531 | 1.22118 | 98.13
Total Variation | 0.207399 | 1.24439 | 100.00

Number of Distinct Categories = 7

- Example -
Optional BB Deliverable
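%Contribution in the table is each variance component divided by total variation; a quick sketch checking the example's headline figure:

```python
# Variance components copied from the example Gage R&R (ANOVA) table
total_gage_rr = 0.0015896
part_to_part = 0.0414247
total_variation = total_gage_rr + part_to_part  # 0.0430143, as in the table

pct_contribution = 100 * total_gage_rr / total_variation
print(round(pct_contribution, 2))  # well under the 10% acceptability threshold
```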
“As Is” Baseline Statistics
The current process has a non-normal distribution with the P-Value < 0.05
Mean = 44 days
Median = 22 days
Std Dev = 61 days
Range = 365 days
Required Deliverable
[Minitab graphical summary for Workdays]
Anderson-Darling Normality Test: A-Squared = 12.65, P-Value < 0.005
N = 118; Mean = 44.814; StDev = 61.251; Variance = 3751.674; Skewness = 2.873; Kurtosis = 9.546
Minimum = 1.000; 1st Quartile = 12.000; Median = 22.000; 3rd Quartile = 52.000; Maximum = 365.000
95% CI for Mean: (33.647, 55.981); for Median: (17.000, 29.123); for StDev: (54.308, 70.246)
- Example -
Process Control Chart
The current baseline delivery time is stable over time with both the Moving Range (3.22 days) and Individual Average (29.13 days) experiencing common cause variation
255 data points were collected with no subgroups, so the I-MR control chart was selected.

[I-MR Chart of Delivery Time: Individuals chart center X = 29.13, UCL = 37.70, LCL = 20.56; Moving Range chart center MR = 3.22, UCL = 10.53, LCL = 0]

Required As Applicable
- Example -
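The control limits in the example follow the standard I-MR constants; a hedged sketch recomputing them from the chart's (rounded) center lines:

```python
# Standard I-MR constants for a moving range of span 2:
# individuals limits = Xbar +/- 2.66 * MRbar; MR upper limit = 3.267 * MRbar
xbar, mrbar = 29.13, 3.22  # rounded center lines from the example chart

ucl_i = xbar + 2.66 * mrbar
lcl_i = xbar - 2.66 * mrbar
ucl_mr = 3.267 * mrbar  # ~10.52 from rounded inputs; the chart's 10.53 uses unrounded data
print(round(ucl_i, 2), round(lcl_i, 2), round(ucl_mr, 2))
```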
118 data points collected
Non-normal distribution
Mean = 44 days
Lower Cust Spec = 0 days
Upper Cust Spec = 15 days
65% of observations outside customer spec
Z Bench = -.31
Required Deliverable
- Example -
Process Capability
[Minitab Process Capability of Workdays - Calculations Based on Lognormal Distribution Model]
Process Data: LSL = 0, Target = *, USL = 15; Sample Mean = 44.8136, Sample N = 118; Location = 3.09501, Scale = 1.26378
Overall Capability: Z.Bench = -0.31, Z.LSL = 3.07, Z.USL = -0.02, Ppk = -0.01
Observed Performance: % < LSL = 0.00, % > USL = 65.25, % Total = 65.25
Expected Overall Performance: % < LSL = 0.00, % > USL = 62.03, % Total = 62.03
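The expected performance under the lognormal model can be cross-checked from the fitted location and scale parameters shown above; a minimal sketch:

```python
from math import log
from statistics import NormalDist

# Lognormal parameters and spec limit from the example output
location, scale, usl = 3.09501, 1.26378, 15.0

z_usl = (log(usl) - location) / scale  # standardized USL on the log scale
pct_above_usl = 100 * (1 - NormalDist().cdf(z_usl))
print(round(z_usl, 2), round(pct_above_usl, 2))
```

This reproduces the example's expected % > USL of roughly 62% and its Z.Bench of about -0.31.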
Process Constraint ID Analysis
Takt Rate Analysis compares the task time of each process (or process step) against the other steps and the customer demand rate to determine whether a time trap is the constraint.

[Bar chart: Value Add Analysis - Current State. Task Time (seconds) for Tasks 1-10, broken into CVA Time, NVA-R Time, and NVA Time; Takt Time = 55 shown as a reference line]

Takt Time = Net Process Time Available / Number of Units to Process
Takt Rate (Customer Demand Rate) = Number of Units to Process / Net Process Time Available
- Example -
BB Optional Deliverable
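The takt formulas above reduce to simple division; a hedged sketch using assumed figures (an 8-hour shift minus two 15-minute breaks, and a demand of 450 units), not numbers from the example chart:

```python
# Assumed illustration figures, not project data:
net_time_available_s = 8 * 60 * 60 - 2 * 15 * 60  # 8-hour shift minus two 15-min breaks
units_to_process = 450

takt_time_s = net_time_available_s / units_to_process  # seconds available per unit
takt_rate = units_to_process / net_time_available_s    # units demanded per second
print(takt_time_s)  # 60.0
```

Any task whose time exceeds the takt time cannot keep pace with customer demand and is a candidate constraint.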
[Pareto Chart: defect counts by region]
Defect:   South   North   East   Others
Count:      100      50     15        6
Percent:   58.5    29.2    8.8      3.5
Cum %:     58.5    87.7   96.5    100.0

Pareto Plot Analysis
The South and North regions contain over 80% of the defects. Our project will focus there, not on the East and Others regions.
- Example -
Optional Deliverable
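The percent and cumulative-percent rows of the Pareto chart follow directly from the counts; a minimal sketch reproducing the example's figures:

```python
counts = {"South": 100, "North": 50, "East": 15, "Others": 6}
total = sum(counts.values())  # 171 defects overall

rows, cum = [], 0.0
for region, count in counts.items():
    pct = 100 * count / total   # share of all defects
    cum += pct                  # running cumulative share
    rows.append((region, round(pct, 1), round(cum, 1)))
for row in rows:
    print(row)
```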
Cause & Effect Diagram (Fishbone)

[(Y) Effect: PLT = 5 days (too long). Branch categories: People/Manpower, Facilities & Equipment, Materials, Methods, Measurements, Mother Nature. Causes noted on the diagram: lack of seats; lack of funds; delays in elevating impasse issues (type of space); no standardization of seats; getting seats takes time; lack of controls; multiple paths; inequality in seats; lack of database; collocation; unplanned programs; senior leadership; mold/HVAC crashes; competency vs. PMA; CAO/IPT; too long (time); lack of knowledge; "dedicated" to task; approvals; new codes; old buildings; wrong location; not suited for current mission; no suitable space to assign; time available to wait; vague reqmts; funding decision (competing for same space); location; senior leader.]
Required Deliverable
- Example -
XY Matrix (Root Cause Analysis)
[Matrix layout: columns 1-15 hold customer Requirements, each with a Rating of Importance to Customer and Lower Spec / Target / Upper Spec rows; rows 1-20 hold Process Steps and Process Inputs; each cell rates the input-to-requirement relationship; a Total row and column sum the scores (all zeros in the blank template).]

This table provides the initial input to the FMEA. When an output variable (requirement) is not correct, that represents a potential "Effect". When an input variable is not correct, that represents a "Failure Mode".
1. List the Key Process Output Variables.
2. Rate each variable on a 1-to-10 scale for importance to the customer.
3. List the Key Process Input Variables.
4. Rate each variable's relationship to each output variable on a 1-to-10 scale.
5. Select the top input variables to start the FMEA process; determine how each selected input variable can "go wrong" and place that in the Failure Mode column of the FMEA.
Required Deliverable
Hypothesis Test Summary
Hypothesis Test (ANOVA, 1- or 2-sample t-test, Chi-Squared, Regression, Test of Equal Variance, etc.) | Factor (x) Tested | p-Value | Observations/Conclusion
Example: ANOVA | Location | 0.030 | Significant factor: the 1-hour driving time from DC to the Baltimore office causes ticket cycle time to generally be longer for the Baltimore site
Example: ANOVA | Part vs. No Part | 0.004 | Significant factor: on average, calls requiring parts have double the cycle time (22 vs. 43 hours)
Example: Chi-Squared | Department | 0.000 | Significant factor: Department 4 has digitized the addition of customer info to the ticket and less human intervention, resulting in fewer errors
Example: Pareto | Region | n/a | The South region accounted for 59% of the defects due to its manual process and distance from the parts warehouse

Describe any other observations about the root cause (x) data.

Optional BB Deliverable
- Example -
[Boxplots of Net Hours (Call Open) by Part/No Part; means indicated by solid circles]

Analysis of Variance for Net Hours
Source    DF     SS    MS     F      P
Part/No    1   7421  7421   8.65  0.004
Error     69  59194   858
Total     70  66615

Level     N   Mean  StDev   (individual 95% CIs for the means shown in the output)
No Part  27  21.99  19.95
Part     44  43.05  33.70
Pooled StDev = 29.29
After further investigation, the team proposed several possible reasons: OEM backorders, lack of technician certifications, the distance from the OEM to the client site, and the need for technicians to make a second visit to the end user to complete the part replacement. The next step is for the team to confirm these suspected root causes.
Boxplot: Part/ No Part Impact on Ticket Cycle Time
Because the p-value (0.004) is at or below 0.05, we can be confident that calls requiring parts have an impact on ticket cycle time.
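A one-way ANOVA like the one above can be reproduced with SciPy. The data below are synthetic values generated to mimic the slide's summary statistics (No Part: n=27, mean ~22; Part: n=44, mean ~43), not the project's raw ticket data.

```python
import numpy as np
from scipy import stats

# Synthetic cycle-time samples shaped like the slide's summary statistics
rng = np.random.default_rng(7)
no_part = rng.normal(loc=22, scale=20, size=27)  # illustrative No Part hours
part = rng.normal(loc=43, scale=34, size=44)     # illustrative Part hours

# One-way ANOVA across the two groups
f_stat, p_value = stats.f_oneway(no_part, part)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Same decision rule as the slide: p <= 0.05 -> significant factor
significant = p_value <= 0.05
```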
One-Way ANOVA: Root Cause Verification
Optional BB Deliverable
- Example -
Linear Regression

[Fitted line plot: Wait Time vs. Deliveries]
Fitted Line: Wait Time = 32.05 + 0.5825 Deliveries
S = 1.11885, R-Sq = 94.1%, R-Sq(adj) = 93.9%

We are 95% confident that 94.1% of the variation in "Wait Time" is explained by the "Qty of Deliveries".
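A fit like the one above can be reproduced with NumPy. The delivery counts below are illustrative, and the wait times are generated directly from the slide's fitted equation (a real analysis would fit noisy observations, giving R-Sq below 100%).

```python
import numpy as np

deliveries = np.array([10, 15, 20, 25, 30, 35], dtype=float)
# Generated from the slide's fitted line; real data would include noise
wait_time = 32.05 + 0.5825 * deliveries

# Least-squares fit of degree-1 polynomial: returns [slope, intercept]
slope, intercept = np.polyfit(deliveries, wait_time, 1)

# R-squared: 1 - (residual sum of squares / total sum of squares)
pred = intercept + slope * deliveries
r_sq = 1 - np.sum((wait_time - pred) ** 2) / np.sum((wait_time - wait_time.mean()) ** 2)
print(f"Wait Time = {intercept:.2f} + {slope:.4f} Deliveries, R-Sq = {r_sq:.1%}")
```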
Optional BB Deliverable
- Example -
Solution Selection Matrix

Root Causes (Xs): Time to Issue Invoice | Complete | Accuracy | Presentation | Per Commercial Terms | Level of Effort
Root Cause Significance Rating: 10 | 10 | 10 | 10 | 10 | 10

Potential Improvements | Impact Ratings | Overall Impact Rating | Risk Rating
Offshore Costs | 7 1 5 8 1 10 | 320 | 8
Commercial Terms | 8 4 2 5 10 7 | 360 | 7
Quantity of Source Data | 8 5 7 7 1 10 | 380 | 6
Reconciliation | 10 10 10 1 10 10 | 510 | 6
Quality of Source Data | 7 7 7 7 7 7 | 420 | 7
Training | 8 7 10 5 8 10 | 480 | 6
Client (e.g., RCTI) | 6 2 2 10 8 8 | 360 | 5
Job Setup | 10 4 10 6 10 10 | 500 | 5
Payroll Close Date | 8 10 1 1 1 8 | 290 | 8
Pre-Billing | 10 8 6 8 2 10 | 440 | 5
Client Reporting Requirements | 7 9 4 10 10 10 | 500 | 4
Delivery Method | 9 1 1 5 5 7 | 280 | 9
Job Completion | 7 10 1 7 5 10 | 400 | 3
Job Manager Requirements | 9 7 5 4 8 9 | 420 | 2

* Solutions ranked > 450 have been selected to be implemented
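Each Overall Impact Rating in the matrix is the solution's criterion ratings summed and weighted by the root-cause significance rating (10 for every criterion on this slide). A minimal sketch using a few rows from the matrix:

```python
SIGNIFICANCE = 10  # Root Cause Significance Rating (10 for every criterion above)

# Criterion ratings for a few solutions, taken from the matrix rows
solutions = {
    "Reconciliation": [10, 10, 10, 1, 10, 10],
    "Training": [8, 7, 10, 5, 8, 10],
    "Job Setup": [10, 4, 10, 6, 10, 10],
    "Offshore Costs": [7, 1, 5, 8, 1, 10],
}

# Overall Impact Rating = significance x sum of criterion ratings
impact = {name: SIGNIFICANCE * sum(r) for name, r in solutions.items()}

# Selection rule from the slide: implement solutions ranked > 450
selected = sorted(name for name, score in impact.items() if score > 450)
print(impact)
print(selected)  # Job Setup, Reconciliation, Training
```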
Required Deliverable
- Example -
Implementation Plan
Required Deliverable
PROJECT NAME | DIVISION | GREEN BELT | DATE
PROJECT SPONSOR | SERVICE AREA / FUNCTION / SERVICE | BLACK BELT

Columns: Implementation Number | Solution | Control Action Number | Improvement Action | Responsible Individual / Solution Owner | Issues/Barriers | Risk Mitigation | Target/Actual Complete Date | Current Status/Comments
Rows: 1-8 (blank in the template)
Pilot Plan

Pilot Test: Hand-Chek / Hot-Chek Interface Test
Description:
• Sample check-in data sets to be entered in Hand-Chek device
• Sample data sets transmitted to Hot-Chek system – all hotel floors, all hotel rooms
• Confirmation data received from Hot-Chek to Hand-Chek device – all hotel floors, all hotel rooms
Success Criteria:
• Data set entry accuracy < 3.4 DPMO
• Data set entry time < 6 seconds
• Data set transmission/reception accuracy < 3.4 DPMO
Test Team: BR, KM, plus Hot-Chek tech rep
Schedule: Start 3/1, Complete 3/3

Pilot Test: Check-in Verification Test
Description:
• Sample guest data entered in Hot-Chek system (variety of room requirements)
• "Guests" (hotel employees) walked through check-in process (90% pre-registered, 10% non-pre-registered)
• Volume stress test – simulated arrival of 20 guests in a "tour bus"
• Process measurements recorded via observer (see Design Scorecard); "guest" observations recorded
Success Criteria:
• Data set entry accuracy < 3.4 DPMO
• Data set entry time < 6 seconds
• Data set transmission/reception accuracy < 3.4 DPMO
• Design Scorecard CCRs
Test Team: BR, KM, + 6 check-in staff
Schedule: Start 3/6, Complete 3/7

Pilot Test: Check-in Validation Test
Description:
• 25 guests invited to experience the new hotel check-in process
• Guests "pre-registered" with their room requirements in Hot-Chek system
• Guests walked through check-in process (90% pre-registered, 10% non-pre-registered)
• Process measurements recorded via observer (see Design Scorecard)
• Guests debriefed following the experience
Success Criteria:
• Data set entry accuracy < 3.4 DPMO
• Data set entry time < 6 seconds
• Data set transmission/reception accuracy < 3.4 DPMO
• Design Scorecard CCRs
Test Team: BR, KM, + 6 check-in staff
Schedule: Start 3/10, Complete 3/10
- Example -
Recommended Deliverable
Pilot Results

Pilot Observations: 1) Data entry sequence was confusing
GAP Analysis/Root Causes: 1) SOP wasn't clear; need to lay it out better before implementation
2) Order of questions needs to be reevaluated
Follow-up Actions: 1) Revise SOP on the order and flow of the questions asked, and run the pilot again

Data Collected:
Measure | Plan Target | Pilot | Comments
PLT | 1 minute | 0.5 min. / 0.05 min. | Improved PLT
Data Accuracy | < 3.4 DPMO | 100 DPMO | Decreased DPMO
PCE | < 15 % | < 10 % | Improved PCE
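DPMO figures like those above are defects per million opportunities. A short helper; the 10,000-data-set volume below is a hypothetical illustration, not the pilot's actual sample size:

```python
def dpmo(defects, units, opportunities_per_unit=1):
    """Defects Per Million Opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# Hypothetical: 1 entry error in 10,000 data sets -> 100 DPMO
print(dpmo(1, 10_000))

# The 3.4 DPMO target corresponds to 34 defects in 10 million opportunities
print(dpmo(34, 10_000_000))
```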
- Example -
Recommended Deliverable
Failure Mode Effects Analysis (FMEA)

Columns: Process Step / Input | Potential Failure Mode (X) | Potential Failure Effects (Y) | SEV | Potential Root Causes | OCC | Current Controls | DET | RPN | Actions Recommended | Resp. | SEV | OCC | DET | RPN (after actions taken)

Guiding questions:
• Process Step / Input: What is the process step and input under investigation?
• Potential Failure Mode: In what ways does the Key Input go wrong?
• Potential Failure Effects: What is the impact on the Key Output Variables (Customer Requirements)?
• Potential Root Causes: What causes the Key Input to go wrong?
• Current Controls: What existing controls and procedures (inspection and test) prevent either the cause or the Failure Mode?
• Actions Recommended: What actions will reduce the occurrence of the cause or improve detection?
• Actions Taken: What completed actions were taken, with the recalculated RPN?

Updating Tollgates | Ineffective templates | Ineffective reviews | 5 | Discrepancies: POI vs. Templates | 4 | None | 5 | 100 | Adjust templates to match POI | PMO | 5 | 2 | 2 | 20
Updating Tollgates | Ineffective templates | Users and leaders don't buy in to LSS | 5 | Slide purposes not clear | 4 | None | 4 | 80 | Adjust slide titles and notes | PMO | 5 | 1 | 2 | 10
Updating Tollgates | Ineffective templates | Users and leaders don't buy in to LSS | 5 | Redundant and NVA slides | 3 | None | 4 | 60 | Eliminate NVA slides | PMO | 5 | 1 | 2 | 10
Updating Tollgates | Ineffective templates | Users and leaders don't buy in to LSS | 5 | Incomplete SOP or "Help" within PS | 3 | None | 5 | 75 | Develop "read me" slides | PMO | 5 | 1 | 2 | 10
Updating Tollgates | Too many steps to build/update | Inefficient updating | 3 | NVA steps | 5 | None | 4 | 60 | Link templates to PS Phase | PMO | 3 | 3 | 2 | 18
Updating Tollgates | Too many steps to build/update | Users get frustrated and delay projects | 3 | Too many choices between templates | 4 | None | 4 | 48 | Eliminate NVA; group in folders | PMO | 3 | 2 | 2 | 12
Updating Tollgates | Too many steps to build/update | Users get frustrated and delay projects | 3 | Inconsistent file names and locations | 3 | None | 4 | 36 | Simple names; group in folders | PMO | 3 | 1 | 2 | 6
LSS Tool Access | Not all LSS tools & refs in PS | User cannot find tools & references | 4 | Not all tools available in PS | 5 | None | 4 | 80 | Revise list of tools and joggers | PMO | 4 | 2 | 2 | 16
LSS Tool Access | Not all LSS tools & refs in PS | Project completion is delayed | 4 | Poor explanation in some references | 3 | None | 3 | 36 | Develop direct access pdf file | PMO | 4 | 2 | 2 | 16
LSS Tool Access | Too many steps to retrieve tools | Inefficient retrieval of LSS tools/refs | 2 | Multiple means for accessing tools | 3 | None | 5 | 30 | "Read me" file; one folder | PMO | 2 | 2 | 2 | 8
LSS Tool Access | Too many steps to retrieve tools | Inefficient retrieval of LSS tools/refs | 2 | NVA steps | 3 | None | 5 | 30 | Eliminate NVA steps | PMO | 2 | 2 | 2 | 8
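Every RPN in the table is Severity x Occurrence x Detection, recalculated after the recommended actions. For example, the first row:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number = SEV x OCC x DET (each rated 1-10)."""
    return severity * occurrence * detection

# First FMEA row: "Discrepancies: POI vs Templates"
before = rpn(5, 4, 5)  # before adjusting templates to match the POI
after = rpn(5, 2, 2)   # after the PMO's corrective action
print(before, after)   # 100 20
```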
Required Deliverable
- Example -
“New” Process Capability“As Is” Process Capability
Required Deliverable
Process Capability Analysis for Control ("New" process):
Process Data: USL = 220.000, Target = *, LSL = *, Mean = 184.967, Sample N = 30, StDev (ST) = 20.4206, StDev (LT) = 16.3662
Potential (ST) Capability: Cp = *, CPU = 0.57, CPL = *, Cpk = 0.57, Cpm = *
Overall (LT) Capability: Pp = *, PPU = 0.71, PPL = *, Ppk = 0.71
Observed Performance: PPM > USL = 0.00, PPM < LSL = *, PPM Total = 0.00
Expected ST Performance: PPM > USL = 43119.06, PPM Total = 43119.06
Expected LT Performance: PPM > USL = 16153.51, PPM Total = 16153.51

Process Capability Analysis for Cholesterol ("As Is" process):
Process Data: USL = 220.000, Target = *, LSL = *, Mean = 193.133, Sample N = 30, StDev (ST) = 26.0455, StDev (LT) = 22.4931
Potential (ST) Capability: Cp = *, CPU = 0.34, CPL = *, Cpk = 0.34, Cpm = *
Overall (LT) Capability: Pp = *, PPU = 0.40, PPL = *, Ppk = 0.40
Observed Performance: PPM > USL = 133333.33, PPM < LSL = *, PPM Total = 133333.33
Expected ST Performance: PPM > USL = 151146.50, PPM Total = 151146.50
Expected LT Performance: PPM > USL = 116152.65, PPM Total = 116152.65
Descriptive Statistics / Process Capability
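With an upper spec limit only, Cpk reduces to CPU = (USL - mean) / (3 * StDev). Plugging in the short-term summary statistics above reproduces the Minitab values:

```python
def cpk_upper(usl, mean, stdev):
    """One-sided capability index (USL only): CPU = (USL - mean) / (3 * stdev)."""
    return (usl - mean) / (3 * stdev)

# Summary statistics from the capability output above (short-term StDev)
as_is = cpk_upper(220.0, 193.133, 26.0455)  # "As Is" (Cholesterol data)
new = cpk_upper(220.0, 184.967, 20.4206)    # "New" (Control data)
print(f"Cpk as-is = {as_is:.2f}, Cpk new = {new:.2f}")  # 0.34 and 0.57
```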
- Example -
Control Chart

[P Chart for Total Defectives: proportion defective by sample number (0-150); P-bar = 0.03817, 3.0SL (UCL) = 0.08162, -3.0SL (LCL) = 0.00]

Feb/Mar data confirms the process has remained in control.
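The chart's control limits follow the standard p-chart formula, p-bar +/- 3*sqrt(p_bar*(1-p_bar)/n). The subgroup size below is an assumed value (not stated on the slide) chosen so the limits line up with the chart:

```python
import math

p_bar = 0.03817  # average proportion defective, from the chart
n = 175          # assumed subgroup size (not given on the slide)

# 3-sigma limits for a proportion; the lower limit is floored at zero
sigma_p = math.sqrt(p_bar * (1 - p_bar) / n)
ucl = p_bar + 3 * sigma_p
lcl = max(0.0, p_bar - 3 * sigma_p)
print(f"UCL = {ucl:.5f}, LCL = {lcl:.5f}")
```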
- Example -
Recommended Deliverable
“As Is” vs. “To Be” Process Map

[“As Is” process map | “To Be” process map]
Required Deliverable
- Example -
Related Project Consideration
Multi-Generation Project Plan (MGPP)

Rows: Vision | Process Generation | Platforms / Technology
Columns: Generation 1 (Date) | Generation 2 (Date) | Generation 3 (Date)

Optional Deliverable