National Aeronautics and Space Administration
Independent Programmatic Analysis for the Radiation Budget Instrument (RBI) Project
At Preliminary Design Review (PDR)
NASA Cost Symposium 2016
August 23, 2016
Jonathan Drexler (GRC), Lead Programmatic Analyst
Bill Lawson (GSFC), Cost Analyst
Steve Wilson (JSC), Schedule/JCL Analyst
www.nasa.gov
Agenda
• Independent Reviews and Background
• Process and Procedures
  – Cost Assessment
  – Schedule Assessment
  – JCL Assessment
• Lessons Learned
Background
• On October 22, 2015, a NASA Associate Administrator (AA) memo reorganized and realigned the Agency's independent assessment function:
  – Independent Project Assessment Office (IPAO) disbanded.
  – Independent assessment of programs and projects continues under the responsibility and accountability of the Mission Directorates, with support from the Centers.
• Radiation Budget Instrument (LaRC) – late November/early December:
  – First OCFO staffing requirement
  – Preliminary Design Review (PDR) in April '16, including JCL
• Programmatic team identified:
  – Jonathan Drexler, GRC, Programmatic Lead
  – Bill Lawson, GSFC, Cost Analyst
  – Steve Wilson, JSC, Schedule/JCL Analyst
Independent Assessments Overview
• The Standing Review Board (SRB) is an independent advisory board that performs the principal independent life cycle reviews.
  – One SRB for each program or project
  – Conducts all 7120.5-required independent reviews for that program or project throughout its life cycle
  – Applies to Category 1 projects and Category 2 projects with a life cycle cost > $250M
• SRBs provide the Agency a non-advocate, objective, and competent assessment of the program/project as it advances through its key decision points (KDPs).
  – RBI -> PDR, then KDP-C
• The SRB objectively assesses:
  – Alignment with and contribution to Agency strategic goals, and adequacy of requirements flow-down from those goals.
  – Adequacy of the management approach.
  – Adequacy of the technical approach, as defined by NPR 7123.1 entrance and success criteria.
  – Adequacy of the integrated cost and schedule estimate and funding strategy in accordance with NPD 1000.5.
  – Adequacy and availability of resources other than budget.
  – Adequacy of the risk management approach and risk identification and mitigation per NPR 8000.4.
[SRB organization chart: Chair, supported by a Review Manager and Subject Matter Experts (SMEs)]
RBI Other Key Information
• New instrument development:
  – Follow-on to the CERES instrument
  – Mass: ~76 kg (CBE) / 90 kg (allocation)
  – Power: ~67 W
• JCL (Tecolote on contract to implement):
  – Joint Analysis of Cost and Schedule (JACS), ~1,600-line Microsoft Project schedule
  – Based on the Harris IMS; Langley tasks are mainly hammock tasks
• Schedule considerations:
  – Deterministic schedule includes 6.5 months of funded schedule margin
  – All margin is at the end
Agenda
• Independent Reviews and Background
• Process and Procedures
  – Cost Assessment
  – Schedule Assessment
  – JCL Assessment
• Lessons Learned
Cost Assessment
• Objective: Develop a parametric, benchmark cost estimate, then compare it to the RBI plan
• Master Equipment List (MEL) serves as the backbone of this analysis
• Commercial tool set used for the RBI cost analysis:
  – PRICE-H for hardware, SEER-H for the focal plane, SEER-SEM for software, ACEIT for generating the S-curve
• Consistent with the Project:
  – 1 flight unit
  – 1 engineering development unit
Desired Implementation Flow
Detailed (e.g., card-level) Master Equipment List (MEL)

Curveball:
• 23-line Master Equipment List (MEL); not down to the component level
• MEL was not a DRD
• Needed to develop a workaround
RBI Cost Assessment
• Workaround: 14-module mass tree built from Project data, plus PDR presentations and engineering judgment

Assessment:
• Budget appears sufficient
Schedule Assessment
• Benchmark against historical data
  – Data set: ACIS, APS-Glory, CERES, EVE, GSPEC-OCO, HMI, HRC, IRIS, TOMS
• NICM cross-check
• Health check with the STAT tool
• Probabilistic critical path assessment (more on this later)
• Assessment -> Project schedule is aggressive
JCL Assessment
• Use the Project JCL as the starting point
  – Informs the SRB on Project assumptions
  – Provides a framework for SRB adjustments
• Two-step process with SRB member input:
  1. Uncertainty – cost & schedule
  2. JCL risks
JCL Uncertainty Assessment Framework
• JCL assessment framework for uncertainty:
  – Based on the project's JCL inputs
  – D&D activities end at A&T; A&T leads into I&T
  – Each block represents a group of tasks
• Framework for SRB member input:
  – Don't differentiate between cost and schedule uncertainty
  – Independent of discrete risks
Work Element            HW Design & Development   Assembly & Test (EDU)   Assembly & Test (FU)
Subsystem #1            Medium                    Medium                  High
Subsystem #2            Medium                    Medium                  High
Subsystem #3            Low                       Medium                  Medium
Subsystem #4            Low                       Medium                  High
Subsystem #5            Low                       Medium                  Medium
Subsystem #6            Medium                    Medium                  High
Subsystem #7            Medium                    Medium                  Medium
Subsystem #8            Medium                    Medium                  Medium
Subsystem #9            Low                       Low                     Low
Integration & Test #1   High
Integration & Test #2   High

(The subsystem rows carry a rotated "Big Seven" group label in the original chart.)
JCL Uncertainty Score Descriptions
• A mapping of scores to characteristic descriptions calibrates the assessment
  – Sourced from industry best practices and previous SRB and NASA project analyses
  – Categories consistent with the project's
• Scoring approach
  – Scores are a general reflection of task/WBS item characteristics
  – Medium [2] is considered nominal
  – More scoring bins above nominal than below, based on trends from NASA and DoD projects
SRB JCL Uncertainty Assessment
• Highlights show where the SRB made changes to the JCL
• The SRB increased uncertainty mostly in downstream areas (Assembly & Test, as well as Integration & Test)
Work Element            HW Design & Development   Assembly & Test (EDU)   Assembly & Test (FU)
Subsystem #1            Medium                    Medium -> High          High -> Medium
Subsystem #2            Medium                    Medium -> High          High -> Medium
Subsystem #3            Low -> Medium             Medium -> High          High
Subsystem #4            Low                       Medium                  High
Subsystem #5            Low                       Medium                  Medium
Subsystem #6            Medium                    Medium                  High
Subsystem #7            Medium                    Medium -> High          Medium -> High
Subsystem #8            Medium -> High            Medium                  Medium
Subsystem #9            Low -> Medium             Low -> Medium           Low -> Medium
Integration & Test #1   High -> Very High
Integration & Test #2   High
SRB JCL Discrete Risk Assessment
• Individually assessed Project risks:
  – Risk register
  – Likelihood x Consequence (LxC)
  – Impact: task level and S-curve
  – RED = changes since the 20-day data drop
• Concern -> 3 driving schedule risks
• SRB provided 26 candidate risks:
  1. Potential risks for inclusion (12)
  2. Parking lot (4)
  3. Already covered by uncertainty (10)
• SRB actions:
  – Deleted (1), adjusted LxC (2), SRB risks (10)
  * Risk impacts are assessed individually; not correlated to the consequence score
SRB JCL Assessment
• SRB uncertainty + SRB-adjusted risks

SRB-Adjusted Project Risks
Project Risk   LxC   Impact on Schedule S-Curve (Days)
Risk #1        3x3   44
Risk #2        2x4   21
Risk #3        2x4   12
Risk #4        2x3   6
Risk #5        2x3   5
Risk #6        2x3   7
Risk #7        3x2   4
Risk #8        2x2   0
Risk #9        2x3   0
Risk #10       1x3   0
Risk #11       2x1   0

Additional SRB Risks
LxC   Description                Impact (WD): Min / Most / Max   Area
3x3   SRB Additional Risk #1     5 / 15 / 30                     SS#1
2x4   SRB Additional Risk #2     10 / 20 / 40                    SS#1
2x3   SRB Additional Risk #3     15 / 30 / 45                    SS#2
2x3   SRB Additional Risk #4     10 / 20 / 30                    SS#3
2x4   SRB Additional Risk #5     10 / 20 / 40                    SS#4
2x3   SRB Additional Risk #6     10 / 20 / 30                    SS#5
3x2   SRB Additional Risk #7     5 / 10 / 15                     I&T#1
2x2   SRB Additional Risk #8     5 / 10 / 20                     I&T#1
2x3   SRB Additional Risk #9     10 / 30 / 45                    SS#6
2x3   SRB Additional Risk #10    10 / 20 / 30                    SS#6

Assessment:
• Low confidence
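The additional SRB risks lend themselves to a quick Monte Carlo illustration of how discrete risks roll up into a schedule-threat S-curve. This is a sketch, not JACS: the likelihood-score-to-probability mapping below is an assumption for illustration, and only the Min/Most/Max workday impacts come from the table above.

```python
import random

# Assumed likelihood-score -> probability mapping (illustrative, not the project's).
PROB = {1: 0.05, 2: 0.20, 3: 0.45, 4: 0.70, 5: 0.90}

# Additional SRB risks from the table: (L from LxC, min, most likely, max workdays)
risks = [
    (3, 5, 15, 30), (2, 10, 20, 40), (2, 15, 30, 45), (2, 10, 20, 30),
    (2, 10, 20, 40), (2, 10, 20, 30), (3, 5, 10, 15), (2, 5, 10, 20),
    (2, 10, 30, 45), (2, 10, 20, 30),
]

def simulate(n_trials=20000, seed=1):
    """Monte Carlo: each trial, each risk fires with its likelihood-based
    probability; if it fires, draw a triangular impact. Summed workdays
    across risks, sorted, form the schedule-threat S-curve."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        total = 0.0
        for L, lo, ml, hi in risks:
            if rng.random() < PROB[L]:
                total += rng.triangular(lo, hi, ml)  # mode is the 3rd argument
        totals.append(total)
    totals.sort()
    return totals

totals = simulate()
p50 = totals[len(totals) // 2]          # median added schedule threat (workdays)
p70 = totals[int(0.70 * len(totals))]   # 70th percentile, a common JCL target
```

Reading percentiles off the sorted totals is exactly how an S-curve is queried; correlation between risks (ignored here) would fatten the tail.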
Agenda
• Independent Reviews and Background
• Process and Procedures
  – Cost Assessment
  – Schedule Assessment
  – JCL Assessment
• Lessons Learned
Probabilistic Critical Path
• Narrows the universe of technical assessment areas an assessor should investigate.
• Ask: Are these PCP logical links appropriate? Example: L1-to-L1 VCT
• Ask: Is the parallelism for critical items appropriate, given resource constraints and hidden interdependencies?

Discovery: The PCP is the gateway to focused technical schedule assessment.
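A probabilistic critical path falls out of the same Monte Carlo machinery as the JCL: in each trial, record which tasks lie on the longest path, then report the fraction of trials in which each task is critical (its criticality index). The mini-network and triangular durations below are hypothetical placeholders, not RBI's IMS; the technique is the point.

```python
import random

# Hypothetical network: task -> (predecessors, (min, most likely, max) workdays)
tasks = {
    "design":    ([],                       (40, 50, 70)),
    "edu_build": (["design"],               (30, 40, 60)),
    "fu_build":  (["design"],               (35, 45, 80)),
    "edu_test":  (["edu_build"],            (20, 25, 40)),
    "int_test":  (["edu_test", "fu_build"], (25, 30, 50)),
}

def criticality(n_trials=5000, seed=7):
    """Fraction of Monte Carlo trials in which each task lies on the
    longest (critical) path -- the probabilistic critical path signal."""
    rng = random.Random(seed)
    counts = {t: 0 for t in tasks}
    order = ["design", "edu_build", "fu_build", "edu_test", "int_test"]
    for _ in range(n_trials):
        finish, crit_pred = {}, {}
        for t in order:  # forward pass in topological order
            preds, (lo, ml, hi) = tasks[t]
            start = max((finish[p] for p in preds), default=0.0)
            # remember which predecessor drove the start (the critical link)
            crit_pred[t] = max(preds, key=lambda p: finish[p]) if preds else None
            finish[t] = start + rng.triangular(lo, hi, ml)
        t = "int_test"  # walk back from the final task, marking the chain
        while t is not None:
            counts[t] += 1
            t = crit_pred[t]
    return {t: counts[t] / n_trials for t in tasks}
```

Tasks whose criticality index is high but which never appear on the deterministic critical path are exactly where the assessor should ask the "are these links appropriate?" questions above.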
How do we Assess Uncertainty?
1. First looked at history – What was done by other SRBs?
2. Considered CADRe – Are there any 'standards'? References?
3. Decided to use the Project distributions – Focus needs to be on the assessment, not on differing assumptions (distributions)!
• Uncertainty drives these models!
• How do we ensure consistency across SRBs, Programs, Projects, Mission Directorates and Centers?
Qualitative Descriptors Matter

Uncertainty Buckets
• Uncertainty distributions provided by the project
  – The assessment also used the same nomenclature
  – Not much difference between Low and Medium
• SRB reluctance to suggest changes from "Med" to "High"
Project Schedule Distributions
Name        Lognormal Parameters (Mean, St Dev)   Alternate Distribution Name
Low         Log(102, 15)                          Low
Medium      Log(105, 15)                          Low+
High        Log(115, 20)                          Med
Very High   Log(120, 25)                          Med+
Highest     Log(125, 25)                          High
• An alternate naming system likely would have produced a much different SRB assessment
Work Element            HW Design & Development   Assembly & Test (EDU)   Assembly & Test (FU)
Subsystem #1            Low+                      Low+                    Medium
Subsystem #2            Low+                      Low+                    Medium
Subsystem #3            Low                       Low+                    Low+
Subsystem #4            Low                       Low+                    Medium
Subsystem #5            Low                       Low+                    Low+
Subsystem #6            Low+                      Low+                    Medium
Subsystem #7            Low+                      Low+                    Low+
Subsystem #8            Low+                      Low+                    Low+
Subsystem #9            Low                       Low                     Low
Integration & Test #1   Medium
Integration & Test #2   Medium
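How little daylight there is between "Low" and "Medium" can be quantified with a quick sketch. It assumes the table's Log(mean, st dev) values are the arithmetic mean and standard deviation of a lognormal duration multiplier in percent of planned duration (that reading of the parameters is an assumption); moment-matching then gives the underlying normal parameters.

```python
import math
import random

# Buckets from the table above, read (an assumption) as arithmetic mean and
# standard deviation of a lognormal duration multiplier, in percent.
buckets = {"Low": (102, 15), "Medium": (105, 15), "High": (115, 20)}

def moment_match(mean, sd):
    """Underlying normal (mu, sigma) for a lognormal with this mean/sd."""
    sigma2 = math.log(1.0 + (sd / mean) ** 2)
    return math.log(mean) - sigma2 / 2.0, math.sqrt(sigma2)

def overlap(a, b, n=20000, seed=11):
    """P(draw from bucket b < draw from bucket a); 0.5 would mean the two
    buckets are statistically indistinguishable."""
    rng = random.Random(seed)
    mua, siga = moment_match(*buckets[a])
    mub, sigb = moment_match(*buckets[b])
    hits = sum(
        rng.lognormvariate(mub, sigb) < rng.lognormvariate(mua, siga)
        for _ in range(n)
    )
    return hits / n

p_low_med = overlap("Low", "Medium")   # near 0.5: barely distinguishable
p_low_high = overlap("Low", "High")    # clearly below 0.5: a real shift
```

With these parameters, P(Medium draw < Low draw) lands around 0.45, nearly a coin flip, which is the slide's point: relabeling the buckets (Low -> Low+, etc.) changes reviewer perception far more than the underlying math changes.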
Considerations
• Be flexible!
  – Definitely have a plan and know what you'd like to do
  – The nature of these reviews tends to be very fluid
  – Examples:
    • Lacked a useful MEL for parametric estimates
    • Additional SRB risks arose post-site review
• Engage early with the SRB Chair, Review Manager, and the SRB
  – Limited time at the site review; pre-brief as much as possible
  – When caucusing with the SRB, try to avoid going last!
• Be prepared to teach at each step
  – SRBs need training!
  – Not all SRB members understand programmatic analysis
Process Considerations
• Where do the data and models go now that we're done? Who gets access?
  – Still awaiting final instructions on what to do with the data
• Travel controls
  – Funding was provided to a Center-managed WBS; therefore, travel to support the SRB counted against internal Center controls (causing problems for existing Center travel needs)
  – Why isn't travel held by the Mission Directorate? Just provide a WBS to charge.
  – Some logistics still need to be ironed out
• Training!
  – Analysts supporting the new process need it
  – SRB Chairs, Review Managers, and other SRB members do too!
Most Important Lesson Learned
• When Charley Hunt bribes you with a beer as he signs you up for this, FOLLOW THROUGH!
  – See you at the bar tonight!