Page 1

Panning for Gold in Historical Operations Records

Kevin Ingoldsby
Booz Allen Hamilton
Cape Canaveral, Florida
[email protected]

Page 2

The Data Miner's Challenge

• Parametric cost estimation models are only as valid as the input data on which they are founded
• Unfortunately for cost estimators and modelers, very few operational programs invest the time and expense to record operations data in formats that easily facilitate future modeling and analysis
• But even in the mountains of seemingly unrelated operations records, gold dust of knowledge and the occasional nugget can be found

Page 3

Challenge Details

• Operations Phase modeling of space systems relies heavily on historical benchmark data, both to anchor parametric analysis methods and to validate modeling tool outputs
• Unfortunately for most modelers, there are often limitations in the historical records:
– Available recorded data is often riddled with gaps
– Data is inconsistently recorded over the program life (inadequate data breadth)
– Data is recorded with different rules within lower-level program elements (inconsistent data depth)
• Sometimes the only data deemed valid are singular milestones such as hardware delivery, rollout to the launch pad, final launch date, mission event duration, etc.

Page 4

Sources of Data Challenge

• The reasons for the lack of easily useable data are many:
– Operations budgets are typically very tight, with technical problems during development consuming margins and eating into operations-phase allocations
– For missions with very narrow planetary launch windows, the pressure to get the mission off the ground on time limits the attention spent on recording more than the barest information needed by milestone decision makers
– Operations business support system database schemas are driven by operations management needs, typically the implementation of work planning and closed-loop accounting for operations requirements

Page 5

Hope for the Prospector

• Useable and valuable operations performance information may be lurking in records that were created for different purposes
– The presenter has applied techniques to extract operational metric data from NASA Space Shuttle and other launch vehicle operations records
– Products of these data mining efforts have informed the development of several cost and operations modeling tools across the agency
• Goals of this presentation:
– Share examples of data extraction efforts
– Show how the data was applied
– Identify some prospective mother lodes that have yet to be prospected

Page 6

Spaceflight Operations Modeling

• Most cost modeling tool development focus has been on the design, development, test and production phases of the life-cycle
– These phases are typically the largest investment for a transportation system
– The budgeting process tends to be an annual exercise; this near-term scrutiny tends to obscure the assessment of recurring costs
– Some DDTE&C models provide predictions of operations infrastructure development cost, but offer little fidelity on recurring operations burdens
• Predicting the recurring costs and performance of the operational phase of spaceflight systems motivated the studies discussed in the following slides

Page 7

Operational Performance Metrics

• Investment Costs
– Infrastructure
– Equipment
– Process Plans & Specifications
• Fixed Costs
– Core workforce
– Core resources
• Variable Costs
– Direct mission labor
– Mission hardware & consumables
• Throughput
– Process cycle time
– Mission payload/cargo/passenger delivery rates
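A minimal Python sketch of how a mined data point might be tagged against this metric framework follows; the category names come from the list above, while the record fields (vehicle, module, value, units, source) and the example values are assumptions for illustration, not the actual study schema.

```python
from dataclasses import dataclass
from enum import Enum

class MetricCategory(Enum):
    """The four operational performance metric groups from the slide above."""
    INVESTMENT = "Investment Costs"   # infrastructure, equipment, process plans & specs
    FIXED = "Fixed Costs"             # core workforce, core resources
    VARIABLE = "Variable Costs"       # direct mission labor, hardware & consumables
    THROUGHPUT = "Throughput"         # process cycle time, delivery rates

@dataclass
class MetricRecord:
    """One mined data point, tagged by vehicle, functional module, and category."""
    vehicle: str              # e.g. "X-15" (hypothetical example value)
    module: str               # functional module of the spaceport model
    category: MetricCategory
    value: float
    units: str
    source: str               # citation back to the original record

# Hypothetical example: a pad crew headcount captured as a fixed-cost data point
example = MetricRecord("X-15", "Vehicle Turnaround", MetricCategory.FIXED,
                       30.0, "headcount", "program flight log (illustrative)")
print(example)
```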

Page 8

Case Study: Vision Spaceport Project (VSP)

• Joint Sponsored Research Agreement involving KSC, ARC, Boeing, Lockheed Martin, UCF, CCT
– Follow-on effort from the Highly Reusable Space Transportation Program (HRST)
– Project was conducted from 1998-2002
• Project Goal:
– Develop a modeling tool for prediction of space launch operations costs and performance
– Focus of the modeling effort was the quest for orders-of-magnitude improvement over then-current systems (STS, Titan, Atlas II, Delta II, Pegasus)

Page 9

VSP: Benchmarking

• Study team developed a functional model of spaceport operations to organize the analysis and modeling efforts
– Model functions helped to organize the collection of benchmark program/vehicle data
– Functions helped to communicate the varying infrastructure and operational needs of different launch system concepts
• Each “Module” of the VSP functional model was documented in the benchmarking effort
– Constituent sub-functions described
– Current-state examples identified
– Concepts identified for orders-of-magnitude improvement

Page 10

VSP: Data Collection & Analysis

• With a 5-6 order of magnitude scale, a wide range of operational data was investigated
– 1994 Access to Space study provided much information for STS operations
  • Bottom-up assessment was broken down by vehicle element (Orbiter, ET, SRB, Facility), VSP functional module and cost category
– Historical launch vehicle data helped to expand the set
  • Range of vehicles from contemporary ELVs to early launch vehicles of the 1950s
  • Sources included photographs, schedules, narratives, budget data, technical reports
– In most cases, data would be found only for a subset of modules and cost categories
  • For example: information on the launch pad crew headcount, turnaround times for X-15 vehicle flight attempts, cost of construction of the Saturn V launch complex, etc.
– Sources were captured and documented in spreadsheets
  • Recorded by vehicle configuration, function and cost category
  • Each vehicle that provided metric data points was scored for operability using the model assessment algorithms
  • Resulting scores were used to plot data points and calibrate model output performance curves (a calibration sketch follows below)
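The last two steps above can be illustrated with a minimal sketch: plot benchmark points against their operability scores and fit a performance curve. The scores, cost-per-flight values, and the simple log-linear fit are invented stand-ins, not the actual VSP data or assessment algorithms.

```python
import numpy as np

# Hypothetical benchmark points: (operability score from the assessment
# algorithm, recurring cost per flight in $M). Values are illustrative only.
points = [
    (2.1, 450.0),   # e.g. a Shuttle-class data point
    (3.4,  90.0),   # e.g. a contemporary ELV
    (5.0,  12.0),   # e.g. a highly operable concept
]

scores = np.array([p[0] for p in points])
costs = np.array([p[1] for p in points])

# Fit log10(cost) as a linear function of operability score, so each unit of
# score improvement corresponds to a fixed fractional reduction in cost.
slope, intercept = np.polyfit(scores, np.log10(costs), 1)

def predicted_cost(score: float) -> float:
    """Evaluate the calibrated performance curve at a given operability score."""
    return 10 ** (slope * score + intercept)

for s in (2.0, 3.0, 4.0, 5.0):
    print(f"score {s:.1f} -> predicted cost/flight ${predicted_cost(s):,.1f}M")
```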

Page 11

VSP: Nuggets

• STS / Access to Space Study
– Good breakdown of labor and material costs attributable to vehicle elements and most module functions
• X-15 program flight logs
– Extensive information on turnaround and depot operations cycle time from 150 missions flown by the 3-vehicle fleet
• WSMR research flight logs 1946-58
– Provided assembly and cargo integration cycle time and crew size data points for suborbital launch vehicles

Page 12

Case Study: Reliability Modeling (RMS)

• LaRC Vehicle Analysis Branch was extending a USAF squadron logistics model to predict operational performance of Reusable Launch Vehicles
– Model based on historical aircraft maintenance operations records of the USAF and USN
– Sought assistance at KSC in developing similar metric data from Space Shuttle operations history
– Initial study assessed missions from 1992-98
– Follow-on effort added missions from 1999-2002

Page 13

RMS: Data Needs

• Model required RM&S metrics by subsystem
– Cycle Time metrics (MTBMA, MTTR, etc.; a derivation sketch follows below)
– Event frequency/probability metrics (Parts Removal & Replacement frequency, scrap rates, etc.)
• Metric data for the launch vehicle identified for subsystem categories similar to aircraft
– Propulsion
– Avionics
– Hydraulics
– Structures
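To make the cycle-time metrics concrete, here is a small sketch of how MTBMA and MTTR fall out of raw maintenance records; the record layout and numbers are assumed for illustration and are not the actual SPDMS schemas.

```python
# Hypothetical unplanned maintenance actions for one subsystem, with the
# powered operating hours accumulated before each action and the repair
# hours spent on each. Numbers are illustrative only.
maintenance_actions = [
    {"subsystem": "Propulsion", "hours_since_last_action": 120.0, "repair_hours": 16.0},
    {"subsystem": "Propulsion", "hours_since_last_action":  80.0, "repair_hours": 24.0},
    {"subsystem": "Propulsion", "hours_since_last_action": 200.0, "repair_hours":  8.0},
]

# Mean Time Between Maintenance Actions: operating hours divided by action count
mtbma = sum(a["hours_since_last_action"] for a in maintenance_actions) / len(maintenance_actions)

# Mean Time To Repair: average repair hours per action
mttr = sum(a["repair_hours"] for a in maintenance_actions) / len(maintenance_actions)

print(f"MTBMA = {mtbma:.1f} operating hours per maintenance action")
print(f"MTTR  = {mttr:.1f} repair hours per maintenance action")
```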

Page 14

RMS: Data Mining Approach

• Obtained mission records from several SPDMS databases:
– PRACA – Provided unplanned maintenance action data
– AGOSS – Provided planned work data
– SFDC – Provided some direct labor data
• Surveyed the SME community to identify typical vehicle powered operations by subsystem
– Needed for failure rate calculations

Page 15

RMS: Data Analysis

• Challenge: No single STS data system recorded all the parameters needed to generate the desired metrics
– Operational records in dissimilar systems had some common identifiers (WAD#)
– Using a relational database (MS Access), the interdependencies between the available data sets were exploited to synthesize the metric data (see the join sketch below)
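The MS Access work itself is not reproduced here, but the idea of stitching dissimilar record systems together on a shared work-document number can be sketched with a throwaway SQLite join; the table and column names below are assumptions standing in for the actual PRACA/AGOSS/SFDC extracts.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Stand-ins for extracts from the three record systems, keyed by WAD number.
cur.executescript("""
CREATE TABLE praca (wad TEXT, problem_report TEXT);               -- unplanned maintenance
CREATE TABLE agoss (wad TEXT, planned_task TEXT, subsystem TEXT); -- planned work
CREATE TABLE sfdc  (wad TEXT, direct_labor_hours REAL);           -- labor charges
""")
cur.executemany("INSERT INTO praca VALUES (?, ?)",
                [("WAD-001", "PR-1234"), ("WAD-002", "PR-1301")])
cur.executemany("INSERT INTO agoss VALUES (?, ?, ?)",
                [("WAD-001", "TPS tile inspection", "Thermal Protection"),
                 ("WAD-002", "SSME leak check", "Propulsion")])
cur.executemany("INSERT INTO sfdc VALUES (?, ?)",
                [("WAD-001", 42.5), ("WAD-002", 18.0)])

# Join the three sources on the common WAD identifier to synthesize one
# combined metric record per work document.
rows = cur.execute("""
SELECT a.wad, a.subsystem, a.planned_task, p.problem_report, s.direct_labor_hours
FROM agoss a
LEFT JOIN praca p ON p.wad = a.wad
LEFT JOIN sfdc  s ON s.wad = a.wad
""").fetchall()

for row in rows:
    print(row)
```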

Page 16

RMS: Nuggets & Fool's Gold

• Initial study produced useable data for the model
• Revisiting the study to incorporate an additional 3 years of flights found a discontinuity in the numbers
– Number of Problem Reports dropped by an order of magnitude at STS-xxx
– Cause researched – operations contract award fee metrics had changed
  • Fee based on number of PRs – multiple items per PR were now being recorded to depress the metric
  • Required an update to the database schema to “count the items” (see the sketch below)
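A minimal sketch of the “count the items” correction follows; the record layout and values are assumed, and the point is only that once multiple items start being bundled per PR, the failure-event metric must be derived from line items rather than from the PR count.

```python
# Hypothetical problem-report extracts before and after the award-fee metric
# change. Each PR carries a list of discrepancy line items.
early_era = [
    {"pr": "PR-100", "items": ["tile damage"]},
    {"pr": "PR-101", "items": ["valve leak"]},
    {"pr": "PR-102", "items": ["sensor fault"]},
]
later_era = [
    {"pr": "PR-900", "items": ["tile damage", "valve leak", "sensor fault"]},
]

def pr_count(records):
    return len(records)

def item_count(records):
    # Counting line items recovers a consistent failure-event metric
    # across the contract-change discontinuity.
    return sum(len(r["items"]) for r in records)

print("PR counts:  ", pr_count(early_era), "vs", pr_count(later_era))     # misleading drop
print("Item counts:", item_count(early_era), "vs", item_count(later_era)) # comparable
```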

Page 17

Case Study: STS Design Root Cause Analysis (RCA)

• Questions to be answered:
– “Why does it take so long to process a vehicle for launch?”
– “Why does it cost so much to operate the STS systems?”

Page 18

RCA: Source Data

• Study team built upon prior VSP and RMS work
– Used VSP Functional Model of Spaceport Operations
– Incorporated mission reliability analysis data from the RMS study
– Study focused on a year of STS operations (1997 - 8 missions flown)
• Dug deeper into the STS operations processes at KSC
– Obtained template and as-run scheduling system data (ARTEMIS) for the 8 STS missions conducted
– Engaged the KSC engineering community in identification of the system and subsystem(s) driving each individual operational task

Page 19

RCA: Data Mining & Analysis

• Complexity of the source data required programming support to produce useable database records
• Interactive relational database forms were used in working sessions with SMEs to capture system / subsystem task knowledge
– Live sessions focused on a single mission flow (STS-81)
– Information from that flow was batch-processed against the other 7 mission data sets (see the matching sketch below)
  • Unmatched items from batch processing were reassessed with SMEs via e-mail file exchange
  • Flow-unique conditions were identified and documented (OMDP, Roll-Back, etc.)
• Limitations:
– Majority of operational performance data only provided task durations
– Study focused on direct vehicle operations; indirect operations assessment was limited to only a few subsystems
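A sketch of the batch-matching step: apply the subsystem mapping captured against the STS-81 flow to another mission's task list and flag anything unmatched for SME review. The task names, mapping rule, and data shapes are assumptions standing in for the programming support and database forms used in the actual study.

```python
# Hypothetical subsystem mapping captured with SMEs against the STS-81 flow:
# template task name -> driving system/subsystem.
sts81_mapping = {
    "SSME DRYING": "Propulsion / SSME",
    "TPS LOWER SURFACE INSPECTION": "Thermal Protection / Tiles",
    "APU HOT FIRE": "Hydraulics / APU",
}

# As-run schedule tasks pulled from another mission's data set (illustrative).
other_mission_tasks = [
    "SSME DRYING",
    "TPS LOWER SURFACE INSPECTION",
    "PAYLOAD BAY DOOR CYCLE",   # not in the STS-81 mapping
]

matched, unmatched = {}, []
for task in other_mission_tasks:
    subsystem = sts81_mapping.get(task)
    if subsystem is not None:
        matched[task] = subsystem
    else:
        # Unmatched items go back to the SMEs (e-mail file exchange in the study).
        unmatched.append(task)

print("Matched:", matched)
print("Unmatched, needs SME review:", unmatched)
```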

Page 20

RCA: Nuggets

• Study provided a detailed window into relationships between subsystem design trade decisions and the resulting recurring cost/cycle time performance metrics at the mature state of the system
– Study report published as a NASA Technical Manual (TP—2005–211519)
– http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20050172128_2005171687.pdf
• Highlights:
– Roughly 40% of recorded operations were Unplanned Work
– Propulsion and Thermal Protection systems drove the majority of processing tasks

Page 21

Virgin Ground?

• With the conclusion of the STS program, much knowledge from it remains untapped
– Experts are departing for other work
– Records are being archived – some veins of insight could be lost if not prospected soon
• ISS assembly & operations
– In-flight crew operations and maintenance metrics
– Ground support & logistics metrics
• Planetary missions
– Mars mission longevity could provide a wealth of ground support operations planning & control metrics

Page 22

Summary

• Useful operational performance metric information is often hidden or buried in seemingly unrelated data sources

• Keys to unlocking the hidden knowledge nuggets include:
– Establishing a framework model for data classification
– Identifying the categories of performance data sought
– Subject Matter Experts to help sift the fool's gold and tailings from the true valid data
– Relational database tools to filter and aggregate the data as it is accumulated
– Openness to deductive reasoning – finding the implications, patterns and especially gaps in the raw data
– Curiosity, optimism and the patience to swirl the pan repeatedly to capture the fine dust along with the obvious nuggets