smi
Experiences/Observations in Developing COCOTS and Early COCOTS
Betsy Clark
Brad (baggage handler/coffee-getter) Clark
Chris Abts
20th International Forum on COCOMO
and Software Cost Modeling
October 2005
Outline
Data Collection Challenges
Early COCOTS Model – An Update
Data Collection Challenges
Why is data so hard to get?
– No immediate payback for data providers (takes time and effort)
– Fear of airing “dirty laundry”
– Data collector is about as popular as a telemarketer
Projects don’t track effort by Assessment – Tailoring – Glue Code
– Effort data must be reconstructed
– Quality of data is highly dependent on the knowledge of the person being interviewed
Experiences in Obtaining Data
Started by sending the COCOTS data-collection survey and asking people to fill it out
– Result: nothing!
– Length of the survey may have discouraged people
Changed our approach to meet face to face
– Four-hour interviews
– This approach has worked reasonably well BUT…
Difficulty Obtaining Complete Data
…BUT critical data was occasionally omitted
• Effort (for assessment, tailoring, or glue code)
• Glue code size
“Fred will get back to you with that number” but Fred never does
– No leverage after the fact to make Fred do that
Lesson learned for future data gathering
– Send out a one-page sheet in advance containing critical data items
– Person can prepare without being overwhelmed by a lengthy form
Data Collection Challenges
Almost every project had a new war story to tell that impacted the amount of effort spent
One consistent thread is the lack of control resulting from the use of COTS components
“There may be such a thing as too much COTS – the more COTS you have, the less control you have”
Lack of Control Impacts Effort
Just a few of the areas that are out of a project’s control:
– Vendors going out of business
– Vendors not performing as promised
– Products not working as advertised
“The vendor folded half-way through development and we had to start over.”
“What leverage do I have with a vendor to ensure they deliver on time?”
“Very few components worked as advertised.”
“We spent lots of effort on workarounds because of deficiencies [in product X].”
Lack of Control Impacts Effort
A few more areas that are out of a project’s control:
– Dependence on vendor to fix errors
– Evolution of a component (the what and when of new releases)
“If you find a bug in your custom code, you can fix it yourself. The vendor may not be able to duplicate a problem we find here. If not, they have to come to our site.”
“Even when we don’t change a version, there is a lot of analysis required. It can be difficult to verify implications with a black box.”
“Vendors are constantly driven to add more functionality, which puts more demand on hardware.”
Lack of Control Impacts Effort
Yet another area:
– Lack of user acceptance because the vendor’s interface isn’t what users are used to
“Each site is its own fiefdom. It’s like a 100-headed hydra trying to make the same decision. No one can say yes – they all can say no.”
Lack of Control – Summary
In going the COTS route, a project is faced with a lot of potential “gotchas” that can have a major impact on effort
Ye Yang is identifying risks in developing COTS-based systems
Important Questions
Are we modeling the right activities?
– Assessment, tailoring, glue code are important
– Initial data collection efforts did not include several major activities
  • Business Process Reengineering (BPR)
  • Impact analysis and verification testing for new releases or products
  • Developer training (especially for tailoring)
  • User training
  • Data conversion
Do we have the right cost drivers?
– Yes, but some are difficult for model users to estimate early on
– Major impetus leading to Early COCOTS
Outline
Data Collection Challenges
Early COCOTS Model
What is Early COCOTS?
35,000-foot view to be used early in the lifecycle for:
– Rough Order-of-Magnitude (ROM) estimates
– Range of estimates
Simplified model
– Information known early in the lifecycle is limited
Early COCOTS Effort Estimation
COTS PM = Assessment PM + Tailoring PM + Glue Code PM
– Assessment: table lookup by COTS product
– Tailoring: linear model or distribution by COTS class
– Glue Code: productivity by COTS product
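The additive structure on this slide (total COTS effort as the sum of assessment, tailoring, and glue code person-months) can be sketched as follows; the function name is hypothetical and the sample inputs are illustrative, not calibrated values.

```python
# Minimal sketch of the Early COCOTS top-level structure: total COTS-related
# effort is the sum of the three component estimates, each in person-months (PM).
# Function name is hypothetical; later slides describe how each component
# estimate is produced (table lookup, linear model/distribution, productivity).

def early_cocots_effort(assessment_pm, tailoring_pm, glue_code_pm):
    """Total COTS effort (PM) = assessment + tailoring + glue code."""
    return assessment_pm + tailoring_pm + glue_code_pm

# Illustrative inputs only (not from a calibrated estimate):
print(early_cocots_effort(23.0, 57.1, 12.0))
```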
What Model Users Know Early On
System Sizing Parameters
– Number of users
– Number of sites
– Amount of data (legacy and new)
  • Number and age of databases to be converted
– Amount of legacy code to be reused
  • Totally new systems are rare
– Number of interfacing systems
Requirements
– Functional (high level)
– Performance
– Security
Architecture (solution alternatives)
Implementation Strategy
Classes of COTS Products
Application
– graphic information systems
– back office retail
– telemetry analysis
– telemetry processing
– financial packages
– network managers
Infrastructure
– databases
– disk arrays
– communication protocols/packages
– middleware
– operating systems
– network monitors
– device drivers
Tools
– configuration mgmt/build tools
– data conversion packages
– compilers
– emulators
– engineering tools (req’ts mgmt, design)
– software process tools
– GUIs/GUI builders
– problem management
– report generators
– word processors
Assessment Effort vs. Number of COTS Products
[Scatter plot: actual assessment effort (0–160) vs. number of COTS products (0–45)]
There does not appear to be a correlation between number of products and total time spent in assessing them.
Assessment Effort Estimation
Need to explain the variation in the amount of effort spent assessing a COTS product between different COTS-intensive application developments
– Must be known early in the life cycle
“Uncertainty” driver
– Created from the COCOTS data
– The Uncertainty driver rates the number of unknowns that must be resolved to ascertain the fitness of a COTS product for use in the overall system
– Applies to all classes of COTS products
Degree of Uncertainty - 1
Low
– Select from a list of pre-certified products
– Choice is dictated (by hardware, by other software, or by organizational policy)
– Already using a product which will be used in this project
Medium
– There are multiple products but a detailed assessment is not required. Assessment will be a simple exercising of the product and/or a paper-and-pencil evaluation
Large
– One or two products get a very detailed assessment and the other product choices are certain, e.g., once the operating system was chosen, the other products were selected as well
Degree of Uncertainty - 2
Very Large
– There are a fair number of COTS products with very high levels of service requirements combined with large amounts of custom code. There is a lot of uncertainty and risk around those products. A lot of effort is spent on making sure those products work
– Verify service-level agreements such as performance, reliability, availability, fault tolerance, security, interoperability, etc.
– Quadruple redundancy
– Seven 9’s of reliability (99.99999%)
– Through prototyping to assess key criteria
Extra Large
– Many different groups of users: end-to-end detailed scenarios (entire work flows and data flows) required to assess suitability
– Example: Government financial package suite used for multiple government agencies
Assessment Effort vs. Degree of Product Uncertainty
[Scatter plot: actual assessment effort (0–180) vs. degree of product uncertainty, grouped by rating: S, M, L, VL, XL]
Assessment Input
The effort required for Assessment is driven by the Degree of Uncertainty.

Estimated Assessment Effort = 8.27 PM × Uncertainty Rating multiplier

Select the Degree of Uncertainty:

Rating        Lower 80% CL   Mean    Upper 80% CL
Small             0.31        0.37       0.44
Medium            0.79        1.00       1.27
Large             2.27        2.73       3.28
Very Large        5.16        7.44      10.73
Extra Large      11.65       20.14      34.83

Example: The Degree of Uncertainty was judged as Large (using the rating descriptions)
Low Estimate = 8.27 PM × 2.27 = 19 PM
Mean Estimate = 8.27 PM × 2.73 = 23 PM
High Estimate = 8.27 PM × 3.28 = 27 PM
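The lookup on this slide can be sketched directly; the 8.27 PM baseline and the per-rating multipliers (lower 80% CL, mean, upper 80% CL) are the values from the slide, and the worked example reproduces the Large case.

```python
# Sketch of the Early COCOTS assessment estimate from this slide.
# BASELINE_PM and the multiplier table are taken from the slide; the
# function name is my own.

BASELINE_PM = 8.27  # PM

UNCERTAINTY_MULTIPLIERS = {
    # rating: (lower 80% CL, mean, upper 80% CL)
    "Small":       (0.31, 0.37, 0.44),
    "Medium":      (0.79, 1.00, 1.27),
    "Large":       (2.27, 2.73, 3.28),
    "Very Large":  (5.16, 7.44, 10.73),
    "Extra Large": (11.65, 20.14, 34.83),
}

def assessment_effort_range(uncertainty):
    """Return (low, mean, high) assessment effort in person-months."""
    low, mean, high = UNCERTAINTY_MULTIPLIERS[uncertainty]
    return (BASELINE_PM * low, BASELINE_PM * mean, BASELINE_PM * high)

low, mean, high = assessment_effort_range("Large")
print(round(low), round(mean), round(high))  # 19 23 27, as in the example
```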
Tailoring Effort Estimation
Need to explain the variation in the amount of effort spent tailoring a COTS product between different COTS-intensive application developments
– Must be known early in the life cycle
Application-type COTS products
– Number of User Profiles: roles and permissions
– Different user profiles create the need for different scripts, screens, and reports
Infrastructure-type COTS products
– Tailoring effort appears relatively constant
– This type of tailoring usually consists of installation and setting parameters
Tool-type COTS products
– These don’t appear to require tailoring; training is more of a cost driver for these products
Tailoring Effort for Applications
[Scatter plot with linear fit: tailoring effort (PM) vs. number of user profiles (0–6); PM = 49.967x, R² = 0.8928]
Lower 80% CL = 43.6 PM; Mean = 57.1 PM; Upper 80% CL = 70.6 PM
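The application-tailoring regression above can be sketched as follows. The slope 49.967 and R² are from the slide; treating the slide’s 80% confidence limits (43.6 / 57.1 / 70.6 PM) as per-user-profile values is an assumption on my part, and the function names are my own.

```python
# Sketch of the application-tailoring estimate, using the regression shown
# on the slide: PM = 49.967 x (number of user profiles), R^2 ~ 0.89.

REGRESSION_SLOPE = 49.967            # PM per user profile (fit through origin)
PER_PROFILE_CL = (43.6, 57.1, 70.6)  # ASSUMED per-profile lower 80% CL, mean, upper 80% CL

def application_tailoring_pm(num_profiles):
    """Point estimate of tailoring effort (PM) from the regression line."""
    return REGRESSION_SLOPE * num_profiles

def application_tailoring_range(num_profiles):
    """(low, mean, high) in PM, scaling the assumed per-profile limits."""
    return tuple(cl * num_profiles for cl in PER_PROFILE_CL)

print(round(application_tailoring_pm(4), 1))  # 199.9 PM for 4 user profiles
```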
Tailoring Effort for Infrastructure
[Histogram: frequency distribution of tailoring effort (PM) for infrastructure-type products]
Lower 80% CL = 2.69 PM; Mean = 3.75 PM; Upper 80% CL = 4.81 PM
Glue Code Effort - 1
Need to explain the variation in the amount of effort spent constructing glue code for COTS products between different COTS-intensive application developments
– Must be known early in the life cycle
Glue Code effort is based on observed productivities in the data
Quote
– “It is very difficult to write glue code because there are so many constraints”
Effect: productivities are lower than for coding custom software
COCOTS data also shows that the Required System/Subsystem Reliability is inversely correlated with glue code productivities
Glue Code Effort - 2
Constraints on System/Subsystem Reliability (ACREL)
– How severe are the overall reliability constraints on the system or subsystem into which the COTS components were/are being integrated? What are the potential consequences if the components fail to perform as required in any given time frame?

Low: Threat is low; if a failure occurs, losses are easily recoverable (e.g., document publishing).
Nominal: Threat is moderate; if a failure occurs, losses are fairly easily recoverable (e.g., support systems).
High: Threat is high; if a failure occurs, the risk is to mission-critical requirements.
Very High: Threat is very high; if a failure occurs, the risk is to safety-critical requirements.
Glue Code Productivities
[Histogram: frequency of glue code productivity (SLOC/PM), bins from 100 to 1000 and above, broken out by required reliability (VH, N, L)]
Glue Code

COTS Product               GC Effort (staff months)   SLOC     SLOC/PM   Required Reliability
Operating System                    6                    100      17      H
Communication Product               1                     30      30      L
Database                           85                  5,000      59      H
Network Manager                     2                    150      75      VH
Operating System                   36                  3,000      83      VH
Graphical User Interface           12                  1,000      83      VH
Network Manager                    24                  2,000      83      VH
Telemetry Product                  12                  2,800     233      VH
Network Manager                     6                  1,400     233      VH
Report Generator                   12                  5,000     417      H
Graphical User Interface           60                 30,000     500      H
Communication Product               6                  3,000     500      H
Graphical User Interface          100                 50,000     500      H
Graphical User Interface           84                 50,000     595      L
Database                           12                 10,000     833      H
Graphical User Interface           12                 20,000   1,667      N
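Since glue code effort is based on observed productivities, the estimate reduces to size divided by a productivity chosen by required reliability. The representative productivities below are my rough picks from the middle of each reliability group in the table above, not calibrated model constants; the L group is omitted because its two data points scatter too widely (30 vs. 595 SLOC/PM) to pick a representative value.

```python
# Hypothetical sketch: glue code effort (PM) = glue code SLOC / productivity,
# with productivity selected by required reliability. Values are eyeballed
# mid-range figures from the data table, NOT calibrated COCOTS constants.

PRODUCTIVITY_BY_RELIABILITY = {  # SLOC per person-month (rough mid-range)
    "VH": 83,    # Very High reliability: lowest productivity
    "H": 500,    # High reliability
    "N": 1667,   # Nominal reliability: highest productivity (one data point)
}

def glue_code_effort(sloc, reliability):
    """Estimated glue code effort in person-months."""
    return sloc / PRODUCTIVITY_BY_RELIABILITY[reliability]

# Matches the Network Manager row in the table: 2,000 SLOC at VH -> ~24 PM
print(round(glue_code_effort(2000, "VH")))
```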
Future Work
Cost Model Requests
“You need to have one integrated model – not COCOMO + COCOTS”
“We need one model – not two”
“Need to know how to cost the entire life cycle”
“COTS/GOTS is high risk because we are dependent on someone else... There needs to be a process to help people evaluate risk.”
“Our biggest driver is testing – not being captured now by COCOTS”
“Does model account for effort required to evaluate vendor patches/releases?”
Contact Information
Betsy Clark
Brad Clark
(703) 754-0115