Software Measurement Activities
Software Measurement Framework
SEG3202, N. El Kadri
Jan 11, 2016
Software Measurement Activities

• Cost and effort estimation models and measures
• Productivity models and measures
• Data collection
• Quality models and measures
• Reliability models
• Performance evaluation and models
• Structural and complexity metrics
• Capability maturity assessment
• Management by metrics
• Evaluation of methods and tools
Cost and effort estimation

– Managers must plan projects by predicting the necessary cost and effort and assigning resources appropriately.
– Doing this accurately has become one of the 'holy grail' searches of software engineering.
– Numerous measurement-based models for software cost and effort estimation have been proposed and used.
– Examples: Boehm's COCOMO model, Putnam's SLIM model, and Albrecht's function points model.
Simple COCOMO Model: Effort Prediction

Effort = a(size)^b

where Effort is measured in person-months, size is the predicted size (typically in KLOC), and a, b are constants depending on the type of system:
– "organic": a = 2.4, b = 1.05
– "semi-detached": a = 3.0, b = 1.12
– "embedded": a = 3.6, b = 1.2
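Under the basic COCOMO assumptions above (size in KLOC, constants from the slide), the prediction can be sketched as:

```python
# Basic COCOMO effort prediction: Effort = a * (size)^b, with size in
# KLOC and the (a, b) constants quoted on the slide.

COCOMO_CONSTANTS = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(size_kloc: float, mode: str) -> float:
    """Predicted effort in person-months for a project of the given mode."""
    a, b = COCOMO_CONSTANTS[mode]
    return a * size_kloc ** b

# Example: the same 32-KLOC system under each mode assumption.
for mode in COCOMO_CONSTANTS:
    print(f"{mode}: {cocomo_effort(32, mode):.1f} person-months")
```

For a fixed size, the predicted effort grows from organic to embedded systems, reflecting the larger multipliers and exponents.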
Albrecht's Function Points

Count the number of:
– External inputs
– External outputs
– External inquiries
– External files
– Internal files

giving each a "weighting factor".

The Unadjusted Function Count (UFC) is the sum of all these weighted scores.

To get the Adjusted Function Point count (FP), multiply by a Technical Complexity Factor (TCF):

FP = UFC * TCF

(Figure: data-flow diagram of a spelling checker, with the User and a Dictionary as external entities.)
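The computation is mechanical once the counts and weights are fixed. The sketch below uses Albrecht's standard "average" weighting factors and invented spelling-checker counts; a real count assigns a simple/average/complex weight to each individual item:

```python
# FP = UFC * TCF. The weights are Albrecht's "average" weighting
# factors; the counts are invented for illustration.

AVERAGE_WEIGHTS = {
    "external_inputs":    4,
    "external_outputs":   5,
    "external_inquiries": 4,
    "external_files":     7,   # external interface files
    "internal_files":     10,  # internal logical files
}

def unadjusted_function_count(counts: dict) -> int:
    """UFC: the sum of each count multiplied by its weighting factor."""
    return sum(AVERAGE_WEIGHTS[kind] * n for kind, n in counts.items())

def function_points(counts: dict, tcf: float) -> float:
    """FP = UFC * TCF, where TCF ranges over [0.65, 1.35]."""
    return unadjusted_function_count(counts) * tcf

counts = {"external_inputs": 2, "external_outputs": 3,
          "external_inquiries": 2, "external_files": 1,
          "internal_files": 1}
print(unadjusted_function_count(counts))    # 48
print(function_points(counts, tcf=1.05))
```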
Productivity models and measures

• Traditional model: simply divides size (LOC) by effort (person-months).
• Productivity model as a decomposition into measurable attributes:
  – This model is a significantly more comprehensive view of productivity than the traditional one.
Data Collection

• Effective use of measurement depends on careful data collection.
• Ensure that measures are defined unambiguously, that collection is consistent and complete, and that data integrity is not at risk.
• Requires carefully planned data collection, as well as thorough analysis and reporting of the results.
• Example: failure data collection
  1) Time of failure
  2) Time interval between failures
  3) Cumulative failures up to a given time
  4) Failures experienced in a time interval
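All four kinds of failure data listed above can be derived from a single log of failure times. A sketch, with invented times measured in hours since the start of testing:

```python
# The four kinds of failure data can all be derived from one log of
# failure times; the times below are made up for illustration.
import bisect

failure_times = [2.5, 8.0, 9.5, 20.0, 31.5, 33.0]   # 1) times of failure

# 2) intervals between successive failures (first interval from t = 0)
interfailure = [b - a for a, b in zip([0.0] + failure_times, failure_times)]

def cumulative_failures(t: float) -> int:
    """3) Number of failures observed up to and including time t."""
    return bisect.bisect_right(failure_times, t)

def failures_in_interval(t1: float, t2: float) -> int:
    """4) Failures experienced in the interval (t1, t2]."""
    return cumulative_failures(t2) - cumulative_failures(t1)

print(interfailure)                       # [2.5, 5.5, 1.5, 10.5, 11.5, 1.5]
print(cumulative_failures(10.0))          # 3
print(failures_in_interval(10.0, 35.0))   # 3
```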
Quality Models

• Models of quality for various views of software quality are constructed in a tree-like fashion.
• The tree describes the pertinent relationships between factors and their dependent criteria, so we can measure the factors in terms of the dependent criteria measures.
  – The upper branches hold important high-level quality factors of software products, such as reliability and usability, that we would like to quantify.
  – Each quality factor is composed of lower-level criteria, such as modularity and data commonality.
  – The criteria are easier to understand and measure than the factors; thus, actual measures (metrics) are proposed for the criteria.
ISO 9126 Quality Model

(Figure: the ISO 9126 hierarchy of Factors, Criteria, and Metrics; see ISO 9126-2 and ISO 9126-3 for the metrics.)
Reliability Models

• Most quality models include reliability as one of their component factors.
• Software reliability modeling is applicable during the implementation phase of software QA.
• It is based on observing and recording information about software failures during test or operation.
Reliability Models

• Plot the change of failure intensity against time.
• The most famous reliability models are the basic exponential model and the logarithmic Poisson model:
  – the basic exponential model assumes finite failures in infinite time;
  – the logarithmic Poisson model assumes infinite failures.
• Automated tools such as CASRE are available.
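As a sketch of the contrast between the two models, the mean-value functions below follow Musa's formulations (expected cumulative failures μ(t) by time t); the parameter values are invented:

```python
# Mean-value functions of the two reliability models, in Musa's
# formulation. Parameter values are made up for illustration.
import math

lam0 = 10.0   # initial failure intensity (failures per unit time)
nu0 = 100.0   # total failures expected in infinite time (basic model)
theta = 0.05  # failure-intensity decay parameter (logarithmic Poisson)

def mu_basic(t: float) -> float:
    """Basic exponential model: tends to the finite limit nu0."""
    return nu0 * (1.0 - math.exp(-lam0 * t / nu0))

def mu_log_poisson(t: float) -> float:
    """Logarithmic Poisson model: grows without bound as t increases."""
    return math.log(lam0 * theta * t + 1.0) / theta

for t in (10, 100, 1000):
    print(t, round(mu_basic(t), 1), round(mu_log_poisson(t), 1))
```

The basic model flattens out near nu0 (finite failures), while the logarithmic Poisson curve keeps rising (infinite failures), which is exactly the distinction the slide draws.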
Performance Evaluation and Models

• A performance model includes externally observable system performance characteristics, such as response times and completion rates.
• Performance modeling is part of the implementation and maintenance phases of software QA.
• Performance specialists also investigate:
  – the efficiency of algorithms as embodied in computational and algorithmic complexity [Harel 1992];
  – the inherent complexity of problems, measured in terms of the efficiency of an optimal solution.
Structural and complexity metrics

• We measure structural attributes of representations of the software that are available before implementation:
  – Control-flow structure
  – Data-flow structure
  – Data structure
  – Information-flow attributes
• Complexity metrics (1979~)
  – Halstead's "Software Science" metrics
  – McCabe's "Cyclomatic Complexity" metric (McCabe 1989): the number of linearly independent paths through a program
  – Influenced by:
    • the growing acceptance of structured programming
    • notions of cognitive complexity
McCabe's Cyclomatic Complexity

If G is the control flowgraph of program P, and G has e edges and n nodes, then

v(P) = e - n + 2

v(P) is the number of linearly independent paths in G.

(Figure: example flowgraph with e = 16, n = 13, giving v(P) = 5.)

More simply, if d is the number of decision nodes in G, then

v(P) = d + 1

McCabe proposed v(P) < 10 for each module P.
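For a flowgraph stored as an adjacency list, v(P) is a two-line computation. The graph below is a hypothetical flowgraph for an if/else followed by a while loop (d = 2 decision nodes):

```python
# Cyclomatic complexity v(P) = e - n + 2 from a flowgraph given as an
# adjacency list. The example graph models an if/else followed by a
# while loop, so it has d = 2 decision nodes.

def cyclomatic_complexity(graph: dict) -> int:
    nodes = len(graph)
    edges = sum(len(succs) for succs in graph.values())
    return edges - nodes + 2    # v(P) = e - n + 2

flowgraph = {
    "entry": ["if"],
    "if":    ["then", "else"],  # decision node 1
    "then":  ["while"],
    "else":  ["while"],
    "while": ["body", "exit"],  # decision node 2
    "body":  ["while"],
    "exit":  [],
}
print(cyclomatic_complexity(flowgraph))  # 3, matching d + 1 with d = 2
```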
Management by metrics

• Estimate project elements such as cost, schedules, and staffing profiles.
• Track project results against planning estimates.
• Validate the organizational models as the basis for improving future estimates.
Measurement for Guiding Management: Example
• Assume that an organization’s goal is to decrease the error rate in delivered software while maintaining (or possibly improving) the level of productivity;
• further assume that the organization has decided to change the process by introducing the Cleanroom method.
• NASA's Software Engineering Laboratory (SEL) assessed the impact of introducing the Cleanroom method.
• The results of the experiment appear to provide preliminary evidence of the expected improvement in reliability following introduction of the Cleanroom method and may also indicate an improvement in productivity.
Evaluation of Methods and Tools

• Efficiency of methods (1991~)
• Efficiency and reliability of tools
• Certification testing of acquired tools and components
Capability Maturity Assessment
• US Software Engineering Institute (SEI) model (1989): grading on a five-level scale.
• ISO 9001: Quality systems: models for quality assurance in design/development, production, installation and servicing (1991)
• ISO 9000-3: Guidelines for application of ISO 9001 to the development, supply and maintenance of software (1991)
Capability Maturity Model (CMM)
1. Initial The software process is characterized as ad hoc, and occasionally even chaotic. Few processes are defined, and success depends on individual effort.
2. Repeatable Basic project management processes are established to track cost, schedule, and functionality. The necessary process discipline is in place to repeat earlier successes on projects with similar applications.
3. Defined The software process for both management and engineering activities is documented, standardized, and integrated into a standard software process for the organization. All projects use an approved, tailored version of the organization's standard software process for developing and maintaining software.
4. Managed Detailed measures of the software process and product quality are collected. Both the software process and products are quantitatively understood and controlled.
5. Optimizing Continuous process improvement is enabled by quantitative feedback from the process and from piloting innovative ideas and technologies.
(Figure: the CMM staircase, in which each level builds on the last: disciplined process, then standard/consistent process, then predictable process, then continuously improving process.)
Software Measurement Program
• Measurement is the mechanism to provide feedback on software quality.
• A measurement program without a clear purpose will result in frustration, waste, annoyance, and confusion.
• To be successful, a measurement program must be viewed as one tool in the quest for the improved engineering of software.
SE Standards

• ISO/IEC 9126 Software product evaluation: Quality characteristics and guidelines for their use
• ISO/IEC 15939:2002 Software Measurement Process
• ISO 9000 Standards are used to regulate internal quality and to assure quality of suppliers. Measurement is part of ISO 9000
• IEEE 1061: Software Quality Metrics Methodology
• IEEE 1045: Software Productivity Metrics
Clarification: Metrics vs. Measures vs. Measurements

• Metrics are commonly accepted scales that define measurable attributes of entities, their units, and their scopes.
• A measure is a relation between an attribute and a measurement scale.
• In the literature, "measurement", "measure", and "metric" are often used as synonyms.
Rigorous Measurement Framework

• Measurement = data collection + context
• Data collection requires knowing:
  – why you are collecting the data
  – how you plan to use the data
  – the purpose or destination of the collected data (e.g., improving the quality of your software from some perspective)
• Trade-off between costs and benefits
How to Build a Valid Measurement Context?

• Points of view on software development:
  – Strategic: long-term performance of the organization
  – Tactical: short-term performance of an individual process
  – Technical: details of products and processes that influence the development processes and products
• Classes of software development objects:
  – Products
  – Processes
  – Resources

Measurement Context = selected points of view + selected object(s)
Views of Measurement: Strategic View

• The organization's goals are stated in measurable terms.
• Measures of products, projects, and resources are summarized as means or medians, with some indication of variability:
  – Unit cost (labor hours / size)
  – Defect rate (delivered defects / size)
  – Cycle time (project days / size)
• The strategic view tracks trends in these summary statistics.
• Strategic data is used to determine whether, and how well, those goals are being met.
• Primary user of strategic measurement data: the strategic manager.
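The summary statistics above (means or medians plus an indication of variability) are straightforward to compute per measure across projects; the project records below are invented:

```python
# Per-project strategic summaries: mean, median, and standard deviation
# of unit cost, defect rate, and cycle time. The records are made up.
import statistics

projects = [
    # (labor hours, size in KLOC, delivered defects, project days)
    (4000, 10, 12, 120),
    (9000, 25, 20, 200),
    (2500, 5, 9, 90),
]

def summarize(values):
    return {"mean": statistics.mean(values),
            "median": statistics.median(values),
            "stdev": statistics.stdev(values)}

unit_cost   = [h / kloc for h, kloc, _, _ in projects]        # labor hours / size
defect_rate = [d / kloc for _, kloc, d, _ in projects]        # delivered defects / size
cycle_time  = [days / kloc for _, kloc, _, days in projects]  # project days / size

for name, series in [("unit cost", unit_cost),
                     ("defect rate", defect_rate),
                     ("cycle time", cycle_time)]:
    print(name, summarize(series))
```

Tracking these summaries release over release is what gives the strategic manager the trend data the slide describes.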
Views of Measurement: Tactical View

• Concerned with the performance of an individual project.
• Measurement data is used to:
  – compare actual results to target (estimated or planned) results; any variances are noted and investigated
    • e.g., defect discovery rate during inspection or testing activities
  – predict values of certain indirect project measures
    • e.g., using project size to predict cost and schedule
• Primary user of tactical measurement data: the project manager.
Views of Measurement: Tactical View

(Figure: the uses a project manager makes of tactical measurement data.)
Views of Measurement: Technical View

• Physically, all measurement takes place at the technical level.
• All measures used at the strategic and tactical levels are built from fundamental technical measures.
  – Strategic and tactical users of measurement data depend on technical users to supply the data.
• Technical measures focus on a set of internal attributes of a single product or process and are highly dependent on the technology in the product.
• Primary user of technical measurement data: the software engineer.
Views of Measurement: Technical View

(Figure: examples of technical measurement data.)
Objects of Measurement

• The first obligation of a measurement effort is to identify the objects to be measured:
  – processes, products, resources
• We measure attributes of those objects:
  – Internal: measured purely in terms of the process, project, product, or resource itself
  – External: can be measured only with respect to how the process, project, product, or resource relates to its environment
Objects of Measurement: Process

• Processes are measured by comparing instance measurements to each other over time:
  – Direct internal process measures
Objects of Measurement: Process

  – Indirect internal process measures
  – External process measures:
    • Productivity: the units of product produced per unit of input
    • Stability of the process
    • Variation: the extent to which instances of the process differ from each other
Objects of Measurement: Resources

• Resources are the objects that serve as input to the processes:
  – people, tools, materials, methods, time, money, training
• Internal attribute measures:
  – cost, capability, constraints on use
• External attribute measures:
  – performance, productivity