SENG 521: Software Reliability & Software Quality
Chapter 14: SRE Deployment

Department of Electrical & Computer Engineering, University of Calgary
B.H. Far ([email protected])
http://www.enel.ucalgary.ca/People/far/Lectures/SENG521

Contents
- Quality in the software development process
- Software Quality System (SQS), Software Quality Assurance (SQA), and Software Reliability Engineering (SRE)
- Quality, test, and data plans
- Roles and responsibilities
- Sample quality and test plan
- Best practices of SRE

Quality in the Software Development Process
Q. How can quality concerns be included in the process?
- Requirement & Architecture: architectural analysis of quality attributes (methods: ATAM, CBAM, etc.)
- Design & Implementation: Software Reliability Engineering (SRE) and Software Quality Assurance (SQA)
- Test & Release and Maintenance: software quality assessment (methods: RAM, etc.)

Section 1: Software Quality System (SQS) and Software Quality Assurance (SQA) programs

What is Reliable Software?
Reliable software products run correctly and consistently, have fewer remaining defects, handle abnormal situations properly, and need less installation effort. The remaining defects should not affect the normal behaviour and use of the software; they will not do any destructive damage to the system and its hardware or software environment; and they are rarely evident to the users.

Developing reliable software requires:
- Establishing a Software Quality System (SQS) and Software Quality Assurance (SQA) programs
- Establishing a Software Reliability Engineering (SRE) process

Software Quality System (SQS) Goals
- Building quality into the software from the beginning
- Keeping and tracking quality in the software throughout the software life cycle

Reference: John W. Horch, Practical Guide to Software Quality Management
SQS Concerns
Software quality management is the discipline that maximizes the probability that a software system will conform to its requirements, as those requirements are …
Software Quality Assurance (SQA)
Software Quality Assurance (SQA) is a planned and systematic approach to ensuring that both the software process and the software product conform to the established standards, processes, and procedures.

The goals of SQA are to improve software quality by monitoring both the software and the development process, ensuring full compliance with the established standards and procedures.

Steps to establish an SQA program:
- Get top management's agreement on its goals and support.
- Identify SQA issues, write the SQA plan, establish standards and SQA functions, implement the SQA plan, and evaluate the SQA program.
SRE: Management Concerns
- Perception and specification of a customer's real needs.
- Translation of the specification into a conforming design.
- Maintaining conformity throughout the development processes.
- Product and sub-product demonstrations that provide convincing indications that the product and project meet requirements.
- Ensuring that the tests and demonstrations are designed and controlled so as to be both achievable and manageable.
Roles & Responsibilities /1
Test Coordinator (Manager):
The test coordinator is expected to ensure that every specific statement of intent in the product requirement, specification, and design is matched by a well-designed (cost-effective, convincing, self-reporting, etc.) test, measurement, or demonstration.
Data Coordinator (Manager):
The data coordinator ensures that the physical and administrative structures for data collection exist and are documented in the quality plan, receives and validates the data during development, and, through analysis and communication, ensures that the meaning of the information is known to all, in time, for effective application.
Roles & Responsibilities /2
Customer or User:
- Actively encouraging the making and following of detailed quality plans for the products and projects.
- Requiring access to previous quality plans and their recorded outcomes before accepting the figures and methods quoted in the new plan.
- Enquiring into the sources and validity of synthetics and formulae used in estimating and planning.
- Appointing appropriate personnel to provide authoritative responses to queries from the developer and a managed interface to the developer.
- Receiving and reviewing reports of significant audits, reviews, tests, and demonstrations.
- Making any queries and objections in detail and in writing, at the earliest possible time.
Quality Plans /1
The most promising mechanisms for gaining and improving predictability and controllability of software qualities are the quality plan and its subsidiary documents, including test plans and data (measurement) plans. The creation of the quality plan can be instrumental in raising project effectiveness and in preventing expensive and time-consuming misunderstandings during the project, and at release/acceptance time.
[Figure: the quality plan with its subsidiary test plan and data plan]
Quality Plan /2
The quality plan and quality record provide guidelines for carrying out and controlling the following:
- Requirement and specification management
- Development processes
- Documentation management
- Design evaluation
- Product testing (SRE related)
- Data collection and interpretation (SRE related)
Quality Plan /3
Quality planning should be done at the very earliest point in a project, preferably before a final decision is made on feasibility, and before a software development contract is signed.

The quality plan should be devised and agreed between all the concerned parties: senior management, software development management (both administrative and technical), the software development team, customers, and any involved general support functions such as resource management and company-wide quality management.
Data (Measurement) Plan
The data (measurement) plan prescribes:
- What should be measured and recorded during a project;
- How it should be checked and collated;
- How it should be interpreted and applied.

Data may be collected in several ways, within the specific project and beyond it. Ideally, there should be a higher level of data collection and application into which project data is fed.
Test Plan /1
The purpose of a test plan is to ensure that all testing activities (including those used for controlling the process of development and for indicating the progress of the project) are expected, are manageable, and are managed.

Test plans are created as a subsection of, or as an associated document to, the quality plan. They become progressively more detailed and expanded during a project. Each test plan defines its own objectives and scope, and the means and methods by which the objectives are expected to be met.
Test Plan /2
For the software product, the test plan is usually restricted by the scope of the test: certification, feature, and load test. The plan predicts the resources and means required to reach the required levels of assurance about the end products, and the scheduling of all testing, measuring, and demonstration activities.

Tests, measurements, and demonstrations are used to establish that the software product satisfies the requirements document, and that each process during development is carried out correctly and results in acceptable outcomes.
Effective Coordination
Coordination among the quality plan, test plans, and data plans is necessary. Effective coordination can only be introduced and practiced if the environment and supporting structures exist. To make the coordination work, all those involved must be prepared to question and evaluate every aspect of what they are doing, and must be ready both to give and to accept suggestions and information outside their normal field of interest and authority.
Effective Coordination /2
- Serial coordination: the application of information from one phase or process in a later and different phase or process.
- Parallel coordination: the application of information from one instance of an activity or process to other instances of the same process, whether in the same project or in others in progress.
Coordination of Data Plans /1
Coordinating (or sharing) data plans between projects: a collection of data which covers more than one project and several different development routes provides opportunities to
- Compare the means of production (and thus support rational choices between them), as well as allowing
- Selection of standard expectations for performance, which can be used in project planning and project control.
Coordination of Data Plans /2
Coordinating (or sharing) data between organizations:
- Provides a wider base for evaluation.
- Leads to a more general view of what is comprised in "good practice".
- Leads to a more general view of the connections between working methods and their results.
Coordination of Data Plans /3
Coordination of data plans improves the quantity and quality of data, supporting:
- Estimation and re-estimation of projects, in both administrative and technical terms;
- Management of the project, its products, processes, and resources;
- Selective re-use of methods and procedures, to reduce reinvention and to benefit from experience;
- Harmonization of goals and measurements across projects;
- Rationalization of the provision of support tools and services.
Coordination of Test Plans /1
- Uses in the management and planning of resources and environments.
- Role of test plans in ensuring the applicability and testability of the design and the code.
- Test plans used as a guide for those managing testing.
- Test plans used as an input to quality assurance and quality control processes.
- Use of test results to decide on an appropriate course of action following a testing activity.
Practice of SRE /1
The practice of SRE provides the software engineer or manager the means to predict, estimate, and measure the rate of failure occurrences in software. Using SRE in the context of software engineering, one can:
- Analyze, manage, and improve the reliability of software products.
- Balance customer needs for competitive price, timely delivery, and a reliable product.
- Determine when the software is good enough to release to customers, minimizing the risks of releasing software with serious problems.
- Avoid excessive time to market due to overtesting.
Practice of SRE /2Practice of SRE /2The practice of SRE may be summarized in six steps:The practice of SRE may be summarized in six steps:
1) Quantify product usage by specifying how frequently customers will use various features and how frequently various environmental conditions that influence processing will occurinfluence processing will occur.
2) Define quality quantitatively with the customers by defining failures and failure severities and by specifying the balance among the key quality objectives of reliability, delivery date, and cost.j y, y ,
3) Employ product usage data and quality objectives to guide design and implementation of the product and to manage resources to maximize productivity (i.e., customer satisfaction per unit cost).
4) Measure reliability of reused software and acquired software components as an acceptance requirement.
5) Track reliability and use this information to guide product release.6) Monitor reliability in field operation and use results to guide new feature
introduction, as well as product and process improvement.
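Step 2's quantitative quality definition is usually expressed as a failure intensity objective (FIO). As a minimal sketch of the conversion (assuming a constant failure rate, so that reliability over a mission time t is R(t) = exp(-λt); a real project may use a different model):

```python
import math

def fio_from_reliability(reliability: float, mission_hours: float) -> float:
    """Failure intensity objective (failures/hour) implied by a
    reliability objective over a mission time, assuming a constant
    failure rate: R(t) = exp(-lambda * t)."""
    return -math.log(reliability) / mission_hours

def reliability_from_fio(fio: float, mission_hours: float) -> float:
    """Inverse conversion: reliability over a mission time."""
    return math.exp(-fio * mission_hours)

# Example: a 0.99 reliability objective over a 10-hour mission.
fio = fio_from_reliability(0.99, 10.0)
print(f"{fio:.6f} failures/hour")                 # 0.001005 failures/hour
print(f"{reliability_from_fio(fio, 10.0):.2f}")   # 0.99
```

This makes the trade-off in step 2 concrete: tightening the reliability objective lowers the tolerable failure intensity, which in turn drives test time and delivery date.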
Design and implementation phase:
- Allocate reliability among components, acquired software, hardware, and other systems
- Engineer to meet reliability objectives
- Focus resources based on the operational profile
- Measure the reliability of acquired software, hardware, and other systems, i.e., certification test

Post delivery and maintenance:
- Project post-release staff needs
- Monitor field reliability vs. objectives
- Track customer satisfaction with reliability
- Time new feature introduction by monitoring reliability
- Guide product and process improvement with reliability measures
Feasibility Phase
Activity 1: Define and classify failures
- Define failure from the customer's perspective
- Group identified failures into severity classes from the customer's perspective
- Usually 3-4 classes are sufficient

Activity 2: Identify customer reliability needs
- What is the level of reliability that the customer needs?
- Who are the rival companies, what are the rival products, and what is their reliability?

Activity 3: Determine the operational profile
- Based on the tasks performed and the environmental factors
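Activity 3's operational profile can be put to work directly in test selection. A minimal sketch, with a hypothetical profile (the operation names and probabilities below are invented for illustration):

```python
import random

# Hypothetical operational profile: operations and their occurrence
# probabilities (they must sum to 1), e.g. for a billing system.
operational_profile = {
    "process_payment": 0.55,
    "generate_invoice": 0.30,
    "update_account": 0.10,
    "export_report": 0.05,
}

def select_test_operations(profile: dict, n: int, seed: int = 42) -> list:
    """Draw n test operations so that test usage mirrors field usage."""
    rng = random.Random(seed)
    ops = list(profile)
    weights = [profile[op] for op in ops]
    return rng.choices(ops, weights=weights, k=n)

tests = select_test_operations(operational_profile, 1000)
# Frequently used operations dominate the selected tests.
print(tests.count("process_payment") > tests.count("export_report"))  # True
```

Sampling tests in proportion to field usage is what lets reliability measured in test predict reliability the customer will see.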
Design Phase
Activity 6: Allocate reliability among acquired software, components, hardware, and other systems
- Determine which systems and components are involved and how they affect the overall system reliability

Activity 7: Engineer to meet reliability objectives
- Plan using fault tolerance, fault removal, and fault avoidance

Activity 8: Focus resources based on the operational profile
- The operational profile guides the designer to focus on the features that are supposed to be more critical
- Develop more critical functions first and in more detail
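Activity 6's allocation can be sketched numerically. Assuming components operating in series, so that component failure intensities add up to the system objective (the component names and weights below are hypothetical):

```python
def allocate_failure_intensity(system_fio: float, weights: dict) -> dict:
    """Split a system failure-intensity objective among components in
    proportion to the given weights (e.g. expected utilization or
    relative development difficulty).  For components in series,
    failure intensities add, so the budgets must sum to the objective."""
    total = sum(weights.values())
    return {name: system_fio * w / total for name, w in weights.items()}

# Hypothetical system: a 0.002 failures/hour budget over three parts.
budget = allocate_failure_intensity(0.002, {
    "acquired_db_layer": 1.0,
    "new_business_logic": 3.0,   # newly written code gets a looser budget
    "ui": 1.0,
})
print(budget["new_business_logic"])  # 0.0012
assert abs(sum(budget.values()) - 0.002) < 1e-12
```

The weighting scheme is a design choice: giving new code a larger share of the failure-intensity budget reflects that acquired or reused components are usually better understood.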
Implementation Phase
Activity 9: Measure the reliability of acquired software, hardware, and other systems
- Certification test using the reliability demonstration chart

Activity 10: Manage fault introduction and propagation
- Practicing a development methodology; constructing a modular system; employing reuse; conducting inspection and review; controlling change
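Activity 9's certification test is normally run against a reliability demonstration chart. The chart is the graphical form of a sequential probability ratio test; below is a sketch of that decision rule, assuming a discrimination ratio gamma and supplier/consumer risks alpha and beta (the parameter values are illustrative defaults, not the course's):

```python
import math

def demo_chart_decision(n_failures, tau, gamma=2.0, alpha=0.1, beta=0.1):
    """Sequential decision behind a reliability demonstration chart.

    tau is normalized test time: elapsed test time multiplied by the
    failure intensity objective.  gamma is the discrimination ratio,
    alpha the supplier risk, beta the consumer risk.
    Returns 'accept', 'reject', or 'continue'."""
    reject_bound = math.log((1 - beta) / alpha)   # Wald SPRT upper bound
    accept_bound = math.log(beta / (1 - alpha))   # Wald SPRT lower bound
    score = n_failures * math.log(gamma) - (gamma - 1) * tau
    if score >= reject_bound:
        return "reject"    # intensity likely gamma times the objective
    if score <= accept_bound:
        return "accept"    # objective demonstrated at the chosen risks
    return "continue"

print(demo_chart_decision(0, 2.0))  # continue: not yet decisive
print(demo_chart_decision(0, 2.5))  # accept: long failure-free run
print(demo_chart_decision(4, 0.5))  # reject: failures came too fast
```

Plotting failure number against normalized time reproduces the familiar accept/continue/reject regions of the chart.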
System Test Phase
Activity 11: Determine the operational profile used for testing
- Decide upon critical operations
- Decide upon the need for a multiplicity of operational profiles

Activity 12: Conduct reliability growth testing

Activity 13: Track testing progress and certify that reliability objectives are met
- Conduct feature tests, regression tests, and performance tests
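Tracking in Activity 13 needs a running estimate of the current failure intensity. A back-of-envelope sketch (a real project would fit a reliability growth model such as Musa's basic model instead; the failure times and objective below are hypothetical):

```python
def current_failure_intensity(failure_times, window=5):
    """Rough estimate of the present failure intensity (failures/hour)
    from cumulative failure times: the last `window` failures divided
    by the time they span.  This is only a trend check, not a fitted
    growth model."""
    n = min(window, len(failure_times))
    start = failure_times[-n - 1] if len(failure_times) > n else 0.0
    return n / (failure_times[-1] - start)

# Hypothetical cumulative failure times in test hours; the widening
# gaps between failures indicate reliability growth.
times = [2, 5, 9, 15, 24, 40, 70, 120, 200, 330]
lam = current_failure_intensity(times)
print(round(lam, 4))   # 5 failures over the last 306 hours -> 0.0163
print(lam <= 0.02)     # meets a hypothetical 0.02 failures/hour objective
```

Repeating this estimate as testing proceeds shows whether failure intensity is trending down toward the objective, which is exactly the information release certification needs.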
Field Trial Phase
Activity 14: Project additional testing needed
- Check the accuracy of the test: time and coverage
- Plan for changes in test strategies and methods

Activity 15: Certify that reliability objectives and release criteria are met
- Check the accuracy of data collection
- Check whether the test operational profile reflects the field operational profile
- Check that the customer's definition of failure matches the one used during testing
Post Delivery Phase /1
Activity 16: Project post-release staff needs
- Customer's staff for system recovery; supplier's staff to handle customer-reported failures and to remove faults

Activity 17: Monitor field reliability vs. objectives
- Collect post-release failure data systematically

Activity 18: Track customer satisfaction with reliability
- Survey product features with a sample customer set
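Activity 17's field monitoring reduces to comparing the measured field failure intensity against the objective, period by period. A minimal sketch with hypothetical weekly data:

```python
def weekly_field_intensity(failures_per_week, usage_hours_per_week):
    """Field failure intensity (failures/hour) for each week of
    operation, from systematically collected post-release data."""
    return [f / h for f, h in zip(failures_per_week, usage_hours_per_week)]

# Hypothetical post-release data collected under Activity 17.
failures = [9, 6, 4, 3]            # customer-reported failures per week
hours    = [500, 520, 510, 515]    # fielded usage hours per week
intensity = weekly_field_intensity(failures, hours)

objective = 0.01                   # failures/hour, set during feasibility
print([round(x, 4) for x in intensity])     # [0.018, 0.0115, 0.0078, 0.0058]
print([x <= objective for x in intensity])  # [False, False, True, True]
```

Weeks where the objective is missed are the trigger for the corrective work described in Activities 19 and 20.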
Post Delivery Phase /2
Activity 19: Time new feature introduction by monitoring reliability
- New features bring new defects. Add new features desired by the customers if they can be managed without sacrificing the reliability of the whole system

Activity 20: Guide product and process improvement with reliability measures
- Root-cause analysis for the faults
- Ask why the fault was not detected earlier in the development phase and what should be done to reduce the probability of introducing similar faults
Feasibility Phase: Benefits
Activities 1 and 2: Define and classify failures; identify customer reliability needs
- Benefits: Release software at a time that meets customer reliability needs but is as early and inexpensive as possible

Activity 3: Determine operational profiles
- Benefits: Speed up time to market by saving test time, reduce test cost, and have a quantitative measure for reliability
Design Phase: Benefits
Activity 6: Allocate reliability among acquired software, components, hardware, and other systems
- Benefits: Reduce development time and cost by striking a better balance among components

Activity 7: Engineer to meet reliability objectives
- Benefits: Reduce development time and cost with better design

Activity 8: Focus resources based on the operational profile
- Benefits: Speed up time to market by guiding development priorities
Implementation Phase: Benefits
Activity 9: Measure the reliability of acquired software, hardware, and other systems
- Benefits: Reduce risks to reliability, schedule, and cost from unknown software and systems

Activity 10: Manage fault introduction and propagation
- Benefits: Maximize the cost-effectiveness of reliability
System Test Phase: Benefits
Activity 11: Determine the operational profile used for testing
- Benefits: Reduce the chance of critical operations going unattended; speed up time to market by saving test time; reduce test cost

Activity 12: Conduct reliability growth testing
- Benefits: Determine how the product reliability is improving

Activity 13: Conduct reliability growth testing, track testing progress
- Benefits: Know exactly what reliability the customer would experience at different points in time if the software is released at those points
Field Trial Phase: Benefits
Activity 14: Project additional testing needed
- Benefits: Planning tests ahead of time, when the reliability measure is not satisfactory, will reduce the time for integration and release

Activity 15: Certify that reliability objectives are met
- Benefits: Release software at a time that meets customer reliability needs but is as early and inexpensive as possible; verify that the customer reliability needs are actually met

Activities 17-18: Monitor field reliability vs. objectives; track customer satisfaction with reliability
- Benefits: Maximize the likelihood of pleasing the customer with the delivered reliability
Post Delivery Phase: Benefits
Activity 19: Time new feature introduction by monitoring reliability
- Benefits: Ensure that the software continues to meet customer reliability needs in the field

Activity 20: Guide product and process improvement with reliability measures
- Benefits: Maximize the cost-effectiveness of product and process improvement
Example (cont'd)
Assume that the current date is Dec 13th and that currently one tester is assigned to this project. We want to bring the test execution back on plan in the next 10 working days. How many testers do we need to hire for this project, assuming that the plan for the next 10 days is the execution of 12 tests per day?

In 10 working days, the team needs to complete 10 × 12 = 120 tests to match the planned rate. Test execution is currently 156 − 134 = 22 tests behind the goal. This means 120 + 22 = 142 tests to accomplish in 10 days. Using the average rate of about 4 tests per day calculated above, 3 testers would only complete 120 tests in that time (3 testers × 4 tests/day × 10 days = 120), which is less than what is needed. However, 4 testers can complete 160 tests (4 testers × 4 tests/day × 10 days), which is a bit above the need. Therefore 4 − 1 = 3 additional testers need to be hired.
Existing vs. New Projects
There is no essential difference between new and existing projects in applying SRE for the first time. However, determining the failure intensity objective and the operational profile is easier for existing projects.

Most of the SRE activities will require only small updates after they have been completed once; e.g., the operational profile should only be updated for the new operations added (remember the interaction factor). After SRE has been applied to one release, less effort is needed for succeeding releases; e.g., new test cases need only be added for the new operations.
Short-Cycle Projects
Small projects or releases, or those with short development cycles, may require a modified set of SRE activities to keep costs low or activity durations short. Reductions in cost and time can be obtained by limiting the number of elements in the operational profile and by accepting less precision.
Example: setting one operational mode and performing a certification test rather than a reliability growth test.
Conclusions
- Practical implementation of an effective SRE program is a non-trivial task.
- Mechanisms for collection and analysis of data on software product and process quality must be in place.
- Fault identification and elimination techniques must be in place.
- Other organizational abilities, such as the use of reviews and inspections, reliability-based testing, and software process improvement, are also necessary for effective SRE.
- A quality-oriented mindset and training are necessary!