Software processes are not repetitive in the same sense as manufacturing processes: they are driven by changing “constraints of operation”.
In a manufacturing process:
- the same controlled process is applied repetitively to raw materials, yielding a finished product,
- controlled variation of input parameters to optimize the yield or quality of the output is possible,
- major change is uncommon; change occurs only when new machinery, new material-processing requirements, or new products are required.
For software process improvements to be effective, it is important to identify the “pain-points” and isolate the “real root cause” (the controllable factors), which is the difficult part.
Some Perspectives …
Why might a blind application of Six Sigma be wrong? (2 of 5)
Noteworthy distinctions … (contd.)
A software process based on the CMMI is prone to numerous in-process and out-of-process variations.
Some Perspectives …
Why might a blind application of Six Sigma be wrong? (3 of 5)
New Definition: Common cause variations have a source of variation assignable to something within the process's control, and whose impact can be isolated, corrected, and possibly eliminated with effective “process management” practices.
New Definition: Special cause variations have a source of variation assignable to something outside the process's control, and whose impact can be resolved or minimized with effective “project management” practices.
- Lack of domain understanding
- Internal churn resulting from people leaving midway
- Lack of budgets and resources
- Lack of appropriate tools
- Changing customer requirements
- Lack of access to the validation environment
- Interoperability issues with a legacy application
- Escape defects from the legacy application
It is widely accepted that “process quantification and process performance management” in CMMI occurs only at levels 4 and 5.
Can this focus shift downwards to level 2? Note that process measurement and analysis is addressed at level 2 itself. However, CMMI does not provide the required rigor of practices at level 2 for analysis. Hence, the 4-step process that:
- defines the required analysis practices,
- serves as a strong foundation for high-maturity practice implementation,
- improves the rigor of analyzing data,
- thereby accelerating the institutionalization of high process maturity.
Some Perspectives …
Why might a blind application of Six Sigma be wrong? (5 of 5)
What is the 4-step process?
1. Establish that the process is stable and capable
2. Understand the underlying process distribution
3. For a stable process, identify the regressions and the residuals
4. Generalize the regression into a model
practices of the CMMI constellations.
When business goals are defined quantitatively, Six Sigma alongside CMMI has proven useful in minimizing the cycle time to transition maturity levels. A systematic and consistent integrated approach of Six Sigma with CMMI can help rapidly accelerate maturity from a chaotic (level 1) mode to an optimizing (level 5) mode of operation in about a 3-year timeframe. Without Six Sigma, it typically takes anywhere between 12 and 18 months per maturity level.
Typical organizational contexts where such dramatic results were observed:
- small- to medium-sized organizations or business units with fewer than 300 individuals,
- annual staff attrition rates below 5%, with negligible impact from competency erosion,
- projects executed on mature domains/technologies with sufficient organizational memory of execution,
- good understanding of project requirements, with an ability to manage requirements changes to within 10% effort deviation in the requirements phase,
- every individual holds a mandatory white-belt certification within the first 6 months,
- upwards of 50% hold either a green-belt or a black-belt certification.
Essence of Six Sigma methodology in the CMMI context …
Uses the DMAIC approach (Define-Measure-Analyze-Improve-Control):
- by identifying business goals in quantitative terms,
- using the voice of the customer (VoC), which establishes the performance criteria, or the critical-to-quality (CTQ) requirements,
- adding value to the customer.
Also ensures: identification of critical business requirements that are in alignment with the organizational goals, by gathering the voice of business (VoB) items that are critical to process (CTP).
Essence of high-maturity focus or mindset in the CMMI context can be summarized as …
No conclusions can be made without objective proof. The proof is in the data. There is no guarantee that a process change contributes towards improvement unless it is verified and validated statistically.
1. Establish that the process is stable and capable
2. Understand the underlying process distribution
3. For a stable process, identify the regressions and the residuals
4. Generalize the regression into a model
1. Establish that the process is stable and capable
Key Point: Every stable and capable process can be made more stable and more capable.
Use control charts/time-series charts to analyze outliers. Example: use 5-Whys to arrive at a process cause.
Key Point: When using sample data, consider the mean value of a large number of observations from independent homogeneous projects; by the Central Limit Theorem, such means tend toward a normal distribution.
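As a sketch of the control-chart step, the limits of an individuals (I-MR) chart can be computed in a few lines. The observations below are hypothetical rolled-up values, one per project; sigma is estimated from the average moving range (MR-bar / 1.128, the d2 constant for subgroups of size 2), the usual convention for individuals charts:

```python
# Sketch: individuals (I-MR) control chart for step 1.
# Sigma is estimated from the average moving range (MR-bar / d2,
# d2 = 1.128 for subgroups of size 2). Data is invented.

def imr_limits(xs):
    mean = sum(xs) / len(xs)
    mr = [abs(b - a) for a, b in zip(xs, xs[1:])]   # moving ranges
    sigma = (sum(mr) / len(mr)) / 1.128             # d2 constant for n=2
    return mean - 3 * sigma, mean, mean + 3 * sigma

def out_of_control(xs):
    """Return (index, value) pairs outside the 3-sigma limits."""
    lcl, _, ucl = imr_limits(xs)
    return [(i, x) for i, x in enumerate(xs, 1) if x < lcl or x > ucl]

obs = [5.1, 4.8, 5.3, 5.0, 9.9, 5.2, 4.9, 5.1, 5.0, 4.7]
print(out_of_control(obs))  # [(5, 9.9)] -> investigate the cause with 5-Whys
```

The moving-range estimate of sigma is deliberately robust: a single extreme point would inflate the ordinary sample standard deviation so much that it could never fall outside its own 3-sigma band.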
2. Understand the underlying process distribution
Key Point: The Anderson-Darling test of normality is the most widely used test in statistical software (e.g., Minitab). A high p-value shows no statistically significant departure from normality. The null hypothesis is that the data is normal, so a p-value > 0.05 is indicative of a normal distribution.
Key Point: If the “statistical mean” of derived measures across a number of homogeneous projects is used, there is no need for the normality test.
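A minimal sketch of the Anderson-Darling A² statistic, assuming a normal distribution fitted from the sample mean and standard deviation (a statistics package such as Minitab would also convert A² into the p-value discussed above; the two samples here are invented):

```python
# Sketch: Anderson-Darling A^2 statistic against a fitted normal
# distribution. Small A^2 means the sample is close to normal;
# large A^2 signals departure from normality.
import math

def ad_statistic(xs):
    n = len(xs)
    xs = sorted(xs)
    m = sum(xs) / n
    s = (sum((x - m) ** 2 for x in xs) / (n - 1)) ** 0.5
    def cdf(x):  # CDF of the fitted normal distribution
        return 0.5 * (1 + math.erf((x - m) / (s * math.sqrt(2))))
    total = sum((2 * i - 1) * (math.log(cdf(xs[i - 1])) + math.log(1 - cdf(xs[n - i])))
                for i in range(1, n + 1))
    return -n - total / n

symmetric = [-2, -1, -0.5, 0, 0.5, 1, 2]   # roughly normal-looking
skewed = [1, 1, 1, 1, 1, 10]               # clearly non-normal
print(round(ad_statistic(symmetric), 2))   # small (≈ 0.1)
print(round(ad_statistic(skewed), 2))      # large (≈ 1.6)
```

In practice the statistic is compared against published critical values (or converted to a p-value) after a small-sample adjustment, which statistical packages perform automatically.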
Key Point: Every derived measure uses some combination of the basic metric data.
Key Point: A derived measure can itself be a combination of several phase-wise factors (sub-process control).
3. For a stable process, identify the regressions and the residuals (3 of 4)
Key Point: Regressions, or the factors, are picked based on process knowledge, cost, risks, and ease of measurement.
Key Point: Residuals are a measure of the error in a model: the difference between a fit (predicted value) and the actually observed value.
Key Point: Statistical models help to tie process inputs with the process output using statistically relevant data
Key Point: Models establish a structure using which, estimation and hypothesis testing are enabled
Key Point: A model can be deterministic or probabilistic. Deterministic: a simple mathematical equation, e.g., E = mc².
Probabilistic: uses the underlying probability distribution/density functions of the random variables that define the dependent variable.
Example: Z = A + P + F (for one sample of a project). What if there are 10 samples from 10 different projects?
For each project, Z = A + P + F may still be valid.
Taken together, Z = 0.199 + 0.658*A + 0.789*P + 0.345*F + e.
The regression model must be iteratively improved by fine-tuning it against the “actual” values. Using larger samples makes the regression equation better.
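The fitted equation above can be reproduced, in sketch form, by ordinary least squares via the normal equations (XᵀX)b = XᵀZ. The 10 project samples and the “true” coefficients below are synthetic, invented for illustration; the slide's own coefficients came from the author's project data:

```python
# Sketch: fit Z = b0 + b1*A + b2*P + b3*F by ordinary least squares
# using the normal equations. Data is synthetic and noise-free, so
# the regression recovers the generating coefficients exactly.

def solve(M, v):
    """Gaussian elimination with partial pivoting for M x = v."""
    n = len(M)
    A = [row[:] + [v[i]] for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (A[r][n] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

def ols(rows, z):
    X = [[1.0] + list(r) for r in rows]   # prepend intercept column
    n, k = len(X), len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    Xtz = [sum(X[i][a] * z[i] for i in range(n)) for a in range(k)]
    return solve(XtX, Xtz)

# 10 synthetic project samples: (A, P, F) and observed Z
samples = [(2, 3, 1), (4, 1, 2), (3, 3, 3), (5, 2, 1), (1, 4, 2),
           (2, 2, 2), (4, 4, 1), (3, 1, 3), (5, 3, 2), (1, 1, 1)]
true_b = (0.2, 0.66, 0.79, 0.35)          # hypothetical "process truth"
z = [true_b[0] + true_b[1]*a + true_b[2]*p + true_b[3]*f for a, p, f in samples]

b = ols(samples, z)
residuals = [zi - (b[0] + b[1]*a + b[2]*p + b[3]*f)
             for (a, p, f), zi in zip(samples, z)]
print([round(c, 3) for c in b])           # ≈ [0.2, 0.66, 0.79, 0.35]
print(max(abs(r) for r in residuals))     # essentially zero (noise-free data)
```

With real project data the residuals are non-zero, and examining them against the fitted values is exactly the iterative fine-tuning the slide describes.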
Which basic metric and derived measure should we consider in the 4-step process?
Related to this question are the following questions: Why do we need to understand this measure? What do we do with this understanding once we have gained it? What type of “process input” does this measure cater to? Is this a –
To answer these questions meaningfully, a thorough design and understanding of the process is required, because it is in the lifecycle phases that time is consumed and errors or defects are injected and detected.
Key point: Measurement process should therefore be an integral part of the development lifecycle
Key point: Data must be collected and analyzed in real-time using simple techniques for it to be useful
Key point: There is little meaning in using software data analysis as a typical post-mortem, after the fact.
Step 1: Establish that the process is stable and capable
Cost of poor quality (%): the ratio of the sum of internal and external failure-fixing effort (major and minor faults) to total project effort.
Each observation is rolled-up data (the mean) of different projects in execution, after establishing data homogeneity.
For outliers (point 4), determine the cause using a structured 5-Whys approach.
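The COPQ % definition above, plus the roll-up across homogeneous projects, amounts to a few lines of arithmetic. The effort figures (person-hours) are invented for illustration:

```python
# Sketch: Cost of Poor Quality (COPQ %) per the definition above,
# rolled up as the mean across homogeneous projects. Data is invented.

def copq_percent(internal_fix, external_fix, total_effort):
    """COPQ % = (internal + external failure-fixing effort) / total effort * 100."""
    return 100.0 * (internal_fix + external_fix) / total_effort

projects = [
    {"internal": 40, "external": 10, "total": 1000},
    {"internal": 55, "external": 20, "total": 1250},
    {"internal": 30, "external": 15, "total": 900},
]
per_project = [copq_percent(p["internal"], p["external"], p["total"]) for p in projects]
rolled_up = sum(per_project) / len(per_project)   # one observation for the chart
print([round(c, 2) for c in per_project])  # [5.0, 6.0, 5.0]
print(round(rolled_up, 2))                 # 5.33
```

Each rolled-up value becomes one point on the control chart of step 1; homogeneity of the underlying projects should be established before averaging.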
Step 2: Understand the underlying process distribution
The Anderson-Darling test of normality is the most widely used test in statistical software.
Normality is required for continuous data prior to measuring process capability.
Is the confidence interval for the mean acceptable?
Step 3: For a stable process, identify the regressions and the residuals (3 of 4)
Use One-way ANOVA for analysis of different phase-wise contributors
The p-value is < 0.05, so reject the null hypothesis that the mean COPQ % of the different phases is equal. Identify the regressions for improvements (CUT and Test).
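The one-way ANOVA F statistic behind that decision can be sketched directly; the phase-wise COPQ % observations below are hypothetical, and the p-value would normally come from an F-distribution table or a statistics package rather than this sketch:

```python
# Sketch: one-way ANOVA F statistic for phase-wise COPQ % groups.
# F = (between-group mean square) / (within-group mean square).
# A large F relative to the F(k-1, n-k) critical value means the
# phase means differ. Data is invented.

def one_way_anova_f(groups):
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# COPQ % observations by phase (hypothetical)
design = [2.1, 2.4, 2.0, 2.3]
cut    = [5.8, 6.1, 5.5, 6.0]   # coding and unit test
test   = [4.0, 4.3, 3.9, 4.2]
f = one_way_anova_f([design, cut, test])
print(round(f, 1))  # very large F -> reject equality of phase means
```

Here the CUT and Test phases carry the highest mean COPQ %, which mirrors the slide's conclusion that those phases are where the improvement regressions should be identified.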
An approach to establish an early focus on measurement and analysis of software data with a Six Sigma mindset was explored, based on emphasizing the identification of business objectives while considering the voice of business (CTP) and the voice of customer (CTQ).
This approach emphasizes identification of the “pain-points” and the “real root causes” to target the quantification and improvement effort (using controllable factors).
SITARA Technologies Pvt. Ltd.
#54, Sri Hari Krupa, 6th Main Road
Malleswaram, Bangalore KA 560 003
Telephone: +(91-80) 2334-3222
Mobile: +984-523-3222
Email: [email protected]