Metrics based software supplier selection - Best practice used in the largest Dutch telecom company
Post on 17-Jan-2015
Metrics Based Software Supplier Selection
Best practice used in the largest Dutch telecom company
Assisi, October 2012
Hans Kuijpers, Harold van Heeringen
Metrics Based Software Supplier Selection

Harold van Heeringen (Sogeti Nederland BV):
• Senior Consultant Software Metrics
• ISBSG: President
• NESMA: Board Member
• NESMA: Working Group Chair COSMIC
• NESMA: Working Group Chair Benchmarking
• COSMIC: IAC Member

Hans Kuijpers (KPN Nederland):
• Manager Metrics Desk
• Certified Scope Manager
• QSM: Special Interest Group Agile
Introduction
Harold van Heeringen, Sizing, Estimating & Control
Harold.van.heeringen@sogeti.nl, @haroldveendam, @Sogeti_SEC

Hans Kuijpers, Program Assurance & Methods
hans.tm.kuijpers@kpn.com, @_hanskuijpers
Agenda
• Introduction and Context
• Phases and Timeline
• The Model
• Results
• Conclusions & Recommendations
Introduction and Context
Why supplier selection?
• Consolidation of the number of suppliers
• Cost reduction
• Supplier acts as SI & MSP
• 5-year investment
• SPM is a best practice at KPN
[Organisation chart: KPN Board, with divisions Consumer Market, Business Market, Corporate Market, NetCo (Fixed, Mobile, OSS, BSS), Customer Experience, Generic & Traditional, Wholesale, IT Operations, E-Plus and KPN Belgium. Revenues: €13,000m; EBITDA: €5,100m; Employees: 31,000.]

Domains
Problem: no more competition between suppliers.
An instrument was needed to avoid excessive Unit of Work pricing.
Phases and Timeline

Why was a productivity metric added?
• Objective selection criteria
• Supplier willingness to show their transparency
• Basis for a productivity baseline
• Insight into the quality level
• Negotiations for year-on-year cost reduction
• Relation to continuous improvement steps
Requested project information
• Data on 6 historical projects, of which at most 3 KPN projects
• Within the scope of the current technology domain
• Size range 300 – 1000 FP
• Sizing method NESMA 2.1 or IFPUG 4.x
• The DCF must be completely filled out
• No other template is allowed
In the BAFO phase, suppliers must show evidence for the size and productivity figures by releasing FPA reports and Data Collection Forms, and/or by giving insight into their administrative systems.
Historical Project Data form (1)
Historical Project Data form (2)
Requirements for each data field are specified in the template.
The Model
Characteristics:
• Degree of openness and compliancy
• Completeness and cohesion of the submitted data
• Productivity benchmarked against each other and against the industry
• Delivered quality
• During the RFP phase the data is considered correct, but it is checked against reality later on

The 3 test criteria:
A. Compliancy value (10%)
B. Reality value (30%)
C. Productivity-Quality value (60%)
Used metrics and benchmarks

Project Delivery Rate (PDR) = project effort spent per function point (h/FP)
Productivity Index (PI) = a metric from QSM, derived from size, duration and effort
Quality = delivered defects per FP

Benchmarks:
• PI against the QSM Business trend line
• PDR against the ISBSG repository
• Adjusted = normalised to Construction + Test activities
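The PDR metric and its benchmark distance can be sketched as below. This is an illustrative sketch, not KPN's actual tooling: the function names are made up, and the 8.6 h/FP median is the example value shown in the score tables later in this deck.

```python
# Sketch: Project Delivery Rate (PDR) and its distance to the ISBSG median.
# PDR = effort spent on the project divided by its functional size (h/FP);
# a negative score means the project was more productive than the benchmark.

ISBSG_MEDIAN_PDR = 8.6  # h/FP, example median from the score tables in this deck

def pdr(effort_hours: float, size_fp: float) -> float:
    """Project Delivery Rate in hours per function point."""
    return effort_hours / size_fp

def pdr_score(project_pdr: float, benchmark: float = ISBSG_MEDIAN_PDR) -> float:
    """Distance to the benchmark; lower (more negative) is better."""
    return round(project_pdr - benchmark, 1)

# Project 7 from the example table: 5.9 h/FP against the 8.6 h/FP median.
print(pdr_score(5.9))  # -> -2.7
```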
Metrics Based Software Supplier Selection - 13 -
Compliancy value (10%)

Suppliers start with 10 points.
The compliancy value is reduced by 2 points for each violation of a rule:
a) Range 300 – 1000 function points
b) Method NESMA 2.1 or IFPUG 4.x
c) Each field of the “Historical Project Data” form must be filled out
Maximum value = 10, minimum value = 0
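The rule above amounts to a simple penalty score; a minimal sketch (function name illustrative):

```python
# Sketch: compliancy value = 10 start points, minus 2 per violated rule,
# floored at 0 (violations: size range, sizing method, form completeness).

def compliancy_value(violations: int) -> int:
    return max(0, 10 - 2 * violations)

# In the results, one supplier had 3 violations and the others 5 or more:
print(compliancy_value(3))  # -> 4
print(compliancy_value(5))  # -> 0
```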
Reality value (30%)

Unrealistic projects are discarded from further analysis:
• PI > +2 sigma (95%)
• PDR < P25 of the ISBSG repository (best-in-class projects)
The reality value is reduced by 2 points for each unrealistic project.
Maximum value = 10, minimum value = 0
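The reality check can be sketched as follows. The trend-line average, sigma and P25 arguments are placeholders, since the actual QSM trend lines and ISBSG percentiles are proprietary data; the function names are illustrative.

```python
# Sketch: flag unrealistic projects, then derive the reality value.
# A project is unrealistic when its PI lies more than 2 sigma above the
# trend-line average, or its PDR is below the ISBSG P25 (best in class).

def is_unrealistic(pi: float, pdr: float,
                   pi_avg: float, pi_sigma: float, pdr_p25: float) -> bool:
    return pi > pi_avg + 2 * pi_sigma or pdr < pdr_p25

def reality_value(num_unrealistic: int) -> int:
    """10 start points, minus 2 per unrealistic project, floored at 0."""
    return max(0, 10 - 2 * num_unrealistic)

# Supplier A in the results had 2 unrealistic projects (1 PI, 1 PDR):
print(reality_value(2))  # -> 6
```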
[Chart: PI vs. functional size (FP). Scatter plot of the projects of Suppliers A–E (50–950 effective FP, PI 0–35) against the QSM Business average and 2-sigma trend lines; projects above the +2-sigma line are marked as unrealistic.]
Productivity - Quality value (60%)
Example PDR scores (distance to the ISBSG median; the lowest average gets the most points):

ID | PDR (h/FP) | PDR ISBSG median | PDR score
7  | 5.9 | 8.6 | -2.7
8  | 6.0 | 8.6 | -2.6
9  | 6.9 | 8.6 | -1.7
11 | 6.2 | 8.6 | -2.4
12 | 7.3 | 8.6 | -1.3
Average: -2.1

Example quality scores (delivered defects per FP; the lowest median gets the most points):

ID | Quality score
15 | 41.7
18 | 13.9
21 | 66.7
22 | 4.0
23 | 10.0
Median: 13.9

For the PI score, the highest average gets the most points.
For PI and PDR, the average distance to the benchmark value is determined; for quality, the median is determined.

Productivity-Quality value = (PI score points × 0.5) + (PDR score points × 0.3) + (Quality score points × 0.2)
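The weighting step can be sketched as below. The rank-to-points mapping (rank 1 gets 10 points, rank 2 gets 8, and so on in steps of 2) is inferred from the results tables in this deck, and the function names are illustrative.

```python
# Sketch: rank-to-points mapping and the Productivity-Quality value
# with the 50% / 30% / 20% weights from the model.

def points_for_rank(rank: int, best: int = 10, step: int = 2) -> int:
    """Rank 1 -> 10 points, rank 2 -> 8, ..., rank 5 -> 2."""
    return best - step * (rank - 1)

def productivity_quality_value(pi_pts: int, pdr_pts: int, quality_pts: int) -> float:
    return round(0.5 * pi_pts + 0.3 * pdr_pts + 0.2 * quality_pts, 1)

# Supplier A: PI rank 2, PDR rank 1, Quality rank 1:
print(productivity_quality_value(points_for_rank(2),
                                 points_for_rank(1),
                                 points_for_rank(1)))  # -> 9.0
```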
Results of Compliancy (1)
Projects discarded:
• Projects still ongoing (4)
• Project sized in COSMIC (1)

Blank crucial fields:
• Defect data
• Effort data
• Dates

Other violations:
• Primary language (e.g. English)
Supplier | Compliancy value
Supplier A | 0
Supplier B | 0
Supplier C | 4
Supplier D | 0
Supplier E | 0

Result: one supplier has 3 violations, the others 5 or more.
Results of Compliancy (2)
Results of Reality
Projects unrealistic:
• 3 according to the PI criterion
• 1 according to the PDR criterion
These projects are discarded from further analysis.

Supplier | Unrealistic projects (PI criterion) | Unrealistic projects (PDR criterion) | Reality value
Supplier A | 1 | 1 | 6
Supplier B | 1 | 0 | 8
Supplier C | 0 | 0 | 10
Supplier D | 0 | 0 | 10
Supplier E | 1 | 0 | 8
Results of Productivity / Quality
Supplier | PI score | Rank | Points
Supplier A | 3.9 | 2 | 8
Supplier B | 5.0 | 1 | 10
Supplier C | 3.4 | 3 | 6
Supplier D | 3.0 | 5 | 2
Supplier E | 3.2 | 4 | 4

Supplier | PDR score | Rank | Points
Supplier A | -3.2 | 1 | 10
Supplier B | -2.1 | 2 | 8
Supplier C | 16.6 | 4 | 4
Supplier D | 6.2 | 3 | 6
Supplier E | 18.3 | 5 | 2

Supplier | Quality score | Rank | Points
Supplier A | 3.1 | 1 | 10
Supplier B | 13.9 | 2 | 8
Supplier C | 52.6 | 3 | 6
Supplier D | 1000.0 | 5 | 2
Supplier E | 94.6 | 4 | 4

Supplier | PI points (50%) | PDR points (30%) | Quality points (20%) | Productivity/Quality value
Supplier A | 8 | 10 | 10 | 9.0
Supplier B | 10 | 8 | 8 | 9.0
Supplier C | 6 | 4 | 6 | 5.4
Supplier D | 2 | 6 | 2 | 3.2
Supplier E | 4 | 2 | 4 | 3.4
Results of Total Assessment
Recommendation from the Metrics Desk: Suppliers B and A score best in the model.

Supplier | Compliancy value (10%) | Reality value (30%) | Productivity/Quality value (60%) | Total | Rank
Supplier A | 0 | 6 | 9.0 | 7.2 | 2
Supplier B | 0 | 8 | 9.0 | 7.8 | 1
Supplier C | 4 | 10 | 5.4 | 6.6 | 3
Supplier D | 0 | 10 | 3.2 | 4.9 | 4
Supplier E | 0 | 8 | 3.4 | 4.4 | 5
However, suppliers B and C were selected for the next phase (BAFO).
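The total assessment above can be reproduced as a weighted sum of the three criterion values (function name illustrative):

```python
# Sketch: total assessment score with the model's 10% / 30% / 60% weights.

def total_score(compliancy: float, reality: float, prod_quality: float) -> float:
    return round(0.1 * compliancy + 0.3 * reality + 0.6 * prod_quality, 1)

# Supplier B (rank 1): compliancy 0, reality 8, productivity/quality 9.0
print(total_score(0, 8, 9.0))   # -> 7.8
# Supplier C (rank 3): compliancy 4, reality 10, productivity/quality 5.4
print(total_score(4, 10, 5.4))  # -> 6.6
```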
Findings BAFO phase

The Metrics Desk investigated the project data provided by the selected suppliers B and C:
• Size
• Dates
• Hours
• Defects

Supplier B:
• Resistance: confidentiality clauses with clients
• Client site visit

Supplier C:
• Size measurement performed by junior, non-certified measurers
Conclusions and recommendations

Conclusions:
• The productivity assessment influenced the total outcome significantly
• The assessment and the discussions afterwards gave insight into transparency and CMMI level
• The results are used in the negotiation phase to maximize the baseline value

Recommendations:
• Make sure the parties understand:
  • the purpose of the assessment
  • the use of the “Historical Project Data” form
  • that the disclosed data will be validated and should not be confidential
  • the consequences of violating the governance rules (e.g. penalty points)
• Because of the many violations of the compliancy rules, consider 1 penalty point per violation
• Construct the model beforehand, but do not communicate the model to the suppliers
• Accept site visits when offered; this gives extra information in addition to the productivity validation

Productivity: don’t trust it, check it
Hans Kuijpers, Software Metrics Consultant
hans.tm.kuijpers@kpn.com, @_hanskuijpers

Harold van Heeringen, Sizing, Estimating & Control
Harold.van.heeringen@sogeti.nl, @haroldveendam, @Sogeti_SEC
Back-up sheets
[Chart: Productivity Index (PI), effort vs. duration. Two curves for a 500 FP project, at PI = 16 and PI = 18, with PDR values between 7 and 17 h/FP; a higher PI means the same size is delivered with less effort and/or a shorter duration.]