Sponsored by the U.S. Department of Defense
© Carnegie Mellon University
Pittsburgh, PA 15213-3890

Stephen E. Cross, Ph.D.
Director and CEO
Software Engineering Institute
[email protected]

Software Engineering Institute
SEI Overview - page 2© 2002 by Carnegie Mellon University
This Briefing Refers to the Following Service Marks and Trademarks
® Capability Maturity Model, Capability Maturity Modeling, Capability Maturity Model for Software, CMMI, CERT, and CERT Coordination Center are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.
SM CMM Integration; IDEAL; Personal Software Process; PSP; SCAMPI; SCAMPI Lead Assessor; SCAMPI Lead Appraiser; Team Software Process; TSP; Architecture Tradeoff Analysis Method; and ATAM are service marks of Carnegie Mellon University.
What I’d Like to Share With You
Brief overview of Carnegie Mellon and the SEI
Overview of SEI’s body of work
Some of my own ideas for future research
Summary
Carnegie Mellon University: Major Units
- Carnegie Institute of Technology
- College of Fine Arts
- College of Humanities and Social Sciences
- Graduate School of Industrial Administration
- H. John Heinz III School of Public Policy and Management
- Mellon College of Science
- School of Computer Science
- Software Engineering Institute
Main campus: Pittsburgh, PA (USA)
West coast campus: San Jose, CA (USA)
Software Engineering Institute
Applied R&D laboratory situated as a college-level unit at Carnegie Mellon University, Pittsburgh, PA (USA)
Established in 1984
Additional offices in Arlington, VA and Frankfurt, Germany
Staff size of 335
Sponsored by the US Government and industry
Mission: improve the practice of software engineering
SEI’s Role (Transition)
The SEI, working with partners, bridges the research community and acquirers & developers, helping others improve their software engineering practices:
- Mature: documentation and packaging; analysis of trial use; identify and mature new practices
- Outreach: facilitate adoption and use
- Sustain: sustain what is adopted
From a Recent “Top 10 Defects” List
Finding and fixing a software problem after delivery is often 100 times more expensive than finding and fixing it during the requirements and design phase.
Current software projects spend 40 to 50% of their effort on avoidable rework.
Peer reviews catch 60% of the defects.
Disciplined personal practices can reduce defect introduction rates by up to 75%.
About 40 to 50% of user programs contain nontrivial defects.
Ref: Boehm, B., and Basili, V. “Software Defect Reduction Top 10 List,” IEEE Computer, January 2001, pp. 135-137.
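Taken together, these figures make a simple economic argument for early defect removal. The following toy model is a sketch only: the defect count, unit cost, and review yield are illustrative assumptions layered on the 100x and 60% rules of thumb above, not SEI data.

```python
# Toy rework-cost model based on the "Top 10" rules of thumb above.
# All numbers are illustrative assumptions, not measured SEI data.

def rework_cost(defects, cost_per_fix_early, post_delivery_multiplier=100,
                fraction_caught_early=0.6):
    """Compare total fix cost when peer reviews catch a fraction of
    defects early versus letting every defect escape to the field."""
    caught_early = defects * fraction_caught_early
    escaped = defects - caught_early
    cost_late = defects * cost_per_fix_early * post_delivery_multiplier
    with_reviews = (caught_early * cost_per_fix_early
                    + escaped * cost_per_fix_early * post_delivery_multiplier)
    return cost_late, with_reviews

no_reviews, with_reviews = rework_cost(defects=100, cost_per_fix_early=1.0)
print(no_reviews)    # 10000.0: every defect fixed post-delivery
print(with_reviews)  # 4060.0: 60% caught in review at 1/100th the cost
```

Even ignoring the cost of the reviews themselves, catching 60% of defects at 1/100th the late-fix cost cuts total rework cost by more than half in this toy setup.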
State of Practice & the SEI Vision
The software state of practice is to “test in” quality: development, integration, and system test consume 60 to 80% of effort and cost.* World-class developers “design in” quality instead.
The SEI vision:
- move to the left! (find and prevent defects earlier in the life cycle)
- reuse everything
- never make the same mistake twice
*Ref: Standish Group, www.standishgroup.com, 1999
What I’d Like to Share With You
Brief overview of Carnegie Mellon and the SEI
Overview of SEI’s body of work
Some of my own ideas for future research
Summary
SEI Technical Program
Management Practice Initiatives:
- Capability Maturity Model Integration
- Accelerating Software Technology Adoption
- Team Software Process
- Software Engineering Measurement & Analysis
Technical Practice Initiatives:
- COTS-Based Systems
- Performance Critical Systems
- Architecture Tradeoff Analysis
- Survivable Systems
- Product Line Practice
- Predictable Assembly with Certifiable Components
Goal: the right software, delivered defect free, on cost, on time, every time; high-confidence, evolvable product lines with predictable and improved cost, schedule, and quality.
Quality Process Models
Quality process models, such as the CMMI® models, support the design and improvement of the key software and system process competencies required to build today’s complex systems.
Maturity levels:
1. Initial
2. Repeatable
3. Defined
4. Managed
5. Optimizing
Moving up the maturity levels increases productivity and reduces risk.
Software Process Improvement
[Figure: Software Estimates. Scatter plot of over/under percentage (-140% to +140%) of estimated effort in labor hours. Estimates made without historical data scatter widely; estimates made with historical data cluster near 0%.]
[Figure: Post-Release Defects. Average number of defects/KLOC, 0 to 15 scale, 1992 to 1996.]
[Figure: Increased Productivity. Average number of hours, 1992 to 1996, with reductions of 26%, 38%, 62%, and 12%.]
[Figure: Cycle time versus maturity level (Level 1, Level 2, Level 3): 36% faster.]
Source: Scott Griffin, Boeing CIO, keynote talk at SEPG 2000
PSP Results
[Figure: Time Invested Per (New and Changed) Line of Code. Mean and minimum minutes spent per LOC in design, code, compile, and test, 0.0 to 1.4 scale, across the course programs.]
Ref: Hayes, W., and Over, J. Personal Software Process (PSP): An Empirical Study of the Impact of PSP on Individual Engineers (CMU/SEI-97-TR-001). See: http://www.sei.cmu.edu/publications
TSP Results
[Figure: range charts comparing projects before and after TSP/PSP adoption on four measures: average schedule deviation (-20% to 160% scale), average effort deviation (-20% to 120% scale), defects/KLOC in acceptance test (0 to 0.9 scale), and system test duration in days/KLOC (0 to 7 scale).]
http://www.sei.cmu.edu/publications/documents/00.reports/00tr015.html
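For reference, the deviation and density metrics plotted in these range charts can be computed from basic plan and actual data. This is a minimal sketch; the sample numbers are illustrative, not taken from the report.

```python
# Minimal sketch of the deviation metrics plotted above.
# The sample plan/actual numbers are illustrative, not the report's data.

def percent_deviation(planned, actual):
    """Schedule or effort deviation as a percentage of the plan."""
    return (actual - planned) / planned * 100.0

def defect_density(defects, new_and_changed_loc):
    """Defects per thousand new and changed lines of code (defects/KLOC)."""
    return defects / (new_and_changed_loc / 1000.0)

# A project planned at 40 weeks that took 50 weeks:
print(round(percent_deviation(planned=40, actual=50), 1))  # 25.0 (% late)

# 3 acceptance-test defects in 15,000 new and changed LOC:
print(round(defect_density(defects=3, new_and_changed_loc=15000), 2))  # 0.2
```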
Spreading the Architecture Word
Books: Software Architecture in Practice (software architecture concepts); Documenting Software Architectures (architecture documentation)
Courses: Software Architecture Familiarization (concepts); ATAM Evaluator Training (software architecture evaluation); Architecture Reconstruction (reconstruction)
Product Line Practice
Use of a common asset base in the production of a related set of products.
Core assets: architecture, production plan, scope definition, business case.
SEI Product Line Practice Framework
Web-based, evolving, community-authored document
Describes product line essential activities
Describes essential and proven product line practices in the areas of:
- software engineering
- technical management
- organizational management
http://www.sei.cmu.edu/plp/framework.html
Examples of Product Line Practice - 1
CelsiusTech - onboard ship systems:
- hardware-to-software cost ratio changed from 35:65 to 80:20
Motorola - FLEXworks Project (family of one-way pagers):
- 4x cycle-time improvement
- 80% reuse
Hewlett-Packard - printer systems:
- 2-7x cycle-time improvement (some 10x)
- sample project shipped 5x the number of products, which were 4x as complex and had 3x the number of features, with 4x the products shipped per person
Examples of Product Line Practice - 2
Cummins Engine Co. - engine control systems:
- system build and integration went from roughly 1 year to 1 week
- 5.5 years in product line development
- more than 20 products successfully launched
Nokia - mobile phones:
- went from 4 different phones produced per year to 50 per year
For more information
What I’d Like to Share With You
Brief overview of Carnegie Mellon and the SEI
Overview of SEI’s body of work
Some of my own ideas for future research
Summary
Summary of Today’s Trouble Spots
Effort ∝ (Experience × Quality × Size)^Process
- most individuals and teams lack software experience
- much time and cost is wasted doing rework
- large projects have more challenges than smaller ones
- inadequate processes (e.g., requirements)
Adapted from: Royce, W. Software Project Management: A Unified Framework. New York: Addison-Wesley, p. 23.
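The relation above can be made concrete with a toy calculation. In the sketch below, the factor values and exponents are illustrative assumptions (the adapted Royce formula gives a shape, not calibrated coefficients); it shows how a disciplined process, modeled as an exponent below 1, dampens the growth of effort with size.

```python
# Toy reading of the adapted Royce relation shown above:
#     Effort ~ (Experience * Quality * Size) ^ Process
# The factor values and exponents are illustrative assumptions,
# not calibrated parameters from Royce's book.

def relative_effort(experience, quality, size_kloc, process_exponent):
    """Unitless effort index; smaller factors and exponent mean less effort."""
    return (experience * quality * size_kloc) ** process_exponent

# An immature process (exponent > 1) versus a mature one (exponent < 1)
# on the same 100-KLOC project with the same team and quality goals:
immature = relative_effort(experience=1.0, quality=1.0, size_kloc=100,
                           process_exponent=1.2)
mature = relative_effort(experience=1.0, quality=1.0, size_kloc=100,
                         process_exponent=0.9)
print(round(immature))  # 251: effort grows faster than size
print(round(mature))    # 63: a disciplined process dampens the growth
```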
Enhancing experience
Sharing best practices
Knowledge management
Context relevant training (e.g., TSP)
Living it (e.g., flight simulators)
Organization Simulation (OrgSim) Concept
An immersive environment that:
Simulates future organizations, including likely cross-organizational interactions
Enables decision makers to interact within & across organizational cultures
Synthesizes “people” who behave as if the pickup organization were already deployed
Provides compelling feel for “what it will be like”
Overall OrgSim Approach
Immersing decision makers in the possible futures
Enabling decision makers to act in these possible futures
Providing participants – synthespians – that react to decision makers’ actions
Central Challenges
Needs for rapidly responding “pickup” organizations focused on issues and activities that cross jurisdictional boundaries and provide many opportunities for governance conflicts and gaps
Scenarios that inevitably involve unforeseen threats, events, locations, and needs, requiring, in turn, the involvement of organizations not previously perceived to be relevant
OrgSim Project Goals
Facilitate rapid design of pickup organizations and evaluation of such designs by making it possible to experience these organizations before fielding them
Facilitate identifying cross-organizational governance problems prior to depending on cross-organization functions, permitting rapid redesign and reevaluation
OrgSim Domains
Software engineering
Supply chain management
Emergency response to terrorist events
OrgSim Team
Carnegie Mellon University:
- School of Computer Science
- H. John Heinz III School of Public Policy and Management
- College of Humanities and Social Sciences
- Software Engineering Institute
Georgia Tech:
- School of Industrial & Systems Engineering
- College of Computing
- Ivan Allen College of Liberal Arts
- Georgia Tech Research Institute
OrgSim simulation layers:
- User Interface, e.g., large screens, voice, gestures
- Organizational Story, e.g., “active scenario”
- Characters, e.g., user, manager, QA expert
- World Model, e.g., city, industry, media
- Distributed Simulation Software
- Hardware, e.g., computers, networks
Questions?
Please contact us:
www.sei.cmu.edu
Steve Cross, [email protected]
412-268-7740
703-908-8230