Slow Down to Speed Up – Leveraging Quality to Enable Productivity and Speed
Fran O'Hara – Inspire Quality Services
[email protected]
www.inspireqs.ie
Dec 02, 2014
© 2014 Inspire Quality Services
Agenda
• Quality and Speed/Productivity
• Lessons learnt/challenges
Flipping the Iron Triangle

Plan Driven:
• Fixed: Scope/Requirements
• Estimated: Resources, Schedule

Value Driven:
• Fixed: Resources, Schedule
• Estimated: Scope/Requirements

And where does Quality sit in each model?
Quality <-> Speed
• Quick and dirty is faster (short term)
• Bad quality slows you down (long term)
• Going faster gives better quality (short & long term)
Economics of Product Development

Cycle Time – how long it takes to get through the value stream
• Economies of speed / cost of delay
• Fast feedback & learning (empirical)
• Waste/cost reduction & agility

Unit/Post-Dev Cost – cost of deploying, configuring, supporting, using each 'instance'
• Ease of use, robustness
• Cost of configuration/administration
• Browser/platform/OS support

Development Expense – development project costs
• Cost of engineering team
• Dev tools: SCM, CI, automated test
• Team management & facilities
• Shared services (HR, Finance, etc.)

Product Value – profit from a software product; savings from an internal IT project
• Sales revenue (volume × price)
• Cost savings
• Strategic value

Adapted from Don Reinertsen, 2009
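The cost-of-delay idea above can be made concrete with a toy calculation (feature names and figures are hypothetical, not from the talk): divide each feature's weekly cost of delay by its duration, a rule Reinertsen calls CD3 (Cost of Delay Divided by Duration), and schedule the highest score first.

```python
# Toy illustration of Cost of Delay and CD3 sequencing, in the spirit of
# Reinertsen's economic framing. All numbers below are invented.

def cd3(cost_of_delay_per_week, duration_weeks):
    """Return the CD3 score: higher means 'do this first'."""
    return cost_of_delay_per_week / duration_weeks

features = {
    "checkout-redesign": {"cod": 20_000, "weeks": 4},  # €20k/week delay cost
    "reporting-module":  {"cod": 6_000,  "weeks": 1},
    "sso-integration":   {"cod": 12_000, "weeks": 6},
}

# Sequence by descending CD3: the small, fast item outranks the big one
# even though its absolute cost of delay is much lower.
order = sorted(
    features,
    key=lambda f: cd3(features[f]["cod"], features[f]["weeks"]),
    reverse=True,
)
print(order)  # ['reporting-module', 'checkout-redesign', 'sso-integration']
```

Note how the €6k/week, one-week item jumps the queue: delay cost per week of work done is what matters, not raw value.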
To go fast, have quality we must.
Faster is better…
Technical Debt
Symptoms of technical debt:
• Bugs found in production
• Incomprehensible, un-maintainable code
• Insufficient or un-maintainable automated tests
• Lack of CI
• Poor internal quality
• Etc.
Quality & Test
• Quality is not equal to test. Quality is achieved by putting development and testing into a blender and mixing them until one is indistinguishable from the other.
• Testing must be an unavoidable aspect of development, and the marriage of development and testing is where quality is achieved.
from 'How Google Tests Software', James Whittaker et al.
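One concrete form of that "blending" is the developer writing the test in the same change as the code it specifies; a generic illustration (the function and test names are invented):

```python
# A unit test written alongside the production code it specifies:
# testing is part of development, not a separate downstream phase.

def normalise_email(raw: str) -> str:
    """Lower-case and trim an e-mail address before storing it."""
    return raw.strip().lower()

def test_normalise_email():
    assert normalise_email("  Fran.OHara@Example.IE ") == "fran.ohara@example.ie"
    assert normalise_email("a@b.c") == "a@b.c"

test_normalise_email()  # run inline here; under pytest it would be collected
```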
Lessons Learnt / Challenges
• Test Automation
• Line Management
• Definition of Done
• Test Competency
• Requirements (e.g. story size, non-functional)
• Test Strategy & Risk
• Techniques (e.g. exploratory)
• Planning for Quality, Documentation, …
Basic Testing within a Sprint
• Automated acceptance/story-based tests – represent executable requirements
• Automated unit tests – represent executable design specifications
• Manual exploratory tests – provide supplementary feedback
Theme: Test Strategy & Risk
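An "executable requirement" can be a plain acceptance test whose comments mirror the story's Given/When/Then acceptance criteria; a minimal pytest-style sketch (the story, names, and amounts are hypothetical):

```python
# Story: "As a shopper I can apply a discount code at checkout."
# Acceptance criterion: a valid code reduces the total by its percentage;
# an unknown code leaves the total unchanged.

def apply_discount(total: float, code: str, codes: dict) -> float:
    """Return the discounted total; unknown codes leave it unchanged."""
    pct = codes.get(code, 0)
    return round(total * (1 - pct / 100), 2)

def test_valid_code_reduces_total():
    # Given a 10% discount code exists
    codes = {"SAVE10": 10}
    # When the shopper applies it to a €50.00 basket
    result = apply_discount(50.00, "SAVE10", codes)
    # Then the total is €45.00
    assert result == 45.00

def test_unknown_code_is_ignored():
    assert apply_discount(50.00, "NOPE", {"SAVE10": 10}) == 50.00

test_valid_code_reduces_total()
test_unknown_code_is_ignored()
```

The test doubles as documentation of the requirement: when the rule changes, the test changes with it in the same commit.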
Agile Testing Quadrants – Risk!
Theme: Test Strategy & Risk
Definition of 'Done'
• An agreement between the PO and the Team
  – Evolving over time to increase quality & 'doneness'
• Used to guide the team in estimating and doing
• Used by the PO to increase predictability and to accept Done PBIs
• 'Done' may apply to a PBI and to an Increment
• A single DoD may apply across an organisation, or a product
  – Multiple teams on a product share the DoD
Theme: Definition of Done
DoD example

Story level:
• Unit tests passed
• Unit tests achieving 80% decision coverage
• Integration tests passed
• Acceptance tests passed, with traceability to story acceptance criteria
• Code and unit tests reviewed
• Static analysis has no important warnings
• Coding standard compliant
• Published to Dev server

Sprint level:
• Reviewed and accepted by PO
• End-to-end functional and feature tests passed
• All regression tests passing
• Exploratory testing completed
• Performance profiling complete
• Bugs committed in sprint resolved
• Deployment/release docs updated and reviewed
• User manual updated

Release level:
• Released to Stage server
• Deployment tests passed
• Deployment/release docs delivered
• Large-scale integration performance/stress testing passed
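A threshold like "80% decision coverage" is easiest to honour when a build step enforces it mechanically; a minimal sketch of such a gate, assuming the coverage tool emits a one-line summary in the format shown (real tools such as coverage.py or JaCoCo have their own report formats and built-in fail-under options):

```python
# Minimal CI gate: fail the build when reported coverage drops below the
# Definition of Done threshold. The report-line format is hypothetical.

THRESHOLD = 80.0  # % decision coverage required by the DoD

def coverage_gate(report_line: str, threshold: float = THRESHOLD) -> bool:
    """Parse a line like 'decision coverage: 83.4%' and pass/fail it."""
    pct = float(report_line.rstrip("%").split(":")[1])
    return pct >= threshold

assert coverage_gate("decision coverage: 83.4%") is True
assert coverage_gate("decision coverage: 79.9%") is False
```

Wiring this into the pipeline turns a written DoD bullet into something the team cannot silently drift away from.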
Theme: Definition of Done
The Automation Pyramid
• Unit/Component layer – developer tests (e.g. JUnit): automate at design level
• API/Service layer – acceptance tests (e.g. FitNesse, Cucumber): automate at story level
• GUI layer (e.g. Selenium): automate at feature/workflow level
• Manual tests (e.g. exploratory) sit above the pyramid
Based on Mike Cohn
Theme: Test Automation
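Why the pyramid shape matters can be shown with rough numbers (the counts and runtimes below are illustrative assumptions, not measurements):

```python
# The pyramid as a rule of thumb: many fast unit tests, fewer service
# tests, very few GUI tests. All counts and timings are invented.

suite = {
    "unit":    {"tests": 500, "avg_ms": 5},     # JUnit-style developer tests
    "service": {"tests": 60,  "avg_ms": 200},   # FitNesse/Cucumber-style
    "gui":     {"tests": 10,  "avg_ms": 5000},  # Selenium-style
}

def total_runtime_s(layer):
    cfg = suite[layer]
    return cfg["tests"] * cfg["avg_ms"] / 1000

# Even with 50x more tests, the unit layer is the cheapest to run:
runtimes = {layer: total_runtime_s(layer) for layer in suite}
print(runtimes)  # {'unit': 2.5, 'service': 12.0, 'gui': 50.0}
```

Inverting the pyramid (most tests at the GUI) makes the feedback loop minutes or hours long, which is exactly the slow feedback the economics slide warns about.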
Development Team (Analysts, Programmers, Testers, Architect, DBA, UI/UX, etc.)
• Creates each increment of 'Done' product
• No specialised sub-teams (e.g. a Team Lead with developers, a QA Lead with testers, a BA Lead with BAs, each working apart)
• Skills both deep and broad: testing programmers and programming testers
• You only need a hero in a crisis – so: No Heroes!
Theme: Test Competency
Is testing fully integrated?
[Diagram: three patterns of 'Code', 'Code & Bug Fix' and 'Test' activity across Sprint 1 and Sprint 2]
• A: coding fills the sprint; testing and bug fixing trail into the next sprint
• B: testing happens at the end of each sprint, after coding completes
• C: coding (with bug fixing) and testing run concurrently throughout each sprint – testing fully integrated
Theme: Requirements (e.g. story size, non-functional)
Examples of how to evolve quality/test practices…
• See Google's 'Test Certified' levels
• Paddy Power's review of team practices, using a 0–5 scale for items such as code reviews, pair programming, code analysis, unit tests, continuous integration, automated acceptance tests, data generation, performance analysis, TDD, etc. (from Graham Abel, SoftTest Test Automation Conference 2013)
• Communities of Practice, Tribes, Mentors, etc.
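A 0–5 review like the one described above lends itself to a tiny script; a sketch (scores are invented, practice names taken from the slide) that points a team at its weakest areas first:

```python
# Sketch of a 0-5 practice self-assessment in the style of the team
# review described above. The scores below are hypothetical.

def weakest_practices(scores: dict, n: int = 2):
    """Return the n lowest-scoring practices: where to improve next."""
    return sorted(scores, key=scores.get)[:n]

team_a = {
    "code reviews": 4,
    "pair programming": 1,
    "code analysis": 3,
    "unit tests": 4,
    "continuous integration": 5,
    "automated acceptance tests": 2,
    "TDD": 1,
}

print(weakest_practices(team_a))  # ['pair programming', 'TDD']
```

The value is less in the numbers than in the conversation: re-scoring each quarter makes the evolution of quality practices visible.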
Thank You!
Fran O'Hara – InspireQS