Conquering the Single Largest Challenge Facing Today’s Testers
Justin Hunter, CEO of Hexawise
Too Much to Test
&
Not Enough Time to Test It All
The Challenge
Testing is still in its infancy.
Other industries have spent decades and billions of dollars
learning to solve our #1 problem.
We’d be insane not to learn from them.
My Main Message
1. What Happened?
2. Avoidable?
3. Practical Implications
Part I. “Maps Mayhem”
1. Does this Stuff Really Work?
2. Adoption Trends
Part II. Making it Real
Here...Most Careers
Here...Scott's Career
Here...Even worse...
(Career-trajectory charts; x-axis: Time.)
2nd to Go (He’s also Amazing)
Here...Nightmare Worsens
123,000
Here...CEO’s Apology Letter
“We are extremely sorry...”
“While we’re improving Maps, you can try alternatives... like Bing, MapQuest and Waze, or use Google or Nokia maps...”
Everyday Fails (Cont.)
http://www.itsagadget.com/2012/09/apple-google-maps-ios-6.html
Missing Details
Squiggly Roads
http://www.fastcompany.com/3003446/apple-reportedly-fires-their-maps-man
http://news.yahoo.com/blogs/technology-blog/apple-ceo-apologizes-maps-recommends-google-instead-182143889.html
On the water
http://machineslikeus.com/news/get-lost-apple-maps-road-nowhere
In the water
Missing water
http://theamazingios6maps.tumblr.com/page/6
Water Turned into Beaches
http://theamazingios6maps.tumblr.com/page/4
Melted Streets
http://www.crowdsourcing.org/images/resized//editorial_19902_780x0_proportion.jpg?1349379876
Social Media Mockery...
http://blogs.telegraph.co.uk/technology/micwright/100007771/apple-moronic-new-maps-this-is-turning-into-a-disaster/
Even Mocked by These Guys!
http://www.businessinsider.com/google-maps-apple-maps-2012-10
Impact to Sales?
In Fairness to those Involved
- Extreme Complexity
- Unimaginably Large Scope
- Highly Visible Mistakes
- Google Had a Huge Head Start
http://img.photobucket.com/albums/v40/Dragonrider1227/chainsawsonfire.jpg
Could this have been Avoided?
Imminent Disaster: Sometimes it’s just better to grab a beer and watch...
I. More Smart Testers
This man, Harry Robinson, is a genius.
He helped lead testing for Google Maps.
IMO, he’d be a bargain to Apple at $1 million / year.
http://model-based-testing.info/2012/03/12/interview-with-harry-robinson/
II. Using Smart Test Design
6 browser choices
x 3 options x 2 options x 2 options x 2 options
x 4 options x 2 options x 3 options x 2 options
x 2 options = 13,824 possible tests...
...13,824 possible tests x 4 options x 4 options x 4 options
= 884,736 possible tests...
...884,736 possible tests x 5 options x 2 options x 2 options x 2 options x 2 options x 4 options x 2 options x 2 options x 2 options x 4 options x 2 options x 2 options x 2 options
= 72,477,573,120 possible tests
This single web page could be tested with 72,477,573,120 possible tests.
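The multiplication above is easy to verify in a few lines of Python. This is only a sketch: the option counts are copied straight from the slide's arithmetic, not from a real test model.

```python
from math import prod

# Option counts copied from the slide's arithmetic (illustrative only).
stage1 = [6, 3, 2, 2, 2, 4, 2, 3, 2, 2]           # -> 13,824
stage2 = [4, 4, 4]                                # -> 884,736
stage3 = [5, 2, 2, 2, 2, 4, 2, 2, 2, 4, 2, 2, 2]

total = prod(stage1 + stage2 + stage3)
print(f"{total:,}")  # 72,477,573,120
```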
First, users input details of an application to be tested...
What things Vary? / How?
Next, users create tests that will cover interactions of every valid pair of values in as few tests as possible.
(1) Browser = “Opera” tested with (2) View = “Satellite”? Covered.
(1) Mode of Transport = “Walk” tested with (2) Show Photos = “Y”? Covered.
(1) Avoid Toll Roads = “Y” tested with (2) Show Traffic = “Y (Live)”? Covered.
(1) Browser = IE6 tested with (2) Distance in = KM and (3) Zoom in = “Y” ? That is a 3-way interaction. It might not be covered in these 35 tests. See next page.
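The pairwise idea itself fits in a short brute-force sketch. This is not Hexawise's algorithm and the parameter names are made up; a greedy loop that re-scans every full combination only works on tiny models, but it shows why so few tests can cover every pair.

```python
from itertools import combinations, product

def pairwise_tests(parameters):
    """Greedily build a set of tests covering every 2-way value pair.

    Brute-force illustration only: each pass scans all full
    combinations, so this does not scale to real test plans.
    """
    names = list(parameters)
    # Every pair of values (from two different parameters) that must
    # appear together in at least one test.
    uncovered = {((a, va), (b, vb))
                 for a, b in combinations(names, 2)
                 for va, vb in product(parameters[a], parameters[b])}
    tests = []
    while uncovered:
        best, best_gain = None, -1
        for combo in product(*(parameters[n] for n in names)):
            test = dict(zip(names, combo))
            gain = sum(1 for (pa, va), (pb, vb) in uncovered
                       if test[pa] == va and test[pb] == vb)
            if gain > best_gain:
                best, best_gain = test, gain
        tests.append(best)
        uncovered = {((pa, va), (pb, vb))
                     for (pa, va), (pb, vb) in uncovered
                     if not (best[pa] == va and best[pb] == vb)}
    return tests

# Hypothetical mini-model (3 x 2 x 2 = 12 exhaustive combinations).
params = {"Browser": ["IE6", "Opera", "Chrome"],
          "View": ["Map", "Satellite"],
          "Transport": ["Walk", "Drive"]}
tests = pairwise_tests(params)
print(len(tests), "tests instead of", 3 * 2 * 2)
```

On this toy model the greedy loop covers all 16 value pairs in 6 tests rather than the 12 exhaustive combinations; real tools achieve the same kind of compression with far more scalable algorithms.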
Highest priority test set?
(Chart: “% Coverage by Number of Tests”; y-axis: 10% to 100% pair coverage; x-axis: 2 to 28 tests.)
Every test plan has a finite number of valid combinations of parameter values (involving, in this case, 2 parameter values). The chart below shows, at each point in the test plan, what percentage of the total possible number of relevant combinations have been covered.
In this set of test cases, as in most, there is a significant decreasing marginal return.
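That diminishing-returns curve can be reproduced for any test set by counting, after each test, how many of the possible value pairs have been seen together. A sketch with made-up parameters:

```python
from itertools import combinations, product

def coverage_curve(parameters, tests):
    """Cumulative % of all parameter-value pairs covered after each test."""
    names = list(parameters)
    all_pairs = {((a, va), (b, vb))
                 for a, b in combinations(names, 2)
                 for va, vb in product(parameters[a], parameters[b])}
    covered, curve = set(), []
    for t in tests:
        # Each test covers one value pair per pair of parameters.
        for a, b in combinations(names, 2):
            covered.add(((a, t[a]), (b, t[b])))
        curve.append(round(100 * len(covered) / len(all_pairs), 1))
    return curve

# Hypothetical 2 x 2 x 2 model: 12 value pairs in total.
params = {"A": [0, 1], "B": [0, 1], "C": [0, 1]}
tests = [{"A": 0, "B": 0, "C": 0},
         {"A": 1, "B": 1, "C": 1},
         {"A": 0, "B": 0, "C": 1},
         {"A": 1, "B": 1, "C": 0}]
print(coverage_curve(params, tests))  # [25.0, 50.0, 66.7, 83.3]
```

The first two tests each add three new pairs, the next two only add two apiece: the marginal return of each extra test shrinks, exactly as the chart shows.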
Highest priority 10 tests? 20 tests?
Testing each feature to “see if it works” is not enough.
Every pair of test inputs gets tested in at least one test!
Three Implications
1. Bad software quality can bring disaster to anyone.
2. Smart, skilled, empowered testers are essential.
3. Pairwise and combinatorial testing helps test systems BOTH more thoroughly AND more quickly.
For more info, Google / Bing:
1) Harry Robinson testing
2) Pairwise testing case studies
1. Does this Stuff Really Work?
2. Adoption Trends
II. Making it Real
Why Pairwise?
How do Tools Design Tests?
They don’t.
You do!
Endemic Problems
1. Repetition
2. Gaps in Coverage
Pilots, Drivers, & Testers
“Look at the person to your left.
Look at the person to your right.”
You’re, no doubt, fine. It’s them.
Things that Vary
How they Vary
(Coverage-grid charts showing which value pairs have been tested together: “Variables”, “After 5 Hexawise Tests”, “After 10 Hexawise Tests”, “After 13 Hexawise Tests”, “After 13 Manual Tests”.)
There were many, many pairs of values (in red) that the 13 manual tests had not tested together.
Manual test case selection
Without Hexawise: 126 tests, incomplete coverage, wasteful repetition
With Hexawise: 13 tests, complete coverage, variation (not repetition)
So What?
Here...Adoption Trends
(Chart: adoption over Time.)
Source: Conservatively interpreted data from several dozen recent pilot projects. Time savings are often significantly larger than 40% and will almost always exceed 30%.
Faster Test Creation
More Defects Found / Hour
Source: Empirical study of average benefits across 10 software testing projects, published in IEEE Computer magazine in 2009: “Combinatorial Software Testing,” Rick Kuhn, Raghu Kacker, Yu Lei, Justin Hunter. Results of individual projects will differ.
Faster Test Creation
Source: Empirical study of average benefits across 10 software testing projects, published in IEEE Computer magazine in 2009: “Combinatorial Software Testing,” Rick Kuhn, Raghu Kacker, Yu Lei, Justin Hunter. Results of individual projects will differ.
Thank You!
(BTW did this topic make your “Top 3” list?)