Software testing
McGill ECSE 321: Intro to Software Engineering
Radu Negulescu
Fall 2003
McGill University ECSE 321 © 2003 Radu Negulescu Introduction to Software Engineering Software testing—Slide 2
About this module
Targeted tests are more efficient than random tests.
Here we discuss
• Testing concepts
• Unit testing
• Integration testing
• System testing
Example
A program that reads three sides of a triangle and determines whether the triangle is scalene, isosceles, or equilateral
• [Myers–The Art of Software Testing]
• How many test cases are necessary?
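The triangle program can be sketched as follows; this is my own minimal implementation of the problem statement, not Myers's, and the class and method names are hypothetical. Myers's point is that even this tiny program needs well over a dozen targeted test cases (valid scalene, each isosceles permutation, zero and negative sides, degenerate "triangles" like 1, 2, 3, and so on).

```java
// Hypothetical sketch of the triangle program under test (names are mine).
class Triangle {
    static String classify(int a, int b, int c) {
        // Reject non-positive sides
        if (a <= 0 || b <= 0 || c <= 0) return "invalid";
        // Reject degenerate triangles (triangle inequality), e.g. 1, 2, 3
        if (a + b <= c || a + c <= b || b + c <= a) return "invalid";
        if (a == b && b == c) return "equilateral";
        if (a == b || b == c || a == c) return "isosceles";
        return "scalene";
    }
}
```

A targeted test set would include at least one case per outcome plus boundary cases such as `(1, 2, 3)`, which passes the positivity check but fails the triangle inequality.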
Testing
Definitions
• The process of running a program on a set of test cases [Liskov]
• Finding differences between the system and its models (specs) [BD]
• Executing an implementation with test data and examining the outputs and operational behavior vs. requirements [Somm]
Importance
• Manifest faults: different from manual checks
• Can be automated
• Decide completion of a project: sign-off
Types of testing
Different purposes
• Fault (defect) testing: discover faults by comparing to spec
• Regression testing: ensure that new faults have not been introduced
• Statistical testing: measure average-case parameters on operational profile
• Usability testing: measure ease of use
• Performance testing: determine performance
• User acceptance testing: system sign-off
• ...
Types of testing
Different levels of integration
• Unit testing: test atomic units (classes, routines) in isolation from other units
• Integration testing: test the interoperability of several units
• System testing: test the system as a whole
Types of testing
Different levels of knowledge / different coverage criteria
• Black-box testing: focus on input/output behavior
  Equivalence testing, boundary testing
• Structural testing: use information regarding internal structure
  Path testing, define-use testing, state-based testing
A.k.a. white-box, glass-box, clear-box (“see inside the box”)
Different conditions of application
• Alpha testing
• Beta testing
• …
Test cases
Testing is organized around test cases
• Test case = input data + expected result
Elements of a test case
• Input
• Oracle: expected output
  Inputs and outputs may alternate in a test pattern
• Identification (name): derived from the component or set of components under test
  E.g. Test_A to test component A in isolation; Test_AB to test components A and B together
• Various logistics: the actual output record(s), or a placeholder for them; instructions on how to apply the test (e.g. location of the test case program)
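The elements above can be captured in a small data structure; this is my own illustrative sketch (the class and field names are hypothetical, not from the course):

```java
// One way to represent "input data + expected result" (names are mine).
class TestStep {
    final String input;     // stimulus sent to the implementation under test
    final String expected;  // oracle: the output the IUT should produce
    TestStep(String input, String expected) {
        this.input = input;
        this.expected = expected;
    }
}

class TestCase {
    final String name;      // identification, e.g. derived from the components under test
    final TestStep[] steps; // inputs and expected outputs may alternate
    TestCase(String name, TestStep... steps) {
        this.name = name;
        this.steps = steps;
    }
}
```

A driver would walk the steps in order, applying each input and comparing the actual output against the oracle.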
Test cases
Example: System test case for a coffee vending machine
SysTestMakeCapuccino:
  pressSelectionButton(“Capuccino”)
  display(“Capuccino”)
  insert(1.00)
  display(“1.00”)
  insert(0.25)
  display(“1.25”)
  makeCoffee(“Capuccino”)
  display(“making”)
  display(“Thanks”)
Input
Sequence of coin inserts is fully specified
Output
Output to hardware driver
Name should facilitate test maintenance
Test cases
A test case may aggregate several finer-grained test cases
• E.g. a regression test contains several individual tests
• A.k.a. “test suite”
Test cases may have precedence associations
• E.g. test units before testing subsystems
• The fewer the better
Test harnesses
A test harness includes test drivers, stubs, test case files, and other tools to support test execution
• Program or script
Implementation under test (IUT)
• System, component, or combination of components being tested
• Messages need to be sent between the IUT and its environment (actors or other parts of the system)
• Need to create substitutes for the environment of the IUT
Test stubs and drivers
A test driver calls the IUT services
• Pass inputs to the IUT (test data generators)
• Display and/or check results
A test stub simulates parts of the system that are called by the IUT
• Provides the same API as the real components
• Returns a value compliant with the result type
• E.g. application of the bridge pattern
[Diagram: the user interface depends on a Database interface; behind that interface, either the real Database implementation or a Database test stub can be plugged in]
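A minimal sketch of this substitution in Java follows; the `Database` interface and its method are hypothetical placeholders, not the course's actual API:

```java
// The interface shared by the real component and the stub (names are mine).
interface Database {
    String lookup(String key);
}

// Test stub: same API as the real component, returns a value
// compliant with the result type, no real database behind it.
class DatabaseStub implements Database {
    public String lookup(String key) {
        return "stub-value-for-" + key;
    }
}
```

The IUT receives a `Database` reference, so the stub can be swapped in without touching the IUT's code, in the spirit of the bridge pattern mentioned above.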
Test harnesses
Example harness structures
[Diagram 1: a TestDriver with methods testCase001(), testCase002(), testCase003(), ... exercises the ClassUnderTest, which is backed by TestStubs]
[Diagram 2: a TestDriver with runTestCases() holds many TestCase objects, each with run(CUT), exercising the CUT through TestStubs]
Testing activities
Testing activities
• Planning (selecting) test cases; reviewing the test plan
• Test harness development
• Applying (running, re-running) the tests
Tradeoff between cost and completeness
• “Good” results: accuracy, coverage
  Cannot cover all possible inputs (exhaustive testing)
• Time and budget limitations
Coverage metrics
• Modules / collaborations / subsystems
• Routines
• Statements / code
• Define-use combinations
Fault testing
The purpose of fault testing is to find faults (defects)
• Make the system fail (manifest failures)
• Locate the faults that caused the failure
Compare to manufacturing tests for circuits:
• Coverage of faults introduced by fabrication process
Levels of fault testing
• System testing: based on requirements
• Integration testing: based on design / module associations
• Unit testing: based on module interfaces and internals
Fault testing
Misconception: “My code is fully tested, therefore it has no bugs.”
Reality: “Program testing can be used to show the presence of bugs, but never to show their absence.” [Edsger Dijkstra]
Fault testing
Edsger Wybe Dijkstra 1930-2002
• A pioneer of structured and disciplined programming
• “Goto considered harmful”
• Key contributions to concurrency, system design, formal methods,software engineering, ...
Unit testing
Testing a component in isolation from others
• Focus on small units (system units vs. project units)
• Reduce the scope of fault finding
• Reduce the impact of changes entailed by fixing the fault
• Allows parallelism: independent testing of each component
Test case design techniques
• Black-box: equivalence testing, boundary testing
• Structural: path testing, dataflow testing, state-based testing, polymorphic bindings, ...
Equivalence testing
Select at least one test input from each equivalence class
Example (modified from [BD]): getNumDaysInMonth(int month, int year)
• Equivalence classes for each input
  3 classes of year: leap, non-leap, <= 0 (off range)
  5 classes of month: <= 0; 30-day; 31-day; February; > 12
• Equivalence classes for combinations of inputs
  Total 3 x 5 = 15 classes, including (leap, 30-day) and (non-leap, 30-day)
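Picking one representative per class combination could look like the sketch below. The implementation is my own correct reference version (note that the listing on the cyclomatic complexity slide is intentionally faulty), and the class name is hypothetical:

```java
// Reference implementation (mine, deliberately correct) for illustrating
// equivalence-class test selection.
class Calendar321 {
    static int getNumDaysInMonth(int month, int year) {
        // Off-range classes: year <= 0, month <= 0, month > 12
        if (year < 1 || month < 1 || month > 12)
            throw new IllegalArgumentException("out of range");
        boolean leap = (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
        switch (month) {
            case 4: case 6: case 9: case 11: return 30;  // 30-day class
            case 2: return leap ? 29 : 28;               // February class
            default: return 31;                          // 31-day class
        }
    }
}
```

Representative inputs then include (February, leap) = (2, 2000), (February, non-leap) = (2, 1999), (30-day, non-leap) = (6, 1999), (31-day, leap) = (7, 2000), and one input from each off-range class.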
Black-box testing
Partition the input data into equivalence classes
• Class = a set of combinations of inputs
• The classes cover all possible inputs
• The classes are disjoint
• Invalid inputs may also be regarded as separate classes (a slightly different twist from B&D)
[Venn diagram: the set of all inputs is partitioned into disjoint classes 1–6; some classes lie inside the valid-input region, the rest are invalid]
Input classes
Valid data
• By input ranges or values
  E.g. long string, short string
  E.g. integers: [-32768..-1] [0] [1..32767] [off-range pos] [off-range neg]
• By term (conjunct, disjunct) of the precondition
  E.g. (a >= 0) or (b = 2) generates three valid classes
• By output values
  E.g. input data that will generate positive output
  E.g. input data that will generate output out of the representable range
  E.g. input levels that would cause the alarm to sound
• By values of the object data
  E.g. default password
• By user-supplied commands, mouse picks
  One class for each possible command
• By physical parameters
Input classes
Invalid data
• Data that invalidates the precondition
• Data outside the represented range
• Physically impossible dataE.g. temperatures less than 0 Kelvin
Valid vs. invalid data
• If a precondition requires a range (e.g. year between 1 and 9999), then one valid and two invalid classes are defined
• If a precondition requires a set (e.g. month name), then one valid and one invalid class are defined
Boundary testing
Focuses on boundaries between equivalence classes
• Composite boundaries: corners
• “Bugs lurk in corners and congregate at boundaries.” – Boris Beizer
Example: getNumDaysInMonth(int month, int year)
• Test values 0, 1, 12, 13 for the month parameter
• Test values 0, 1, 1999, 2000, 2001 for the year parameter
Boundaries can be seen as classes on their own
• Which have boundaries of their own, and so on
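The month boundaries above can be made concrete with a validity predicate; the method below is my own sketch of the slide's precondition, not code from the course:

```java
// Precondition from the example: 1 <= month <= 12 (sketch, names are mine).
class MonthBoundary {
    static boolean validMonth(int month) {
        return month >= 1 && month <= 12;
    }
}
```

Boundary test values come in pairs that straddle each edge of the valid range: 0/1 at the lower edge and 12/13 at the upper edge. A fault such as writing `month > 1` instead of `month >= 1` is caught exactly by these pairs, not by interior values like 6.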
Equivalence and boundary partitioning
Limitation
• The number of classes grows exponentially with the number and complexity of inputs
What to do if there are too many equivalence or boundary classes
• Limit the number of criteria for partitioning at each level of test
  Mostly used in unit testing
• Cover individual classes but not all combinations of classes
  Example: classes to be considered to check validity of credit card expiry dates
Credit card expiry dates (month/year; representative values shown for range classes, e.g. 03 for months 02-11):

                        Month
  Year     00     01     02-11  12     13     large  99
  <=98     00/98  01/98  03/98  12/98  13/98  65/98  99/98
  99       00/99  01/99  03/99  12/99  13/99  65/99  99/99
  00       00/00  01/00  03/00  12/00  13/00  65/00  99/00
  01       00/01  01/01  03/01  12/01  13/01  65/01  99/01
  >=02     00/08  01/08  03/08  12/08  13/08  65/08  99/08
  50       00/50  01/50  03/50  12/50  13/50  65/50  99/50
Path testing
Popular coverage criteria
• Statement coverage: all or almost all statements
• Branch coverage: all or almost all condition outcomes
• Path coverage: all or almost all possible execution paths
Determine a set of test cases that produce good coverage
• Main test (“common case”, “best test”, etc.)
• Targeted tests
Testing tools
• Coverage
• Test derivation
Path testing
Example: count words in a sentence
public int wordCount(String sentence) {
    boolean inWord = false;
    int currentCount = 0;
    for (int i = 0; i < sentence.length(); i++) {
        if (sentence.charAt(i) != ' ') {
            if (!inWord) {
                currentCount++;
                inWord = true;
            }
        } else {
            inWord = false;
        }
    }
    return currentCount;
}
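A small test set achieves full branch coverage of this method; the code is the slide's `wordCount`, repeated here only so the snippet is self-contained:

```java
class WordCounter {
    // The slide's wordCount, unchanged.
    public static int wordCount(String sentence) {
        boolean inWord = false;
        int currentCount = 0;
        for (int i = 0; i < sentence.length(); i++) {
            if (sentence.charAt(i) != ' ') {
                if (!inWord) {
                    currentCount++;
                    inWord = true;
                }
            } else {
                inWord = false;
            }
        }
        return currentCount;
    }
}
```

The main test `"one two"` exercises the common case; targeted tests add `""` (zero loop iterations, a classic boundary) and `"  spaced  out  "` (the space branch before, between, and after words).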
Cyclomatic complexity
A measure of the complexity of control flow
• Flow graph
  Nodes represent statements, blocks, or condition terms
  Extra nodes: start, stop
  Edges represent flow of control
• Cyclomatic complexity = number of edges - number of nodes + 2
• CC gives an upper bound on the minimum number of tests necessary for complete branch coverage
Calculate CC by counting keywords
• Count 1 for each of if, and, or, for, while, case (or equivalents)
• Add 1
• Subtract 1 for each switch statement without a default clause
• Why/how does this work?
Variation: one node for each decision instead of condition term
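The keyword-counting rule can be sketched mechanically. The helper below is my own rough illustration, not a real tool: it matches keywords as plain substrings (so `"if"` inside an identifier would be miscounted), whereas a real CC tool tokenizes the source properly.

```java
// Crude cyclomatic-complexity estimate by keyword counting (sketch, mine).
class CCEstimator {
    static int estimate(String code) {
        // "and"/"or" appear in Java as && and ||; trailing spaces reduce
        // (but do not eliminate) false substring matches.
        String[] keywords = {"if ", "for ", "while ", "case ", "&&", "||"};
        int cc = 1; // the "add 1" step from the slide
        for (String k : keywords) {
            int idx = 0;
            while ((idx = code.indexOf(k, idx)) != -1) {
                cc++;                 // count 1 per decision keyword / condition term
                idx += k.length();
            }
        }
        return cc;
    }
}
```

Applied to `wordCount` above (one `for`, two `if`s, no boolean operators), the estimate is 3 + 1 = 4, matching edges - nodes + 2 on its flow graph.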
Cyclomatic complexity
Example [after BD] (contains several faults on purpose):
public static int getNumDaysInMonth(int month, int year)
        throws MonthOutOfBounds, YearOutOfBounds {
    int numDays;
    if (year < 1) {
        throw new YearOutOfBounds(year);
    }
    if (month == 1 || month == 3 || month == 5 || month == 7 ||
        month == 10 || month == 12) {
        numDays = 32;
    } else if (month == 4 || month == 6 || month == 9 ||
               month == 11) {
        numDays = 30;
    } else if (month == 2) {
        if (isLeapYear(year)) {
            numDays = 29;
        } else {
            numDays = 28;
        }
    } else {
        throw new MonthOutOfBounds(month);
    }
    return numDays;
}
Cyclomatic complexity
Example: flow graph (condition-based)
[Flow graph: start → "year < 1" → throw1; "month = 1 3 5 7 10 12" → n=32; "month = 4 6 9 11" → n=30; "month == 2" → "leap(year)" → n=29 or n=28; otherwise throw2; all assignment nodes flow to return]
Cyclomatic complexity
Example: flow graph [BD] (statement-based)
[Flow graph with one node per decision: [year < 1] → throw1; [month in (1,3,5,7,10,12)] → n=32; [month in (4,6,9,11)] → n=30; [month == 2] then [leap(year)] → n=29 or n=28; otherwise throw2; all paths flow to return]
Path testing
Limitations
• Shorter methods, polymorphism in object-oriented programs
  Integration testing becomes important for object-oriented testing
• Faults associated with violating invariants of data structures
  E.g. array out of bounds
• Corner and boundary cases
  E.g. zero iterations of a loop
Data-flow testing
Data states
• Defined: initialized, but not evaluated yet
  Declared: pointer allocated, but no valid value yet
• Used: value evaluated
• Killed: pointer freed
  In Java, killed = out of context
A use chain contains
• A statement where a variable is defined (assigned a value)
• A statement where that variable is used with that definition active
Data-flow testing
Define-use testing strategy:
• Each use chain must be tested at least once
• Guard against faults where the wrong value is used
• Usually results in more tests than complete condition coverage
Example:

    if (cond1) {
        x = 5;
    } else {
        x = 6;
    }
    if (cond2) {
        y = x + 1;
    } else {
        y = x - 1;
    }

All condition outcomes: 2
All use chains: 4
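The four use chains are the four (definition of x, use of x) pairs, one per (cond1, cond2) combination, so define-use coverage needs four tests where branch coverage needs only two. Wrapping the slide's fragment in a method (the wrapper is mine) makes this testable:

```java
class DefUse {
    // The slide's fragment, wrapped in a method so each chain can be driven.
    static int f(boolean cond1, boolean cond2) {
        int x, y;
        if (cond1) { x = 5; } else { x = 6; }   // two definitions of x
        if (cond2) { y = x + 1; } else { y = x - 1; }  // two uses of x
        return y;
    }
}
```

Two tests, e.g. (true, true) and (false, false), already cover all four condition outcomes, but leave the chains (x=5, x-1) and (x=6, x+1) untested.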
Testing objects
Challenge: a lot goes on behind the scenes
• More methods, less code in each method
• Inherited methods need to be retested
  Even if the method code hasn’t changed
  Use local overridden methods and fields
• Polymorphism may introduce inconsistencies
• Some methods affect the internal state (fields) of an object but do not return a value
Testing objects
Polymorphism
• All possible bindings should be identified and tested
  Method overriding
  Parameter inheritance
• If that is infeasible, at least test every method call and every parameter type
[Diagram: class hierarchy A with subclasses B, C and further subclasses D, E, F; method foo(X) with parameter hierarchy X, Y, Z; bindings to test include A–foo(X)–D, A–foo(Y)–D, ..., C–foo(Z)–F]
State-based testing
Observe not just return values but also the state of the object
• Instrumentation of class attributes, to see their values
• Test each transition from each state
  Use a sequence of inputs/messages to put the object in the desired state
  Use a representative set of stimuli for each transition from that state
• Example:
[State diagram of a watch: states MeasureTime, SetTime, DeadBattery; transitions: 1, 2. pressButtonL / pressButtonR within MeasureTime and SetTime; 3. pressButtonsLAndR from MeasureTime to SetTime; 4. after 2 min. from SetTime back to MeasureTime; 5. pressButtonsLAndR / beep from SetTime back to MeasureTime; 6. further button presses; 7, 8. after 20 years to DeadBattery]
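The watch from the state diagram can be sketched as below; the state and event names follow the diagram, but the method names and bodies are my own guesses at its intent:

```java
// Hypothetical sketch of the watch under state-based test (bodies are mine).
class Watch {
    enum State { MEASURE_TIME, SET_TIME, DEAD_BATTERY }
    State state = State.MEASURE_TIME;  // instrumented attribute, visible to the test

    void pressButtonsLAndR() {
        if (state == State.MEASURE_TIME) state = State.SET_TIME;      // transition 3
        else if (state == State.SET_TIME) state = State.MEASURE_TIME; // transition 5 (/beep)
    }
    void afterTwoMinutes() {
        if (state == State.SET_TIME) state = State.MEASURE_TIME;      // transition 4
    }
    void afterTwentyYears() {
        state = State.DEAD_BATTERY;                                   // transitions 7, 8
    }
}
```

A state-based test first drives the object into SET_TIME with `pressButtonsLAndR()`, then exercises each outgoing transition (timeout, button pair, battery death) and checks the instrumented `state` field.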
Testing common faults
Target commonly encountered faults
• Use inspection checklists
Example: GUI testing
• Windows not opening properly
• Window cannot be resized, scrolled, or moved
• Window does not properly regenerate after overwriting
Generic examples
• Too little data (or no data)
  E.g. zero or one lines for a sorting routine
• Too much data
• The wrong size of data
  E.g. integer input that exceeds the representable range
• Uninitialized data
• Compatibility with old data
Integration testing
Integration testing
• Focus on groups of components
• Tradeoff
  Test coverage (more groups)
  Effort to build drivers and stubs
  Effort to locate a fault in a group
• Tradeoff impacted by the selection and ordering of groups
Why test several components together after we have tested each of them individually?
• Impossible to test all input combinations
  Test stubs and drivers are only approximations of real component behaviors
• Interface mismatches
  E.g. caller’s parameter list differs from callee’s
Big-bang integration testing
Perform system testing right after unit testing
Pros:
• Simple
• Low cost of test set-up
• Can detect a few obvious failures quickly
Cons:
• Difficult to locate and fix each fault
• Difficult to detect more obscure failures
=> a more elaborate approach is needed

[Diagram: units Bot, Target, Top are combined directly into System]
Layers
Top layer
• User/actor interface objects
• Close to user, visible
• Needs thorough testing
Bottom layer
• System drivers
• Communications
• Part of the output to user
Middle (target) layer
• Application objects
• Use services of bottom layer
• Offer services to top layer
[Diagram: three layers: User interface (top), Application logic (middle/target), System drivers (bottom)]
Bottom-up testing
Integrates components from the bottom layer first
Pros:
• Fewer test stubs necessary
• Interface faults tend to be more easily found
Cons:
• UI is tested last, and least
• More likely to contain faults immediately visible to the user

[Diagram: Bot → Bot + Target → System, with Top integrated last]
Top-down testing
Integrates components from the top layer first
Pros:
• No special test drivers needed
• Favors testing of user interface components
Cons:
• Test stubs are costly to create and error prone; many are needed
• Test stubs are imperfect

[Diagram: Top → Top + Target → System]
Sandwich testing
Strategy
• Top layer and bottom layer are integrated incrementally with components of the target layer
• This can be done separately for top and bottom layers
Pros
• No need to write drivers and stubs for top and bottom layers
• Proper emphasis on testing UI components
Cons
• Does not thoroughly test the target layer before integration
Modified sandwich testing
Strategy
• Individual layer tests
  Top layer with stubs for target layer components
  Bottom layer with drivers replacing the target layer
  Target layer with bottom layer stubs, and drivers replacing the top layer
• Top layer + target layer + stubs
• Target layer + bottom layer + drivers
• Reuse target layer test drivers and stubs
Pros
• Lots of parallelism
• Allows early testing of user interface components
• Saves time by more coverage with the same stubs
Cons
• Many stubs and drivers need to be developed
(Modified) sandwich testing
Sandwich:
[Diagram: Top and Bot are each combined incrementally with Target (Top + Target, Bot + Target), then merged into System]
Modified:
[Diagram: Top, Target, and Bot are first tested individually, then combined into Top + Target and Bot + Target, then merged into System]
System testing
Types of system testing
• Functional testing
• Non-functional (performance) testing
• Pilot testing
• Acceptance testing
• Installation testing
Functional system testing
Interactive test cases
• Exact input and output should still be specified
Test cases selected from the use case model
• Coverage targeted to use cases
  Knows nothing about design; complementary to unit and integration testing
  But can guess at future design possibilities, likely to uncover faults
Focuses on mismatches between system and specification
• Does not test the appropriateness of the specification
  No need to prototype
Black-box techniques in system testing
Incomplete coverage of equivalence classes or boundaries
• Common and exceptional use cases
  Common usage scenario: equivalence testing
  Exceptional conditions: boundary testing
• For each use case:
  Principal flow (common usage)
  Secondary flows (exceptions)
Example: e-mail program
• Common usage use cases: send/receive
• Exceptional condition use cases: network down
• Receive use case principal flow: download mail, select message, ...
• Receive use case secondary flow: close program and recovery
• Incomplete coverage: not all combinations of user & network actions
Structural techniques in system testing
Target elements of a use case flow
• States, transitions
• Similar to object testing
• Focus on the interface only; ignore the internals
Example [BD, pp. 360-361]
Non-functional testing
Stress testing: many simultaneous requests
• E.g. number of simultaneous accesses
Volume testing: large sizes
• Large data sizes
• Unusually complex data
• Hard disk fragmentation
Timing testing:
• Latency, throughput, averages, ...
Non-functional testing
Security testing:
• “Tiger teams”
• Formally defined privacy properties
Recovery testing: recover from errors, resource or system failures
• System perturbers (memory occupation, etc.)
• Consistency of data after hardware or network failures
User tests
Pilot testing
• Use field data, exercised by users
• Alpha tests: in the development environment
• Beta tests: a limited number of end users
• Not very thorough because they are non-systematic
Acceptance testing
• Benchmark tests: typical conditions
• Competitor and shadow tests: back-to-back
Installation testing
• Ease of reconfiguration from the development environment to the target environment
• Repeats the test cases from functional and performance testing
Usability testing
Tests the software specification (SRS, especially use case model) against the users’ expectations
• Does the user’s mental model match the actual system image?
• Let the user explore the system or part of the system
• Collect statistics on the user’s response
  Time to complete a task
  Rate of user errors
  Ease of learning
Levels of usability testing
Scenario tests
• Present the users with a visionary scenario
• Watch for reactions: understanding, match to the work flow, subjective opinions
Prototype tests
• Vertical: per use case (especially the critical use cases)
• Horizontal: user interface, high technical risk, etc.
• Instrument the code to profile user actions and collect operational statistics
• Limitations: can be expensive; tempting to transform the prototype into the system
Product tests
• “Dog and pony show”
• Focus on validation
• Limitation: needs the product to be developed first
Collecting usage statistics
Example design using Observers and event passing:
[Class diagram: a Source (register(Listener), unregister(Listener), cast(Event)) notifies registered Listeners (handle(Event)); cast creates a new Event e and calls l.handle(e) for each registered Listener l. A ConcreteListener reads the event state (listenerState = crtEvent.getState()) and forwards it to a history (history.collectEvent(crtEvent)) kept by a Statistics class (collectEvent(Event), deleteEvent(Event)); ConcreteStatistics computes measures (computeMeasure(), getMeasure()). Event carries a copyState accessed via getState().]
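A minimal sketch of this design follows; the class names come from the diagram, but the method bodies and simplifications (a single concrete listener, a plain list as the history) are my own:

```java
import java.util.ArrayList;
import java.util.List;

// Event carries a snapshot of the source's state.
class Event {
    private final String copyState;
    Event(String state) { copyState = state; }
    String getState() { return copyState; }
}

interface Listener {
    void handle(Event e);
}

// Source broadcasts events to registered listeners.
class Source {
    private final List<Listener> listeners = new ArrayList<>();
    void register(Listener l) { listeners.add(l); }
    void unregister(Listener l) { listeners.remove(l); }
    void cast(Event e) {
        // "create new Event e; for each registered Listener l: l.handle(e)"
        for (Listener l : listeners) l.handle(e);
    }
}

// Statistics collects the event history for later measure computation.
class Statistics implements Listener {
    final List<Event> history = new ArrayList<>();
    public void handle(Event e) {
        history.add(e);  // history.collectEvent(crtEvent)
    }
}
```

During a usability test, each user action is cast as an Event; afterwards, a ConcreteStatistics subclass would walk the history to compute measures such as time per task or error rate.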
Usability testing
Usability testing includes not just test design, but also:
• Development of test objectives
• Selection of a representative sample of users
• Use of the system in work environment
• Debriefing: interrogation and probing of users
• Collection of quantitative data and recommendations of improvement
• Interpretation and aggregation of raw data
  Usability metrics
Managing testing
Develop test cases (plan tests) as early as possible
• SRS – system tests
• DD – integration tests & black-box unit tests
• Code – structural unit tests
• E.g. V-model of software life cycles
  Pro: an opportunity to think of likely faults, leading to fault prevention
  Con: maintenance of test cases, drivers and stubs
Parallelize tests
• E.g. [BD p. 364]
• Test batches may need to wait for defect fixes
• Start each as early as possible
• Few dependencies between test cases
Managing testing
Test documentation
• Test plan [B&D, p. 365]
• Test case specifications [B&D, p. 366]
  Each test case is a fully-detailed scenario
  Includes inputs and expected outputs
• Reports of test incidence and failures discovered
Responsibilities
• Separate QA team (up to one tester per developer)
• Swap development and testing roles
References
Unit, integration, and system testing:
• BD: 9.1-9.3, 9.4.2-9.4.4, 9.5
• McConnell: ch. 25
• Sommerville: ch. 20
Usability testing:
• BD: 4.5.3
• Sommerville: 15.5