Incorporating Historical Test Case Performance Data and Resource Constraints into Test Case Prioritization
Mohammad Abdollahi Azgomi
Department of Computer Engineering, Iran University of Science and Technology
Test and Proof (TAP’09), ETH Zurich, Switzerland, July 2-3, 2009

Feb 24, 2016


Transcript
Page 1: Mohammad  Abdollahi  Azgomi Department of Computer Engineering


Incorporating Historical Test Case Performance Data and Resource Constraints into Test Case Prioritization

Mohammad Abdollahi Azgomi
Department of Computer Engineering
Iran University of Science and Technology

Test and Proof (TAP’09), ETH Zurich, Switzerland, July 2-3, 2009

Page 2:

Outline

Context and Motivation

Software Regression Testing, Its Problems and Techniques

Test Case Prioritization and Its Types

History-Based Test Case Prioritization

Kim and Porter’s History-Based Prioritization Approach and Its Flaws

Proposed Equation for History-Based Prioritization

Evaluation

Conclusion and Future Work

Page 3:

Motivation: Why HB Prioritization?

Local information alone is not sufficient for long-term effective regression testing.

Regression testing is not a one-time activity but a continuous process. History-based prioritization:

Prevents losing important information.
– With only local program-change information, a change is tested once, giving a single chance to reveal a fault.

Prevents misinterpreting results.
– Ignoring the effect of regression-test frequency on effectiveness severely limits the applicability of results.

Holds improvement opportunities.
– Analyzing historical data reveals test dependencies → further test-suite reduction.

Matches real test-environment conditions.
– Test environments have time and resource constraints.

Page 4:

Software Maintenance Problem

Software is constantly modified
– Bug fixes
– Addition of functionality

After changes, regression testing re-runs the test cases in the test suite
– Provides confidence that the modifications are correct
– Helps find (unintended) new faults

The number of test cases is large and continues to grow
– Weeks/months to run the entire test suite
– Many test cases are broken, obsolete or redundant
– Costs are high: about half the cost of maintenance

Page 5:

Terminology - I

Regression fault: a fault revealed by a test case that previously passed but no longer passes.

Test case: a test-related item that contains the following information: (1) a set of test inputs, (2) execution conditions, (3) expected outputs.

Test suite: a group of related tests that are associated with a database and are usually run together.

Test requirement (TR): specific elements of software artifacts that must be satisfied or covered (a test goal).

Page 6:

Terminology - II

Coverage: a test requirement tr in TR is covered if and only if at least one test t in test set T satisfies tr.

Test adequacy: given a set of test requirements TR for coverage criterion C, test adequacy is achieved (test set T satisfies C coverage) if and only if for every test requirement tr in TR there is at least one test t in T such that t satisfies tr.
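The adequacy definition above can be sketched in a few lines of Python. The `satisfies` map, from each test to the requirements it covers, is a hypothetical illustration, not part of the original material:

```python
# Sketch of the test-adequacy definition: test set T satisfies coverage
# criterion C iff every test requirement tr in TR is satisfied by at
# least one test t in T. The `satisfies` map is a made-up example.
def is_adequate(TR, T, satisfies):
    """Return True iff every requirement in TR is covered by some test in T."""
    covered = set()
    for t in T:
        covered |= satisfies.get(t, set())
    return all(tr in covered for tr in TR)

TR = {"r1", "r2", "r3"}
satisfies = {"t1": {"r1", "r2"}, "t2": {"r3"}}
print(is_adequate(TR, ["t1", "t2"], satisfies))  # True: every tr covered
print(is_adequate(TR, ["t1"], satisfies))        # False: r3 uncovered
```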

Page 7:

Outline

Context and Motivation

Software Regression Testing, Its Problems and Techniques

Test Case Prioritization and Its Types

History-Based Test Case Prioritization

Kim and Porter’s History-Based Prioritization Approach and Its Flaws

Proposed Equation for History-Based Prioritization

Evaluation

Conclusion and Future Work

Page 8:

Defining Regression Testing

Problem (Rothermel et al.): given a program P, its modified version P’, and a test set T that was used to previously test P, find a way to utilize T to gain sufficient confidence in the correctness of P’.

Regression testing is a software maintenance task performed on a modified program to instill confidence that:
– the changes are correct, and
– they have not adversely affected unchanged portions of the program (the program does not “regress”).

In other words: rerunning test cases that the program has previously executed correctly, in order to detect errors spawned by changes or corrections made during software development and maintenance.

Page 9:

Regression Testing Place

Page 10:

Regression Test’s Problems

Old test cases are rarely put aside.

As software evolves, regression testing and its costs grow.

Re-running all test cases is costly and often infeasible due to time and resource constraints.
– An industrial report: 7 weeks to rerun all test cases for a product of about 20,000 LOC!

Arbitrarily (randomly) putting aside some test cases severely threatens the software product’s validity.

Regression testing’s challenge: how to select an appropriate subset of the existing test suite, each time regression testing occurs, in order to meet test goals at the fastest rate possible?

Page 11:

Regression Test’s Main Techniques

Retest all
– Simply re-execute all tests: costly, often infeasible.

Regression Test Selection
– Select an appropriate subset of the existing test suite, based on information about the program, the modified version, and the test suite → safety-cost trade-off.

Test Suite Reduction (Minimization)
– Reduce a test suite to a minimal subset that maintains coverage equivalent to the original test suite with respect to a particular test adequacy criterion → size reduction vs. fault-loss trade-off.

Test Case Prioritization
– Order test cases so that those with the highest priority, according to some criterion, are executed earlier → suits time and resource constraints.

Page 12:

Outline

Context and Motivation

Software Regression Testing, Its Problems and Techniques

Test Case Prioritization and Its Types

History-Based Test Case Prioritization

Kim and Porter’s History-Based Prioritization Approach and Its Flaws

Proposed Equation for History-Based Prioritization

Evaluation

Conclusion and Future Work

Page 13:

Test Case Prioritization Problem

Given: T, a test suite; PT, the set of permutations of T; and f, a function from PT to the real numbers.

Problem: find T’ ∈ PT such that (∀T”)(T” ∈ PT)(T” ≠ T’) [f(T’) ≥ f(T”)].

PT is the set of all possible prioritizations (orderings) of T, and f is a function that assigns an award value to any such ordering.

Reducible to the 0/1 knapsack problem → NP-hard, intractable, with no exact practical solution → all existing prioritization techniques are heuristics.
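For a tiny suite the problem can be solved exactly by enumerating all orderings, which makes the factorial blow-up — and hence the need for heuristics — concrete. The award function below is a toy assumption (reward running historically failing tests early), not one from this presentation:

```python
# Brute-force solution of the prioritization problem for a tiny suite:
# enumerate all permutations PT of T and keep the one maximizing f.
from itertools import permutations

def best_ordering(T, f):
    """Exact but O(n!) -- why real prioritization techniques are heuristics."""
    return max(permutations(T), key=f)

# Toy award function: earlier positions get larger weights, and only
# historically failing tests (a hypothetical set) contribute.
failed = {"t2", "t4"}
def award(order):
    n = len(order)
    return sum(n - i for i, t in enumerate(order) if t in failed)

best = best_ordering(["t1", "t2", "t3", "t4"], award)
print(best[:2])  # the two historically failing tests come first
```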

Page 14:

Test Case Prioritization Goals

Increase the rate of fault detection.

Increase the rate of detection of high-risk faults.

Increase the likelihood of revealing regression errors related to specific code changes earlier.

Increase coverage of coverable code in the system under test at a faster rate.

Increase confidence in the reliability of the system under test at a faster rate.

Page 15:

Outline

Context and Motivation

Software Regression Testing, Its Problems and Techniques

Test Case Prioritization and Its Types

History-Based Test Case Prioritization

Kim and Porter’s History-Based Prioritization Approach and Its Flaws

Proposed Equation for History-Based Prioritization

Evaluation

Conclusion and Future Work

Page 16:

Prioritization Techniques’ Categories

Code-Based Prioritization (CBP)
– Mostly based on code changes and test execution profiles.
– Aims to cover changed parts of the code at the fastest rate.

Model-Based Prioritization (MBP)
– Software or system specifications → a model or formal description.
– Prioritizes based on information collected during model execution.

History-Based Prioritization (HBP)
– Models regression testing as an ordered sequence of testing sessions.
– Uses historical test execution data to estimate the current order.

Page 17:

Non-HB Prioritization Techniques’ Drawbacks

They consider regression testing a one-time (memoryless) activity rather than a continuous, long-lived process performed each time the code changes during maintenance.

They do not take real-world time and resource constraints into consideration.

They ignore the fact that regression testing is an ordered sequence of testing sessions, each of whose performance may depend on prior sessions, and each of which is subject to time and resource constraints.

Page 18:

Outline

Context and Motivation

Software Regression Testing, Its Problems and Techniques

Test Case Prioritization and Its Types

History-Based Test Case Prioritization

Kim and Porter’s History-Based Prioritization Approach and Its Flaws

Proposed Equation for History-Based Prioritization

Evaluation

Conclusion and Future Work

Page 19:

Kim and Porter’s HB Prioritization Approach

Each time regression testing occurs:

Step 1: Select a subset T’ from the original test set T.

Step 2: Calculate a selection probability P_tc,t(H_tc, α) for each test case tc ∈ T’ at time t, where H_tc = {h1, h2, …, ht} is a set of t time-ordered observations drawn from previous executions of tc. P_tc,t(H_tc, α) is computed as:

P0 = h1
Pk = α·hk + (1 - α)·Pk-1,  k ≥ 1,  0 ≤ α < 1

Step 3: Draw a test case from T’ using the probabilities assigned in step 2, and run it.

Step 4: Repeat step 3 until testing time is exhausted.

The calculated values are probabilities.
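Step 2’s recurrence is an exponentially weighted moving average over each test case’s history. A minimal sketch (variable names are ours):

```python
# P0 = h1 ; Pk = alpha*hk + (1 - alpha)*P(k-1), 0 <= alpha < 1.
# Larger alpha weights recent sessions more heavily.
def selection_probability(history, alpha):
    """history: time-ordered 0/1 observations h1..ht for one test case."""
    p = history[0]                       # P0 = h1
    for h in history[1:]:
        p = alpha * h + (1 - alpha) * p  # Pk from P(k-1)
    return p

# Recent failures outweigh equally many old ones (recency weighting):
print(selection_probability([0, 0, 1, 1], 0.5))  # 0.75
print(selection_probability([1, 1, 0, 0], 0.5))  # 0.25
```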

Page 20:

[Flow diagram] Test set T → Regression Test Selection → Test set T’ → associate selection probabilities with each tc based on previous execution data → draw the tc with the highest selection probability → Test Case Execution.

Page 21:

Different Test History Definitions

Execution history: for every testing session i in which test case tc is executed, hi takes the value 0; otherwise it takes the value 1.
– Net effect: cycle through all test cases over multiple testing sessions.

Demonstrated fault detection effectiveness: for every testing session i in which test case tc passed, hi takes the value 0; otherwise it takes the value 1.
– Net effect: (1) limits the running of test cases that rarely, if ever, reveal faults; (2) test cases whose failures are related to unstable sections of code continue to be selected until that code stabilizes.

Coverage of program entities: give higher priority to test cases that cover functions infrequently covered in past testing sessions.
– Net effect: limits the possibility that any particular function goes unexercised for long periods of time.

Page 22:

Flaws of Kim and Porter HBP Approach

Using hk to determine the selection probability — with only the two values 0 and 1, and based only on the most recent execution of each test case — is not an appropriate criterion for building an execution history for the test cases.

Increasing a test case’s selection probability based only on:
– whether or not it was executed in the most recent session, or
– whether it exposed a fault in the most recent session,
will not produce an efficient ordering in history-based prioritization.

The number of regression test sessions in which a test case was executed (ec) and the number of sessions in which it revealed fault(s) (fc) should not be taken separately; together, as the single factor fc/ec, they show the test case’s historical performance.

Page 23:

Outline

Context and Motivation

Software Regression Testing, Its Problems and Techniques

Test Case Prioritization and Its Types

History-Based Test Case Prioritization

Kim and Porter’s History-Based Prioritization Approach and Its Flaws

Proposed Equation for History-Based Prioritization

Evaluation

Conclusion and Future Work

Page 24:

Solution: Proposed HBP Equation

Determine the priority of each test case tc in each regression test session based on three factors:
1. Historical fault detection effectiveness of tc during past test sessions.
2. Execution history (the period in which tc has not been executed).
3. Previous priority of tc.

Net effect of considering these factors in tc’s history-based priority:
– Corroborates the test case’s priority based on its demonstrated effectiveness with respect to fault detection, and
– Cycles through all test cases during long regression-test runs, preventing any test case in the suite from becoming obsolete.

Page 25:

1. Historical Fault Detection Effectiveness

Executing a test case weakens its priority in the next regression test session.

Test cases that are more effective with respect to fault detection should return to the set of executed test cases faster than others.

The historical fault detection effectiveness factor is more corroborative for the priority of test cases that were more effective at detecting faults during past test sessions.

Test case | Nom of tc’s executions | Nom of tc’s fault detections | Fault detection effectiveness
1 | 20 | 4 | 0.2
2 | 12 | 3 | 0.25

Page 26:

1. Historical Fault Detection Effectiveness

In the kth execution of the regression test (the software has been modified k times up to now):

fck: the number of times the execution of test case tc has failed (revealed fault(s)) up to now:
fck = Σ (i = 1 .. k-1) fi, where fi = 1 if tc revealed fault(s) in test session i, and 0 otherwise.

eck: the number of executions of tc up to now:
eck = Σ (i = 1 .. k-1) ei, where ei = 1 if tc was executed in test session i, and 0 otherwise.

The relation between each test case’s priority and its fault detection performance in the kth execution:
PRk ∝ HFDEk, where HFDEk = fck / eck (and HFDE0 = 0).
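A small sketch of these running counts, fed with the two example test cases from the previous slide (20 executions with 4 failures, and 12 with 3); the 0/1 list encoding is ours:

```python
# fc_k = sum of f_i, ec_k = sum of e_i over past sessions;
# HFDE_k = fc_k / ec_k, defined as 0 before the first execution.
def hfde(f_history, e_history):
    """f_history[i] = 1 if tc revealed fault(s) in session i, else 0;
    e_history[i] = 1 if tc was executed in session i, else 0."""
    fc, ec = sum(f_history), sum(e_history)
    return fc / ec if ec else 0.0

print(hfde([1] * 4 + [0] * 16, [1] * 20))  # 4/20 = 0.2
print(hfde([1] * 3 + [0] * 9,  [1] * 12))  # 3/12 = 0.25
print(hfde([], []))                        # 0.0 (never executed)
```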

Page 27:

2. Execution History (Period of Non-execution)

In the context of operating systems and process scheduling there is the problem of process starvation.

Starvation: during the execution of various processes in a system, some process is not selected for execution for a long time.

Solution: the job scheduler assigns a counter to each process that records the number of times the process has not been selected, and uses it to increase the process’s priority.

Similarly, we consider the period of time during which a test case has not been executed.

The execution history factor is more corroborative for the priority of test cases that have not been executed for a long time during past test sessions.

Page 28:

[Figure: existing test cases vs. executable test cases across test sessions 1-4, with the fault-revealing test case highlighted.]

Page 29:

2. Execution History (Period of Non-execution)

The relationship between a test case’s priority and its execution history in the kth execution:

Each time a test case is not executed, its execution history is increased by one. Once the test case is executed, its execution history becomes 0 and the process repeats.

This ensures that no test case remains unexecuted for a long time, so the corresponding faults will be revealed.

PRk ∝ hk, where h0 = 0 and
hk = 0 if tc was executed in test session k-1,
hk = hk-1 + 1 otherwise.

Page 30:

3. Previous Priority of Test Case

Reasons for using this factor:
– It makes the selection of test cases smoother across successive executions of the regression test: it limits severe changes in which test cases are selected in each run with respect to the previous run.
– When test cases have the same historical fault detection effectiveness and the same execution history, another factor is needed to establish a proper priority between them.

The relationship between a test case’s priority and its previous priority in the kth execution: PRk ∝ PRk-1.

The previous-priority factor is more corroborative for test cases that had high priority in the most recent test session.

Page 31:

Proposed HBP Equation

PR0 = percentage of code coverage of the test case
PRk = α·HFDEk + β·PRk-1 + γ·hk,  k ≥ 1,  0 ≤ α, β, γ < 1

where:
HFDEk = fck / eck (HFDE0 = 0)
h0 = 0; hk = 0 if tc was executed in test session k-1, otherwise hk = hk-1 + 1

The calculated values are priorities.
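A minimal sketch of the whole equation over a per-test-case session log. The constants follow the guidance on the next slide (α and β between 0.5 and 1, γ close to 0), but the specific values and the session log are illustrative assumptions:

```python
# PRk = alpha*HFDEk + beta*PR(k-1) + gamma*hk, with
# HFDEk = fck/eck (0 if never executed) and hk counting sessions
# since the last execution. PR0 is the code-coverage percentage.
def priority_after(sessions, pr0, alpha=0.7, beta=0.7, gamma=0.05):
    """sessions: oldest-first list of (executed, failed) 0/1 pairs for one
    test case. Returns the priority for the next regression session."""
    pr, h, fc, ec = pr0, 0, 0, 0
    for executed, failed in sessions:
        fc += failed
        ec += executed
        h = 0 if executed else h + 1   # reset on execution, else grow
        hfde = fc / ec if ec else 0.0
        pr = alpha * hfde + beta * pr + gamma * h
    return pr

# A consistently failing test case ends up ranked above a passing one:
print(priority_after([(1, 1), (1, 1)], pr0=0.5) >
      priority_after([(1, 0), (1, 0)], pr0=0.5))  # True
```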

Page 32:

Notes About Proposed HBP Equation

α, β and γ are smoothing constants that control the effect of the three factors on test case prioritization.

fck/eck lies between 0 and 1, and PRk-1 is a small real number, often near 1, so the effect of hk must be controlled so that it does not mask the other factors by mistake. γ must therefore be smaller than α and β, close to 0. It is preferable to set α and β to values between 0.5 and 1.

Difference from Kim and Porter:
– Pk in the Kim and Porter HBP approach is the selection probability of each test case in the kth execution.
– PRk in the proposed HBP equation is the test case’s priority in the kth execution.

Under time and resource constraints, a sufficient number of test cases are executed, beginning from the highest priorities.

Page 33:

Process of proposed HBP approach

Page 34:

Outline

Context and Motivation

Software Regression Testing, Its Problems and Techniques

Test Case Prioritization and Its Types

History-Based Test Case Prioritization

Kim and Porter’s History-Based Prioritization Approach and Its Flaws

Proposed Equation for History-Based Prioritization

Evaluation

Conclusion and Future Work

Page 35:

Evaluation of Proposed HBP Approach

Benchmark Programs:
– 7 programs of the Siemens suite and the Space program.

Coverage Types and Tools:
– Branch (decision) coverage.
– All-uses coverage: ATAC tool.

Test Suites:
– 1000 branch-coverage-adequate test suites (testplans-bigcov).

Faulty Versions:
– 29 multi-fault versions of the Siemens programs and Space, for 29 regression test executions.

Evaluation Metric: APFD (with respect to fault detection rate).

Comparison: against random ordering and the Kim and Porter HBP approach.

Page 36:

Benchmark Programs: Siemens and Space

Program Name | Description | LOC | Nom of test cases | Average test suite size | Multi-fault versions
print-tokens | Lexical analyzer | 402 | 4130 | 16 | 7
print-tokens2 | Lexical analyzer | 438 | 4115 | 12 | 10
replace | Pattern recognition | 516 | 5542 | 19 | 32
schedule | Priority scheduler | 299 | 2650 | 8 | 9
schedule2 | Priority scheduler | 297 | 2710 | 8 | 10
tcas | Collision avoidance | 138 | 1608 | 6 | 41
tot-info | Statistics computing | 346 | 1052 | 7 | 23
space | ADL language interpreter | 6218 | 13585 | 155 | 35

Page 37:

Types of Test Suite Coverage in Experiments

Branch (decision) coverage: either the True or False branch of a decision
– if, if-else, while, do-while, for, switch, and entry of functions with no decisions.

All-uses coverage: all uses of a definition → ATAC tool.

Branch instrumentation of a piece of replace program code:

if (*j >= maxset) {
    fprintf(fp, "bT1,");
    result = false;
} else {
    fprintf(fp, "bF1,");
    outset[*j] = c;
    *j = *j + 1;
    result = true;
}

Original code:

if (*j >= maxset)
    result = false;
else {
    outset[*j] = c;
    *j = *j + 1;
    result = true;
}

Page 38:

Types of Test Suite Coverage in Experiments

Test Suite Name | Coverage Type | Number
testplans-bigcov | Branch Coverage Adequate | 1000
testplans-cov | Branch Coverage Adequate | 1000
testplans-bigdu | Def-use Coverage Adequate | 1000
testplans-du | Def-use Coverage Adequate | 1000
rand-covsize | — | 1000

Page 39:

Faulty Versions of Programs

Siemens programs and Space:
– Siemens: hand-seeded faults.
– Space: a real, big program (about 11 KLOC) with real faults.

Creating multi-fault versions:
– Single-fault versions were created for each program.
– Multi-fault versions were composed of non-interfering single faults.
– For each program, 29 multi-fault versions were chosen for 29 runs of the regression test.

[Flow diagram] In each regression test session: 1000 test suites → prioritize all by the proposed approach and, separately, by the random-ordering approach → two sets of 1000 prioritized test suites → compare the test suites pairwise.

Page 40:

APFD: Prioritization Techniques Comparison

Comparison of prioritization techniques with respect to fault detection: the APFD metric.

APFD: Average Percentage of Faults Detected over a test suite’s lifetime.

APFD = 1 - (TF1 + TF2 + … + TFm) / (n·m) + 1/(2n)

n: number of test cases
m: number of faults
TFi: position of the first test case in the ordered test suite that reveals fault i
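The metric is easy to compute once the TFi positions are known; a short sketch with a made-up example:

```python
# APFD = 1 - (TF1 + ... + TFm) / (n*m) + 1/(2n), where TFi is the 1-based
# position of the first test in the ordering that reveals fault i.
def apfd(tf_positions, n):
    m = len(tf_positions)
    return 1 - sum(tf_positions) / (n * m) + 1 / (2 * n)

# 10 tests; three faults first revealed by the tests at positions 1, 2, 4:
print(round(apfd([1, 2, 4], 10), 4))  # 0.8167
```

Higher values mean faults are revealed earlier in the ordering, which is exactly what the boxplot comparisons in the following slides measure.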

Page 41:

Display Experiment Results: Boxplot Diagrams

Box plot diagrams help to (1) statistically analyze results, (2) observe differences between experiments, and (3) visualize the empirical results in test case prioritization studies. Box plots were created with SAS 9.1.3.

How to interpret a boxplot diagram:
– Lower quartile of the data → Q1
– Median of the data → Q2
– Upper quartile of the data → Q3

The higher the box plot is placed, the faster the prioritization technique reveals faults. The tighter the box, the more stable the technique’s behavior with respect to fault detection.

Page 42:

Exp 1: Proposed HBP Approach vs. Random Ordering

Experiments: compare the proposed approach against random ordering with respect to faster fault detection (APFD values for prioritizing 1000 test suites).

For each Siemens program and the Space program.

Percent of branch coverage was used as the initial ordering of test cases.

For each program, the regression test was repeated 29 times (on 29 multi-fault versions).

For better comparison, a diagram for all regression test executions (all 29 runs) was plotted. Each pair of same-color boxplots compares the proposed HBP approach with random-ordering test case prioritization.

Page 43:

Exp 1: Proposed HBP Approach vs. Random Ordering

For each program, in each pair of same-color boxes: proposed HBP approach on the left, random-ordering approach on the right.

Initial prioritization: percent of branch coverage of test cases (control-flow criterion). Time and resource constraints: only 70% of the prioritized suite was executed.

All-versions diagram.

Page 44:

Exp 2: Proposed HBP Approach vs. Random Ordering (different initial ordering criterion)

For each program, in each pair of same-color boxes: proposed HBP approach on the left, random-ordering approach on the right.

Initial prioritization: percent of all-uses coverage of test cases (data-flow criterion). Time and resource constraints: only 70% of the prioritized suite was executed.

All-versions diagram.

Page 45:

Exp 3: Proposed HBP Approach vs. Kim and Porter HBP Approach

For each program, in each pair of same-color boxes: proposed HBP approach on the left, Kim and Porter HBP approach on the right.

Initial prioritization: percent of branch coverage of test cases (control-flow criterion). Time and resource constraints: only 30% of the prioritized suite was executed.

All-versions diagram.

Page 46:

Outline

Context and Motivation

Software Regression Testing, Its Problems and Techniques

Test Case Prioritization and Its Types

History-Based Test Case Prioritization

Kim and Porter’s History-Based Prioritization Approach and Its Flaws

Proposed Equation for History-Based Prioritization

Evaluation

Conclusion and Future Work

Page 47:

Proposed HBP Approach Specifications

In the proposed HBP approach, three factors determine a test case's execution priority in each regression test session:

1. Historically demonstrated fault detection performance over the regression test lifetime

2. The test case's priority in the previous regression test session

3. The duration for which the test case has not been executed

The proposed HBP approach uses these three factors directly in prioritization. In each test session, priority values are computed for all test cases. Subject to resource and time constraints, test cases are executed starting from the highest priorities, as far as resources allow (serving as a solution for regression test selection techniques).
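As a rough illustration of the session loop, the three factors could be combined linearly. This is a hypothetical sketch, not the talk's exact equation: the function names, the linear form, and the weights alpha, beta, and gamma are all assumptions.

```python
def priority(fault_rate, prev_priority, sessions_unexecuted,
             alpha=0.6, beta=0.3, gamma=0.1):
    """Hypothetical linear combination of the three factors:
    (1) historical fault-detection performance,
    (2) the priority from the previous test session,
    (3) how many sessions the test case has gone unexecuted.
    The weights alpha, beta, gamma are illustrative placeholders."""
    return alpha * fault_rate + beta * prev_priority + gamma * sessions_unexecuted

def select_for_session(test_cases, budget):
    """Execute from the highest priorities downward, as far as resources allow."""
    ranked = sorted(test_cases, key=lambda t: t["priority"], reverse=True)
    return ranked[:budget]

tests = [
    {"name": "a", "priority": priority(0.9, 0.8, 0)},  # strong fault history
    {"name": "b", "priority": priority(0.1, 0.2, 5)},  # long unexecuted
    {"name": "c", "priority": priority(0.4, 0.5, 1)},
]
print([t["name"] for t in select_for_session(tests, 2)])  # → ['a', 'b']
```

Note that the third factor lets long-unexecuted tests climb back up the ranking, which is the mechanism that distinguishes a history-based ordering from a purely coverage-based one.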

Page 48


Conclusions

Incorporating historical fault detection effectiveness, a test case's previous priority, and the duration for which each test case has not been executed leads to effective prioritization of test cases across continuous regression test runs.

The proposed HBP approach consistently detects faults significantly faster than both random ordering and the Kim and Porter approach.

The proposed HBP approach's fault detection results are independent of the code coverage criterion used for the initial prioritization.

Under more severe resource constraints, the performance gap between the proposed HBP approach and the Kim and Porter HBP approach widens.

Page 49


Future Works

Considering fault severity and differing test case costs (e.g., execution time) in the proposed HBP approach.

More empirical studies with programs containing real faults, to study the performance of this approach in real environments more precisely.

Further studies using available Java benchmarks, to investigate the proposed technique for object-oriented programs.

Investigating the proposed HBP approach on successive faulty versions of real software during regression testing (probably yielding more interesting results).

Determining the weighting coefficients more precisely based on the obtained historical data.

Page 50


Main References of Study

[1] J. M. Kim and A. Porter, "A History-Based Test Prioritization Technique for Regression Testing in Resource Constrained Environment," in 24th International Conference on Software Engineering, 2002, pp. 119-129.

[2] H. Park, H. Ryu, and J. Baik, "Historical Value-Based Approach for Cost-Cognizant Test Case Prioritization to Improve the Effectiveness of Regression Testing," in 2nd International Conference on Secure System Integration and Reliability Improvement, Yokohama, Japan, 2008, pp. 39-46.

[3] J. M. Kim, A. Porter, and G. Rothermel, "An empirical study of regression test application frequency," in 22nd International Conference on Software Engineering, 2000, pp. 126-135. Also in Software Testing, Verification and Reliability, vol. 15, no. 4, pp. 257-279, 2005.

[4] S. Elbaum, A. G. Malishevsky, and G. Rothermel, "Test Case Prioritization: A Family of Empirical Studies," IEEE Transactions on Software Engineering, vol. 28, no. 2, pp. 159-182, 2002.

[5] G. Rothermel, R. H. Untch, C. Chu, and M. J. Harrold, "Prioritizing Test Cases for Regression Testing," IEEE Transactions on Software Engineering, pp. 102-112, 2001.

[6] G. Rothermel, R. H. Untch, C. Chu, and M. J. Harrold, "Test case prioritization: an empirical study," in IEEE International Conference on Software Maintenance, Oxford, England, 1999.

[7] I. Burnstein, Practical Software Testing: A Process-Oriented Approach. New York: Springer-Verlag, 2003.

[8] S. Elbaum, A. Malishevsky, and G. Rothermel, "Incorporating varying test costs and fault severities into test case prioritization," in 23rd International Conference on Software Engineering, Toronto, Canada, 2001, pp. 329-338.

[9] G. Rothermel and M. Harrold, "A Safe, Efficient Regression Test Selection Technique," ACM Transactions on Software Engineering and Methodology, vol. 6, no. 2, pp. 173-210, 1997.

[10] W. E. Wong, J. R. Horgan, A. P. Mathur, and A. Pasquini, "Test set size minimization and fault detection effectiveness: A case study in a space application," Journal of Systems and Software, vol. 48, pp. 79-89, 1999.

Page 51


Thank You!