
Automated Generation of Context-Aware Tests

Jan 18, 2016

Transcript
Page 1: Automated Generation of Context-Aware Tests

Automated Generation of Context-Aware Tests

by

Zhimin Wang (University of Nebraska–Lincoln)

Sebastian Elbaum (University of Nebraska–Lincoln)

David S. Rosenblum (University College London)

Research funded in part by the EPSRC and the Royal Society

Page 2: Automated Generation of Context-Aware Tests

Background: Ubiquitous Computing Systems

• Context-Aware
– Execution driven by changes in the execution environment
– Sensed through middleware invocation of context handlers
– Context is an additional input space that must be explored adequately during testing

• Adaptive
– Execution must adapt to context changes
– Changes to multiple context variables may occur simultaneously

An important emerging class of software systems

(Figure: example context variables — nearby device IDs, location, radio signal strength, available memory, battery level.)
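The sense-and-dispatch pattern described above can be sketched in a few lines. The class and method names below are hypothetical illustrations, not the actual middleware API:

```python
# Minimal sketch of middleware-driven context handling.
# Class/method names are hypothetical, not a real middleware API.

class Middleware:
    def __init__(self):
        self.handlers = {}  # context type -> list of handler callbacks

    def subscribe(self, context_type, handler):
        self.handlers.setdefault(context_type, []).append(handler)

    def sense(self, context_type, value):
        # A sensed context change drives execution by invoking handlers.
        for handler in self.handlers.get(context_type, []):
            handler(value)

responses = []
mw = Middleware()
mw.subscribe("location", lambda loc: responses.append(f"moved to {loc}"))
mw.subscribe("power", lambda level: responses.append(f"battery at {level}%"))

# Multiple context variables may change close together in time.
mw.sense("location", "Room 1")
mw.sense("power", 15)
print(responses)  # ['moved to Room 1', 'battery at 15%']
```

The point of the sketch: the application never pulls context; handlers run whenever the middleware pushes a change, which is exactly what makes the context input space hard to explore during testing.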

Page 3: Automated Generation of Context-Aware Tests

Problem Statement: Testing Ubiquitous Systems

Discovering concurrency faults in context-aware ubiquitous systems

• Failures occur frequently during attempts to handle multiple context changes

• Existing testing techniques have limited effectiveness in discovering the underlying faults

(Figure: device screen showing simultaneous "New SMS" and "Found Wi-Fi" notifications.)

Page 4: Automated Generation of Context-Aware Tests

Additional Challenges during Testing

• Hard to control when and how to input contexts
– Middleware can introduce noise
– Interference can occur between context handlers

• Hard to define a precise oracle
– Execution differs under various vectors of context inputs

• Real environment is not available
– Too many sensors are required

• Sensed contexts can be inconsistent
– Example: the border between rooms at an exhibition

Page 5: Automated Generation of Context-Aware Tests

Contributions

1. CAPPs: Context-Aware Program Points
• A model of how context affects program execution

2. CAPPs-Based Test Adequacy Criteria
• Criteria for evaluating test suite effectiveness
• Defined in terms of sets of test drivers, each a sequence of CAPPs to cover

3. CAPPs-Driven Test Suite Enhancement
• Automated exploration of variant interleavings of invocations of context handlers
• Schedules interleavings via special instrumentation
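The enhancement idea in contribution 3 can be sketched at a high level: instrumentation inserted at each CAPP reports to a scheduler, which tracks whether a given test driver (a sequence of CAPPs) was realised by the run. All names below are hypothetical, and the thread-blocking machinery that actually forces a particular interleaving is elided:

```python
# High-level sketch of CAPPs-driven scheduling (hypothetical names).
# The real technique blocks and releases handler threads to force a
# target interleaving; here we only show the bookkeeping.

class CappScheduler:
    def __init__(self, driver):
        self.driver = driver      # sequence of CAPPs the driver must cover
        self.observed = []        # CAPPs reached during execution

    def reached(self, capp):
        # Called by instrumentation inserted at each CAPP.
        self.observed.append(capp)

    def driver_realised(self):
        # The driver is realised if it occurs as a subsequence of the run.
        it = iter(self.observed)
        return all(c in it for c in self.driver)

sched = CappScheduler(driver=["capp1", "capp5", "capp2"])
# Imagine the scheduler preempting one handler after capp1 to run another
# handler's capp5; here we simply replay that interleaving.
sched.reached("capp1")
sched.reached("capp5")
sched.reached("capp2")
print(sched.driver_realised())  # True
```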

Page 6: Automated Generation of Context-Aware Tests

Application for Case Study: TourApp, Released with the Context Toolkit

(Architecture diagram: a visitor carrying a tag moves past sensors in the Registration Booth, Demo Rooms 1 and 2, and the Exit Room; RegistrationWidget, DemoWidget 1, DemoWidget 2, EndWidget, and an InterpreterWidget, with attached services, feed the communication middleware, which delivers sensed context over a remote data connection to the TourApp application on the visitor's PDA.)

Page 7: Automated Generation of Context-Aware Tests

Application for Case Study: TourApp, Released with the Context Toolkit

(Same architecture diagram as the previous slide.)

• Location: Registration Room
• Application Response: Pop-Up Registration Form

Page 8: Automated Generation of Context-Aware Tests

Application for Case Study: TourApp, Released with the Context Toolkit

(Same architecture diagram as the previous slide.)

• Location: Demo Room 1
• Application Response: Display Lecture Information

Page 9: Automated Generation of Context-Aware Tests

Application for Case Study: TourApp, Released with the Context Toolkit

(Same architecture diagram as the previous slide.)

• Power: Low Battery!
• Application Response: Confine Display Updates

Page 10: Automated Generation of Context-Aware Tests

Overview of Testing Infrastructure

(Dataflow diagram, components: CAPPs Identifier, Program Instrumentor, Context Manipulator, and Context Driver Generator. Inputs: context-aware program P, test suite T, and selected context adequacy criteria. Intermediate artifact: annotated flow graph. Outputs: test drivers D, achieved coverage, and feedback on coverage for test case extension.)

Page 11: Automated Generation of Context-Aware Tests

Test Adequacy Criteria: Context Adequacy (CA)

• Test driver covering at least one CAPP in each type of context handler

• Examples:
– { capp1, capp2 }
– or { capp3, capp2 }
– or { capp2, capp1 }

(Figure: control-flow graphs of two context handlers, type="demo" and type="power", with CAPPs capp1–capp6 marked on their nodes.)
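CA can be checked mechanically. A minimal sketch follows; note that the assignment of CAPPs to handler types below is inferred from the slide's examples, not taken from the original figure:

```python
# Sketch: does a test driver satisfy Context Adequacy (CA)?
# A driver is CA-adequate if it covers at least one CAPP in each type
# of context handler. The CAPP-to-handler assignment is illustrative.

HANDLER_CAPPS = {
    "demo":  {"capp1", "capp3", "capp4"},
    "power": {"capp2", "capp5", "capp6"},
}

def is_context_adequate(driver):
    covered = set(driver)
    # Every handler type must contribute at least one covered CAPP.
    return all(covered & capps for capps in HANDLER_CAPPS.values())

print(is_context_adequate(["capp1", "capp2"]))  # True: one CAPP per handler type
print(is_context_adequate(["capp1", "capp3"]))  # False: both CAPPs in one handler
```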

Page 12: Automated Generation of Context-Aware Tests

Test Adequacy Criteria: Switch-to-Context Adequacy (StoC-k)

• Set of test drivers covering all possible combinations of k switches between context handlers

• StoC-1 Example:
– { capp1, capp2 }, { capp5, capp3 }, { capp3, capp3 }, { capp5, capp5 }

(Figure: same handler control-flow graphs as the previous slide.)
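The StoC-1 requirements can be enumerated mechanically. A minimal sketch, treating a requirement as an ordered pair of handler types; switches into another invocation of the same handler type are included, matching the { capp3, capp3 } and { capp5, capp5 } examples above (handler names are illustrative):

```python
# Sketch: enumerate StoC-1 requirements as all single switches between
# context handler types, including a switch into another invocation of
# the same handler type. Handler names are illustrative.
from itertools import product

HANDLERS = ["demo", "power"]

# One switch = an ordered pair (handler switched from, handler switched to).
stoc1_requirements = list(product(HANDLERS, repeat=2))
print(stoc1_requirements)
# [('demo', 'demo'), ('demo', 'power'), ('power', 'demo'), ('power', 'power')]
```

With two handler types this yields four requirements, matching the four example drivers on the slide; StoC-k generalises to `product(HANDLERS, repeat=k+1)`.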

Page 13: Automated Generation of Context-Aware Tests

Test Adequacy Criteria: Switch-with-Preempted-Capp Adequacy (StoC-k-FromCapp)

(Figure: same handler control-flow graphs as the previous slides.)

• Set of test drivers covering all possible combinations of k switches between context handlers, with each switch exercised at every CAPP

• StoC-1-FromCapp Example:
– { capp1, capp2 }, { capp3, capp2 }, { capp4, capp5 }, { capp2, capp3 }, { capp5, capp1 }, { capp6, capp3 }, { capp3, capp3 }, { capp5, capp5 }
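StoC-1-FromCapp requirements can be enumerated the same way, now keyed by the preempted CAPP rather than only by the handler type. The CAPP-to-handler assignment below is illustrative:

```python
# Sketch: enumerate StoC-1-FromCapp requirements. Each switch must be
# exercised at every CAPP, so a requirement pairs a preempted CAPP with
# the handler type switched to. CAPP assignment is illustrative.
from itertools import product

HANDLER_CAPPS = {
    "demo":  ["capp1", "capp3", "capp4"],
    "power": ["capp2", "capp5", "capp6"],
}
all_capps = [c for capps in HANDLER_CAPPS.values() for c in capps]

# (preempted CAPP, handler switched to) pairs.
requirements = list(product(all_capps, list(HANDLER_CAPPS)))
print(len(requirements))  # 6 CAPPs x 2 handler types = 12 requirements
```

This shows why the criterion is more demanding than StoC-1: the requirement count grows with the number of CAPPs, not just with the number of handler types.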

Page 14: Automated Generation of Context-Aware Tests

Case Study Design and Settings: TourApp

• 11 KLOC of Java, 4 seeded faults
• Test suite of 36 end-to-end test cases
• Executing the test suite takes 10 minutes
• Studied 4 versions:
– originalTourApp: unmodified original
– manipulatedTourApp: instrumented with calls to our scheduler methods
– delayShortTourApp: instrumented with 1–3 second random delays (sleep())
– delayLongTourApp: instrumented with 1–10 second random delays (sleep())
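The delay-based variants can be sketched as follows. This is illustrative Python (the originals instrument Java code), and the delays are shortened to milliseconds so the sketch runs quickly:

```python
# Sketch of the sleep()-based perturbation used by delayShortTourApp and
# delayLongTourApp: a random delay inserted at instrumentation points to
# perturb handler interleavings. Bounds here are milliseconds, not the
# 1-3 s / 1-10 s used in the study, purely to keep the sketch fast.
import random
import time

def random_delay(low_s, high_s):
    # Inserted at an instrumentation point inside a context handler.
    time.sleep(random.uniform(low_s, high_s))

start = time.monotonic()
random_delay(0.001, 0.003)
elapsed = time.monotonic() - start
print(f"slept {elapsed:.4f}s")
```

The contrast with manipulatedTourApp is the point of the study design: random sleeps only perturb interleavings blindly, while the scheduler instrumentation steers execution toward specific CAPP sequences.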

Page 15: Automated Generation of Context-Aware Tests

Results: Cost (Timings with manipulatedTourApp)

• Summary of study
– Execution time increases with more demanding context coverage criteria, as more context scenarios are required

Page 16: Automated Generation of Context-Aware Tests

Results: Feasibility (Drivers in manipulatedTourApp)

• Summary of study
– Some test drivers were not realised in the application within the set timeouts
– Improvements in generation of D may be needed via better flow-sensitive analysis

Page 17: Automated Generation of Context-Aware Tests

Results: Effectiveness (Percentage of Contextual Coverage and Fault Detection)

• Summary of study
– Coverage decreases with more powerful criteria
• But only slightly for manipulatedTourApp
– manipulatedTourApp performs the best, especially with more powerful criteria

Page 18: Automated Generation of Context-Aware Tests

Related Work

• Deterministic testing of concurrent programs (Carver & Tai, Taylor et al.)
– Concurrency intrinsic to program, not execution environment

• Metamorphic testing for context-aware applications (Tse et al.)
– Oracles embodying metamorphic properties must be defined by tester

• Data flow coverage criteria for context-aware applications (Lu et al.)
– Does not support notion of CAPPs or manipulation of test executions

• Random sleeps for test perturbation (Edelstein et al.)
– Inferior to controlled scheduling of context switches

Page 19: Automated Generation of Context-Aware Tests

Conclusion

• Defined the CAPPs model of how context changes affect context-aware applications

• Defined test adequacy criteria in terms of this model

• Created an automated technique to guide test executions in a way that systematically explores many interesting context change scenarios

• Demonstrated the superiority of this technique in discovering concurrency faults

Page 20: Automated Generation of Context-Aware Tests

Future Work

• Investigate Additional Classes of Faults
– User interface ‘faults’
– Adaptation priority faults
– Memory leaks

• ContextNotifier and TestingEmulator
– Emulation infrastructure for testing
– (Tools and libraries from vendors are pathetic!)

• Simulation-Driven Testing
– Test case execution driven by mobility traces from simulation runs

Page 21: Automated Generation of Context-Aware Tests

Questions?

http://www.ubival.org/