How Do Testers Do It? Experience-Based and Exploratory Testing
Ohjelmistojen testaus (Software Testing), 15.11.2010
Juha Itkonen, SoberIT
Transcript
  • How Do Testers Do It? Experience-Based and Exploratory Testing
    Ohjelmistojen testaus (Software Testing), 15.11.2010
    Juha Itkonen, SoberIT

  • Agenda
    Intelligent Manual Testing
      Experience-based testing
      Exploratory testing
    Ways of Exploring
      Session-Based Test Management
      Touring testing
    Intelligent Manual Testing Practices
      Examples of empirically identified testing practices
    Benefits of Experience-Based Testing

  • Manual Testing
    Testing that is performed by human testers.
    Research has shown:
      1. Individual differences in testing are high
      2. Test case design techniques alone do not explain the results
    Stereotype of manual testing:
      Executing detailed pre-designed test cases
      Mechanically following the step-by-step instructions
      Treated as work that anybody can do
    In practice, it's clear that some testers are better than others in manual testing and more effective at revealing defects...

  • Experience is invaluable in software testing
    Domain experience
      Knowledge and skills gained in the application domain area
      How the system is used in practice, and by whom
      What are the goals of the users
      How the system is related to the customer's business processes
    Technical system experience
      How the system was built
      What are the typical problems and defects
      How the system is used and how all the details work
      How things work together and interact
    Testing experience
      Knowledge of testing methods and techniques
      Testing skills grown in practice

  • Software testing
    is creative and exploratory work
    requires skills and knowledge:
      application domain
      users, processes, and objectives
      some level of technical details and history of the application under test
    requires a certain kind of attitude

  • Tester's Attitude
    People tend to see what they want or expect to see
      If you want to show that the software works correctly, you will miss defects
    The tester's goal is to break the software
      Reveal all relevant defects
      Find out any problems real users would experience in practice
    Testing is all about exceptions, special cases, invalid inputs, error situations, and complicated unexpected combinations

  • Tester's Goal
    Explore, investigate, and measure
    Provide quality-related information for other stakeholders in a useful form
    The tester's attitude is destructive towards the software under test, but highly constructive towards people

  • My viewpoint: Experience-Based Intelligent Manual Testing
    Manual testing that builds on the tester's experience, knowledge, and skills
      Some aspects of testing rely on the tester's skills during testing, e.g., input values, expected results, or interactions
      Testers are assumed to know what they are doing
      Testing does not mean executing detailed scripts
    Focus on the actual testing work in practice
      What happens during testing activities?
      How are defects actually found?
      Experience-based and exploratory aspects of software testing

  • Exploratory Testing is creative testing without predefined test cases, based on the knowledge and skills of the tester
    1. Tests are not defined in advance
       Exploring with a general mission, without specific step-by-step instructions on how to accomplish the mission
    2. Testing is guided by the results of previously performed tests and the knowledge gained from them
       Testers can apply deductive reasoning to the test outcomes
    3. The focus is on finding defects by exploration
       Instead of demonstrating systematic coverage
    4. Parallel learning of the system under test, test design, and test execution
    5. Experience and skills of an individual tester strongly affect effectiveness and results

  • Exploratory Testing vs. Scripted Testing
    ET is an approach
      Most testing techniques can be used in an exploratory way
      Exploratory testing and (automated) scripted testing are the ends of a continuum:
      freestyle exploratory bug hunting → chartered exploratory testing → high-level test cases → manual scripts → pure scripted (automated) testing

  • Scripted vs. Exploratory Testing
    In scripted testing, tests are first designed and recorded. Then they may be executed at some later time, or by a different tester.
    In exploratory testing, tests are designed and executed at the same time, and they often are not recorded. The tester works from a mental model of the product: a model of what the product is, how it behaves, and how it's supposed to behave.
    (James Bach, Rapid Software Testing, 2002)

  • Lateral Thinking
    You are allowed to be distracted
    Find side paths and explore interesting areas
    Periodically check your status against your mission

  • Scripted vs. Exploratory Tests: the minefield analogy
    [Diagram: repeated scripted tests walk the same cleared path through a minefield of bugs, while varied exploratory paths can still hit the remaining ones; fixes clear mines only on paths already taken.]

  • Two views of agile testing
    eXtreme Testing:
      Automated unit testing
      Developers write tests
      Test-first development
      Daily builds with unit tests always 100% pass
      Functional (acceptance) testing, customer-owned
      Comprehensive, repeatable, automatic, timely, public
      Focus on automated verification, enabling agile software development
    Exploratory Testing:
      Utilizes professional testers' skills and experience
      Optimized to find bugs
      Minimizing time spent on documentation
      Continually adjusting plans, re-focusing on the most promising risk areas
      Following hunches
      Freedom, flexibility, and fun for testers
      Focus on manual validation, making testing activities agile

  • Agenda
    Intelligent Manual Testing
      Experience-based testing
      Exploratory testing
    Ways of Exploring
      Session-Based Test Management
      Touring testing
    Intelligent Manual Testing Practices
      Examples of empirically identified testing practices
    Benefits of Experience-Based Testing

  • Some ways of exploring in practice
    Freestyle exploratory testing (unmanaged ET)
    Session-based exploratory testing
    Functional testing of individual features
    Exploring like a tourist
    Exploring high-level test cases
    Exploratory regression testing, verifying fixes or changes
    Outsourced exploratory testing: advanced users with strong domain knowledge; beta testing

  • Session-Based Test Management (SBTM)
    Bach, J. "Session-Based Test Management", STQE, vol. 2, no. 6, 2000. http://www.satisfice.com/articles/sbtm.pdf
    Elements: Charter, Time Box, Reviewable Result, Debriefing

  • Session-Based Testing: a way to manage ET
    Enables planning and tracking exploratory testing without detailed test (case) designs
      Dividing testing work into small chunks
      Tracking testing work in time-boxed sessions
    Efficient: no unnecessary documentation
    Agile: it's easy to focus testing on the most important areas based on the test results and other information
      Changes in requirements, increasing understanding, revealed problems, identified risks, ...
    Explicit, scheduled sessions can help getting testing done when resources are scarce
      When testers are not full-time testers...
    (a minimal code sketch of one such session follows below)
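    As an illustration only (not part of Bach's SBTM), the Python sketch below models one time-boxed session. The charter, time box, reviewable notes, and debriefing issues follow the four SBTM elements named above; everything else, including the 90-minute default, is an assumption.

    ```python
    from dataclasses import dataclass, field
    from datetime import datetime, timedelta
    from typing import List, Optional

    @dataclass
    class TestSession:
        """One time-boxed exploratory testing session, SBTM-style (sketch)."""
        charter: str                # mission, e.g. "Explore CSV import with malformed files"
        tester: str
        time_box: timedelta = timedelta(minutes=90)      # illustrative default
        started: Optional[datetime] = None
        notes: List[str] = field(default_factory=list)   # the reviewable result
        bugs: List[str] = field(default_factory=list)
        issues: List[str] = field(default_factory=list)  # questions for the debriefing

        def start(self) -> None:
            self.started = datetime.now()

        def time_left(self) -> timedelta:
            """How much of the time box remains; drives when to stop and debrief."""
            assert self.started is not None, "session not started"
            return self.time_box - (datetime.now() - self.started)
    ```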

  • Exploring like a tourist: a way to guide ET sessions
    Touring tests use the tourist metaphor to guide the tester's actions
    Focus on intent rather than on separate features
    This intent is communicated as tours in different districts of the software
    James A. Whittaker. Exploratory Software Testing: Tips, Tricks, Tours, and Techniques to Guide Test Design. Addison-Wesley, 2009.

  • Districts and Tours
    Business district: Guidebook tour, Money tour, Landmark tour, Intellectual tour, FedEx tour, After-hours tour, Garbage collector's tour
    Historical district: Bad neighborhood tour, Museum tour, Prior version tour
    Entertainment district: Supporting actor tour, Back alley tour, All-nighter tour
    Tourist district: Collector's tour, Lonely businessman tour, Supermodel tour, TOGOF tour, Scottish pub tour
    Hotel district: Rained-out tour, Couch potato tour
    Seedy district: Saboteur tour, Antisocial tour, Obsessive-compulsive tour
    James A. Whittaker. Exploratory Software Testing: Tips, Tricks, Tours, and Techniques to Guide Test Design. Addison-Wesley, 2009.

  • Examples of exploratory testing tours
    The Guidebook Tour
      Use the user manual or other documentation as a guide
      Test rigorously by the guide: screen by screen, dialog by dialog, feature by feature
      Tests the details of important features, and tests the guide itself
      Variations: Blogger's tour (use third-party advice as the guide), Pundit's tour (use product reviews as the guide), Competitor's tour
    The Garbage Collector's Tour
      Choose a goal and then visit each item by the shortest path
      Tests every corner of the software, but not very deeply in the details
    The All-Nighter Tour
      Never close the app; use the features continuously
      Keep the software running, keep files open, connect and don't disconnect, don't save, move data around, add and remove, use sleep and hibernation modes, ...

  • Agenda
    Intelligent Manual Testing
      Experience-based testing
      Exploratory testing
    Ways of Exploring
      Session-Based Test Management
      Touring testing
    Intelligent Manual Testing Practices
      Examples of empirically identified testing practices
    Benefits of Experience-Based Testing

  • Empirically observed practices from industry
    Testing, not test case pre-design
    Practices work on different levels of abstraction
      Many practices are similar to traditional test case design techniques
      Many practices are similar to more general testing strategies, heuristics, or rules of thumb

  • [Diagram] Practices operate on two levels. Overall strategies: structuring testing work, guiding a tester through the features. Detailed techniques: low-level test design, defect hypotheses, checking the test results.

  • Overall strategies
    [Diagram: a map of empirically identified strategies, grouped as exploratory, top-down, and documentation-based]
    Exploring weak areas
    Aspect-oriented testing
    Data as test cases
    User interface exploring
    Exploring high-level test cases
    Top-down functional exploring
    Simulating a real usage scenario
    Checking new and changed features
    Smoke testing by intuition and experience

  • Detailed techniques
    [Diagram: a map of detailed techniques, grouped as input, exploratory, comparison, and defect-based]
    Testing alternative ways
    Testing input alternatives
    Testing boundaries and restrictions
    Covering input combinations
    Exploring against old functionality
    Simulating abnormal and extreme situations
    Testing to-and-from the feature
    Feature interaction testing
    Persistence testing
    Comparing with another application or version
    Comparing within the software
    Defect-based exploring
    Checking all the effects
    End-to-end data check

  • Basic Objectives in Testing Activities
    Exploring: guiding the tester through the functionality
    Coverage: selecting what gets tested and what does not
    Oracle: deciding if the results are correct
    Risks: detecting specific types of defects
    Prioritization: selecting what to test first

  • Exploring weak areas
    Description: Exploring areas of the software that are weak or risky, based on the experience and knowledge of the tester.
    Goal: Reveal defects in areas that are somehow known to be risky. Focus testing on risky areas.
    What makes an area weak: complicated, coded in a hurry, lots of changes, coders' opinion, testers' opinion, based on who implemented it, a hunch...
    (a sketch of one such risk heuristic follows below)
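    The slides leave "known to be risky" to the tester's judgment. As one illustrative heuristic only (not from the slides), file churn from version control can point at weak areas; the sketch assumes it is run inside a git checkout of the system under test.

    ```python
    import subprocess
    from collections import Counter

    # List every file touched by every commit (assumes a git checkout).
    log = subprocess.run(
        ["git", "log", "--name-only", "--pretty=format:"],
        capture_output=True, text=True, check=True,
    ).stdout

    # Frequent change is one rough proxy for "coded in a hurry" or "lots of changes".
    churn = Counter(line for line in log.splitlines() if line.strip())
    for path, changes in churn.most_common(10):
        print(f"{changes:4d}  {path}")   # candidate weak areas to explore first
    ```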

  • Top-down functional exploring
    Description: Proceeding in testing by first going through typical cases and simple checks, then gradually deeper into the details of the tested functionality, applying more complicated tests.
    Goal: To first get a high-level understanding of the function, and then deeper confidence in its quality, step by step.
      Is this function implemented? Does it do the right thing?
      Is there missing functionality?
      Does it handle the exceptions and special cases?
      Does it work together with the rest of the system?
      How about error handling and recovery?

  • Using data as test cases
    Description: A pre-defined test data set includes all relevant cases and combinations of different data and situations. Covering all cases in the pre-defined test data set provides the required coverage.
      Testing is exploratory, but the pre-defined data set is used to achieve systematic coverage.
      Suitable for situations where the data is complex but the operations are simple, or when creating the data requires much effort.
    Goal: To manage exploratory testing based on pre-defined test data. Achieve and measure coverage in exploratory testing.
    Example: Different types of customers in a CRM system (see the sketch below).
      User privileges
      Situation, services, relationships
      History, data
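    A minimal pytest sketch of the coverage side of this practice. The CRM profile set and create_customer are hypothetical stand-ins; the point is that the pre-defined data set, not scripted steps, defines what "covered" means.

    ```python
    import pytest

    # Hypothetical pre-defined data set: each profile is one case the
    # (exploratory) testing must cover; the operations stay free-form.
    CUSTOMER_PROFILES = [
        {"type": "consumer", "privileges": "basic",     "history_years": 0},
        {"type": "consumer", "privileges": "basic",     "history_years": 10},
        {"type": "business", "privileges": "admin",     "history_years": 3},
        {"type": "partner",  "privileges": "read_only", "history_years": 1},
    ]

    def create_customer(**attrs):
        """Stand-in for the CRM system under test (assumed API)."""
        return dict(attrs)

    @pytest.mark.parametrize("profile", CUSTOMER_PROFILES)
    def test_every_profile_is_exercised(profile):
        # Each parametrized case records that this data combination was
        # exercised, which is the coverage the practice wants to measure.
        customer = create_customer(**profile)
        assert customer["type"] in {"consumer", "business", "partner"}
    ```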

  • Comparing within the software
    Description: Comparing similar features in different places of the same system and testing their consistency.
    Goal: Investigating and revealing problems in the consistency of functionality inside the software; helps decide whether a feature works correctly or not. (a sketch follows below)
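    A sketch of how within-software consistency can act as an executable oracle. Both export functions are hypothetical stand-ins for one behaviour reachable from two places in the same system; neither result needs to be known correct, only mutually consistent.

    ```python
    # Hypothetical: the same "export" behaviour reachable from two places
    # in one application. Consistency between the paths is the oracle.
    def export_from_report(rows):
        return sorted(rows)

    def export_from_dashboard(rows):
        return sorted(rows)

    def check_consistency(rows):
        a = export_from_report(rows)
        b = export_from_dashboard(rows)
        assert a == b, f"Inconsistent behaviour between features: {a!r} vs {b!r}"

    check_consistency([3, 1, 2])   # passes; a mismatch would flag an inconsistency
    ```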

  • Testing to-and-from the feature
    Description:
      Test all things that affect the feature
      Test all things that are affected by the feature
    Goal: Systematically cover the feature's interactions. Reveal defects that are caused by a not-the-most-obvious relationship between the tested feature and other features or the environment. (a sketch follows below)
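    A sketch of turning the to/from relationships into session charters; the feature and relationship names are invented for illustration.

    ```python
    # Invented example feature and relationships; the pattern is the point.
    FEATURE = "invoice generation"
    AFFECTED_BY = ["customer data", "tax settings", "currency rates"]   # "to" the feature
    AFFECTS = ["ledger entries", "email notifications", "PDF archive"]  # "from" the feature

    charters = (
        [f"Vary {src}, then check that {FEATURE} still behaves" for src in AFFECTED_BY]
        + [f"Run {FEATURE}, then check its effect on {dst}" for dst in AFFECTS]
    )
    for charter in charters:
        print("-", charter)   # six charters systematically covering the interactions
    ```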

  • Ways of utilizing IMT Practices
    Training testers
    Guiding test execution
    Test documentation and tracking
    Test patterns for different situations

  • Training Testers
    Testing practices are good, experience-based knowledge for intelligent testers
    Named and documented practices give common terminology that can be used to discuss how the testing should be done
    By learning these practices, a novice tester can do a better job
      Compared to just going and testing around

  • Guiding Test Execution
    Practices together with high-level test documentation can be used as a test design
    The tester can choose applicable practices when doing exploratory testing
      More conscious decisions
      Better idea of what the tester is actually doing
      Easier to maintain focus: what am I going to achieve?

  • Test Documentation and Tracking
    Testing practices can be used in test specifications
      No need for detailed descriptions for the tester
      The tester knows what to do
      Other people know what has been done
    Test planning and design can focus on high-level structure and coverage issues
      Not on teaching an experienced tester how to test ;-)
    Example (see the sketch after this list):
      Use the Exploring high-level test cases practice to cover the functionality
      Apply the Testing input alternatives and Testing boundaries and restrictions practices for each function
      In addition, use the Comparing with another version practice to test that new GUI components correctly provide the existing features
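    To make the example concrete, here is a sketch of the Testing boundaries and restrictions practice in pytest. The 1..100 quantity restriction and accepts_quantity are assumed stand-ins, not from the slides.

    ```python
    import pytest

    def boundary_values(lo, hi):
        """Classic picks around an inclusive integer range [lo, hi]."""
        return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

    def accepts_quantity(q):
        """Stand-in for the function under test; assumed rule: quantity 1..100."""
        return 1 <= q <= 100

    @pytest.mark.parametrize("q", boundary_values(1, 100))
    def test_quantity_boundaries(q):
        # The expectation restates the assumed rule; with a real system the
        # stand-in would be replaced by the actual call.
        assert accepts_quantity(q) == (1 <= q <= 100)
    ```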

  • Test Patterns
    Testing practices could be further developed
    A testing pattern would provide a set of good testing practices
      For a certain testing problem and motivation
      With a certain testing goal
      Describing also the applicability (context) of the pattern
    (a sketch of such a pattern record follows below)
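    A sketch of what such a pattern record could look like. The field names mirror the slide's wording; the example instance is invented for illustration.

    ```python
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class TestingPattern:
        """Sketch of a testing-pattern record; fields mirror the slide's wording."""
        name: str
        problem: str          # the testing problem and motivation
        goal: str             # what applying the pattern should achieve
        practices: List[str]  # the set of good testing practices to apply
        context: str          # applicability of the pattern

    # Invented example instance:
    boundary_hunt = TestingPattern(
        name="Boundary hunt",
        problem="Numeric inputs with restrictions are error-prone",
        goal="Reveal off-by-one and input-validation defects",
        practices=["Testing boundaries and restrictions", "Testing input alternatives"],
        context="Any feature with documented value ranges",
    )
    ```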

  • Agenda
    Intelligent Manual Testing
      Experience-based testing
      Exploratory testing
    Ways of Exploring
      Session-Based Test Management
      Touring testing
    Intelligent Manual Testing Practices
      Examples of empirically identified testing practices
    Benefits of Experience-Based Testing

  • Strengths of experience-based testing: tester's skills
    Utilizing the skills and experience of the tester
      Testers know how the software is used and for what purpose
      Testers know what functionality and features are critical
      Testers know what problems are relevant
      Testers know how the software was built
      Risks, tacit knowledge
    Enables creative exploring
    Enables fast learning and improving testing
      Investigating, searching, finding, combining, reasoning, deducting, ...
    Testing intangible properties
      Look and feel and other user perceptions

  • Strengths of experience-based testing: process
    Agility and flexibility
      Easy and fast to focus on critical areas
      Fast reaction to changes
      Ability to work with missing or weak documentation
    Effectiveness
      Reveals a large number of relevant defects
    Efficiency
      Low documentation overhead
      Fast feedback

  • Challenges of experience-based testing
    Planning and tracking
      How much testing is needed, how long does it take?
      What is the status of testing?
      How to share testing work between testers?
    Managing test coverage
      What has been tested?
      When are we done?
    Logging and reporting
      Visibility outside the testing team, or outside individual testing sessions
    Quality of testing
      How to assure the quality of the tester's work?
      Detailed test cases can at least be reviewed

  • RESULTS OF A CONTROLLED STUDENT EXPERIMENT
    What is the benefit of designing the test cases beforehand?
    Juha Itkonen, Mika V. Mäntylä and Casper Lassenius, "Defect Detection Efficiency: Test Case Based vs. Exploratory Testing", ESEM, 2007.

  • Research problem
    Do testers performing manual functional testing with pre-designed test cases find more or different defects compared to testers working without pre-designed test cases?

  • Research questions
    How does using pre-designed test cases affect
      1. the number of detected defects?
      2. the type of detected defects?
      3. the number of false defect reports?
    False reports are: incomprehensible, duplicate, or reporting a non-existent defect

  • No difference in the total number of detected defects
    Found defects per subject (ET = exploratory testing, TCT = test-case-based testing):

      Approach  Feature set  Defects  Mean   Std. dev.
      ET        FS A         44       6.28   2.172
      ET        FS B         41       7.82   2.522
      ET        Total        85       7.04   2.462
      TCT       FS A         43       5.36   2.288
      TCT       FS B         39       7.35   2.225
      TCT       Total        82       6.37   2.456

    The ET approach found on average 0.7 defects more per subject (7.04 vs. 6.37)
    T-test significance value 0.088: not statistically significant

  • Differences in defect types
    Compared to TCT, ET detected:
      More defects that were obvious to detect
      More defects that were difficult to detect
      More user interface and usability issues
      More low-severity defects
    These are descriptive rather than conclusive findings
      Not statistically significant

  • TCT produces more false defect reports
    False defect reports per subject:

      Approach  Feature set  Mean   Std. dev.
      ET        FS A         1.00   1.396
      ET        FS B         1.05   1.191
      ET        Total        1.03   1.291
      TCT       FS A         1.64   1.564
      TCT       FS B         2.50   1.867
      TCT       Total        2.08   1.767

    False reports are: incomprehensible, duplicate, or reporting a non-existent defect
    TCT produced on average 1.05 more false reports than ET per subject and testing session
    The difference is statistically significant (Mann-Whitney U test, significance value 0.000)

  • Conclusions
    1. The data showed no benefit from using pre-designed test cases in comparison to a freestyle exploratory testing approach
       Defect type distributions indicate that certain defect types might be better detected by ET, but there were no significant differences
    2. Test-case-based testing produced more false defect reports
       Perhaps test cases made testers work more mechanically, leading, e.g., to a higher number of duplicate reports

  • Questions and more discussion
    Contact information:
    Juha Itkonen
    [email protected]
    +358 50 577 1688
    http://www.soberit.hut.fi/jitkonen

  • References (primary)
    Bach, J., 2000. Session-Based Test Management. Software Testing and Quality Engineering, 2(6). http://www.satisfice.com/articles/sbtm.pdf
    Bach, J., 2004. Exploratory Testing. In E. van Veenendaal, ed. The Testing Practitioner. Den Bosch: UTN Publishers, pp. 253-265. http://www.satisfice.com/articles/et-article.pdf
    Itkonen, J. & Rautiainen, K., 2005. Exploratory testing: a multiple case study. In Proceedings of the International Symposium on Empirical Software Engineering, pp. 84-93.
    Itkonen, J., Mäntylä, M.V. & Lassenius, C., 2007. Defect Detection Efficiency: Test Case Based vs. Exploratory Testing. In Proceedings of the International Symposium on Empirical Software Engineering and Measurement, pp. 61-70.
    Itkonen, J., Mäntylä, M. & Lassenius, C., 2009. How do testers do it? An exploratory study on manual testing practices. In Proceedings of the 3rd International Symposium on Empirical Software Engineering and Measurement (ESEM 2009), pp. 494-497.
    Lyndsay, J. & van Eeden, N., 2003. Adventures in Session-Based Testing. http://www.workroom-productions.com/papers/AiSBTv1.2.pdf
    Martin, D. et al., 2007. 'Good' Organisational Reasons for 'Bad' Software Testing: An Ethnographic Study of Testing in a Small Software Company. In Proceedings of the International Conference on Software Engineering, pp. 602-611.
    Whittaker, J.A., 2009. Exploratory Software Testing: Tips, Tricks, Tours, and Techniques to Guide Test Design. Addison-Wesley Professional.

  • References (secondary)
    Agruss, C. & Johnson, B., 2005. Ad Hoc Software Testing.
    Naseer, A. & Zulfiqar, M., 2010. Investigating Exploratory Testing in Industrial Practice. Master's Thesis. Ronneby, Sweden: Blekinge Institute of Technology. http://www.bth.se/fou/cuppsats.nsf/all/8147b5e26911adb2c125778f003d6320/$file/MSE-2010-15.pdf
    Armour, P.G., 2005. The unconscious art of software testing. Communications of the ACM, 48(1), pp. 15-18.
    Beer, A. & Ramler, R., 2008. The Role of Experience in Software Testing Practice. In Proceedings of the Euromicro Conference on Software Engineering and Advanced Applications, pp. 258-265.
    Houdek, F., Schwinn, T. & Ernst, D., 2002. Defect Detection for Executable Specifications: An Experiment. International Journal of Software Engineering & Knowledge Engineering, 12(6), p. 637.
    Kaner, C., Bach, J. & Pettichord, B., 2002. Lessons Learned in Software Testing. New York: John Wiley & Sons.
    Martin, D. et al., 2007. 'Good' Organisational Reasons for 'Bad' Software Testing: An Ethnographic Study of Testing in a Small Software Company. In Proceedings of the International Conference on Software Engineering, pp. 602-611.
    Tinkham, A. & Kaner, C., 2003. Learning Styles and Exploratory Testing. In Pacific Northwest Software Quality Conference (PNSQC).
    Wood, B. & James, D., 2003. Applying Session-Based Testing to Medical Software. Medical Device & Diagnostic Industry, p. 90.
    Våga, J. & Amland, S., 2002. Managing High-Speed Web Testing. In D. Meyerhoff et al., eds. Software Quality and Software Testing in Internet Times. Berlin: Springer-Verlag, pp. 23-30.