Transcript

    QA Training


    Overview

    Introduction

    Test Methodologies & Test Plan

    Communication & Documentation

    Test Cases & Test Reports

    Bugs & Bug Reporting

    Agenda


    Overview


    Introduction to Software & Software Engineering

    SDLC Models

    Waterfall

    RAD

    Incremental

    PDCA

    Contents


    Software

    Computer programs that provide desired features when

    executed

    Data structures that store and manipulate the information

    Documents that describe the usage and operation of the

    software

    Software Engineering

    The discipline concerned with creating and maintaining

    software applications

    Software is designed and developed by software engineers

    Software is engineered, not manufactured

    Software does not wear out or get tired through overuse

    Introduction to Software & Software Engineering


    SDLC Models Waterfall Model


    SDLC Models RAD Model


    SDLC Models Incremental Model


    PDCA

    Is a logical sequence of systematic and documented activities

    aimed at improving a process.

    Improvements can be effected in two ways:

    By improving the process itself and/or

    By improving the outcomes of the process.


    Introduction


    Software Quality Via Quality Attributes

    QA, QC and Testing

    Role of QA in SDLC

    Need for Testing

    Testing Life Cycle

    Contents


    Software Quality Via Quality Attributes

    Software Quality

    Is defined as the bundle of attributes present in an application and, where
    appropriate, the level of each attribute for which the end-user holds a
    positive value

    Is a high degree or grade of excellence

    Conformance to requirements

    Quality Attributes

    Safety
    Security
    Reliability
    Robustness
    Understandability
    Testability
    Adaptability
    Modularity
    Complexity
    Portability
    Usability
    Reusability
    Efficiency
    Learnability


    Quality Assurance

    Purpose: To manage and improve the quality of both the process and the product

    Quality Assurance is a set of processes to be followed to develop, test and
    maintain software. It includes verification, review and monitoring activities
    for the different stages of software development.

    Quality Assurance is a proactive activity; it aims at prevention of defects

    It is a function that manages quality, but does not refer to the activity of
    inspecting the software

    Quality Control

    Purpose: To produce defect-free products

    Focuses on finding defects in specific deliverables

    The operational techniques and activities used to fulfill requirements for
    quality

    Quality Control is a reactive activity; it aims at detection of defects

    Testing is one example of Quality Control; there are others, such as inspections

    QA, QC and Testing


    Benefits of Quality Assurance

    Reduced Time to Market

    Better Customer Experience at Reduced cost

    Lowered Risk

    Reduced Rework

    Increased Job Satisfaction

    Serving as a Differentiator

    Efficient Framework for Outsourcing


    Testing

    Process used to identify the correctness, completeness and quality of the

    product

    The technical investigation of the product under controlled conditions

    with the intention to find bugs

    Testing begins at component level and works towards integrating the

    entire system

    Can't be performed by developers, because one can never find one's own
    flaws

    Exhaustive testing is often impossible. Choose the best tests and define
    when to stop testing

    Testing is Dynamic Analysis of the product


    Role of QA in SDLC

    Requirements & Specification

    Assures that software requirements are complete, testable, and properly expressed as
    functional, performance, and interface requirements

    Assures that all software requirements are allocated to software components

    Design

    Assures that approved design standards are followed

    Assures that the approved design is placed under configuration management

    Assures that results of design inspections are included in the design

    Coding

    Results of coding and design activities, including the schedule contained in the
    Software Development Plan

    Status of all deliverable items

    Configuration management activities and the software development library

    Testing

    Assures readiness for testing of all deliverable items

    Assures that all tests are run according to test plans and procedures and that
    any non-conformances are reported and resolved

    Assures that test reports are complete and correct

    Certifies that testing is complete and that software and documentation are ready
    for delivery

    Delivery

    Ensures that all deliverable items are ready for delivery


    Ensure that software is bug-free

    Ensure that system meets customer requirements and software

    specifications

    Ensure that system meets end user expectations

    We don't want customers to find bugs

    Fixing the bugs identified after release is expensive

    Need for Testing


    Testing Life Cycle


    Test

    Methodologies

    &

    Test Plan


    Testing Methodologies

    Software Testing Classification

    White Box Testing

    Black Box Testing

    Different Testing Methodologies

    The V Model

    Test Plan

    Need for Test Plan

    Problems

    Contents of a Test Plan

    Test Plan Template

    To Conclude

    Contents


    Testing Methodologies

    There are numerous methodologies available for testing software.

    The methodology we choose depends on factors such as the nature
    of the project, the project schedule, and resource availability

    Different applications require different ways of testing

    The same testing methodology cannot be applied at each and
    every phase of testing


    Software Testing Classification

    Nature of Interaction with the product

    Static Analysis

    Dynamic Analysis

    Knowledge of implementation intricacies of the product

    White Box

    Black Box

    Mode of running actual tests on the product

    Manual

    Automation

    Although most of the intellectual processes of testing are nearly identical to
    those of review or inspection, the word "Testing" connotes the
    dynamic analysis of the product


    Software Testing Classification


    White Box Testing

    Definition and the necessity

    It is the testing that takes into account the internal mechanism of system or

    component.

    Process in which an application is tested through the code.

    Verification technique that is used to examine if the code works as expected

    This is done only for the applications for which the code has been provided for testing.

    It checks the functionality of the module, including the module internals,
    i.e. the code (loops and conditional statements), and whether the loops are
    executed in a proper way.

    Also known as structural testing, clear box testing and glass box testing


    White Box Testing

    Methodologies

    Four main types of white-box testing

    Statement Testing: derives test cases to execute specific
    statements, normally to increase statement coverage. Statement coverage
    is the percentage of executable statements that have been exercised.

    Loop Testing:

    Cause execution of the loop to be skipped completely. (Exception: Repeat

    loops)

    Loop to be executed exactly once

    Loop to be executed more than once

    Path Testing: Ensures that all independent paths through code module have

    been tested

    Branch Testing (Conditional Testing): Test method which aims to ensure that

    each possible branch from each decision point (e.g. "if" statement) is executed

    at least once, thus ensuring that all reachable code is executed
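    The following minimal Python sketch (not part of the original slides) illustrates
    these techniques on a hypothetical function: the inputs are chosen from knowledge
    of the loop and the decision point inside the code, which is exactly what statement,
    loop and branch testing call for.

    # Hypothetical white-box example: a tiny function with one loop and one branch,
    # plus test inputs chosen from its internal structure.
    def sum_positive(numbers):
        """Return the sum of the positive values in `numbers`."""
        total = 0
        for n in numbers:          # loop under test
            if n > 0:              # decision point under test
                total += n
        return total

    # Loop testing: loop skipped, executed exactly once, executed more than once.
    assert sum_positive([]) == 0
    assert sum_positive([5]) == 5
    assert sum_positive([1, 2, 3]) == 6

    # Branch testing: both outcomes of the `if` decision.
    assert sum_positive([-4]) == 0   # condition false
    assert sum_positive([4]) == 4    # condition true

    # Together these inputs also give 100% statement coverage of the function.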


    Advantages:

    Helps in optimizing code

    As knowledge of the internal coding structure is a prerequisite, it
    becomes very easy to find out which type of input/data can help in
    testing the application effectively.

    It helps in removing the extra lines of code, which can bring in

    hidden defects

    Reduction in the amount of debugging.

    Improvement in the quality of the released software.

    Disadvantages:

    As knowledge of code and internal structure is a prerequisite, a

    skilled tester is needed to carry out this type of testing, which

    increases the cost.

    It is nearly impossible to look into every bit of code to find out

    hidden errors, which may create problems, resulting in failure of the

    application.

    White Box Testing


    Black Box Testing

    Definition and the necessity

    Testing software based on output requirements and without any

    knowledge of the internal structure or coding in the program

    Process in which an application is not tested through the code.

    This is the most widely used testing approach, and even more so in the case
    of web sites.
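    As an illustration (not from the original slides), the sketch below shows black-box
    tests written purely from an assumed specification: "members get 10% off;
    non-members pay full price". The function body is included only so the example
    runs on its own; the tester never looks at it.

    # Hypothetical black-box example: tests derived from the specification alone,
    # checking observable inputs and outputs only.
    def apply_discount(price, is_member):
        # Internals deliberately ignored by the tester.
        return round(price * 0.9, 2) if is_member else price

    assert apply_discount(100.0, is_member=True) == 90.0    # member: 10% off
    assert apply_discount(100.0, is_member=False) == 100.0  # non-member: full price
    assert apply_discount(0.0, is_member=True) == 0.0       # boundary value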


    Advantages:

    The test is unbiased because the designer and the tester are independent of

    each other.

    The tester does not need to acquire knowledge of any specific programming

    languages.

    The test is done from the point of view of the user, not the designer.

    Test cases can be designed as soon as the specifications are complete.

    Disadvantages:

    The test can be redundant if the software designer has already run a test

    case.

    The test cases are difficult to design.

    Testing every possible input stream is unrealistic because it would take an
    inordinate amount of time; therefore, many program paths will go untested.

    Black Box Testing


    Different Testing Methodologies

    Unit testing: Unit testing is a development procedure where programmers

    create tests as they develop software. The tests are simple short tests that

    test functionality of a particular unit or module of their code

    Integration testing: Testing in which software components, hardware

    components, or both are combined and tested to evaluate the interaction

    between them

    System testing: Testing conducted on a complete, integrated system in

    totality, to evaluate the system's compliance with its specified requirements.

    Provides assurance that the software meets all functional, behavioral, and

    performance requirements established during requirements analysis.
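    As a small illustration of a developer-written unit test (not from the original
    slides), the sketch below uses Python's unittest module to exercise one
    hypothetical unit in isolation.

    # A short unit test for a single function, in the style described above.
    import unittest

    def word_count(text):
        """Unit under test: count whitespace-separated words."""
        return len(text.split())

    class WordCountTest(unittest.TestCase):
        def test_empty_string_has_no_words(self):
            self.assertEqual(word_count(""), 0)

        def test_counts_whitespace_separated_words(self):
            self.assertEqual(word_count("quality is free"), 3)

    if __name__ == "__main__":
        unittest.main()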


    Sanity testing: Brief test of major functional elements of a piece of software to
    determine if it is basically operational, so that the build can be accepted for further
    testing. Only basic tests on major functionalities are performed, without bothering
    with finer details. Also referred to as Smoke testing.

    Functionality testing: Checking for the correct functionality of the system, i.e.
    to check whether the desired output is obtained for a given valid input.

    Usability testing: is a means for testing/measuring how well people can use the

    application/product for its intended purpose

    Compatibility testing: Testing whether the system is compatible with other

    systems with which it should operate or communicate. To test the functionality

    and performance of a software application across multiple platform configurations

    like operating systems, browsers, databases, servers, clients, hardware, other

    software, previous versions of the software etc.

    Different Testing Methodologies


    Performance testing: The process of testing the run-time performance of the
    software, to see whether the system is performing up to the client's performance
    requirements. Some of the aspects include

    Connection time

    Response time

    Send time

    Process time

    Transaction time

    Load testing: Is to define the maximum amount of work a system can handle without

    significant performance degradation.

    Stress testing: Is the process of determining the ability of the system to maintain a

    certain level of effectiveness under unfavorable conditions. Evaluates the extent to

    which a system keeps working when subjected to extreme workloads or when some of

    its hardware or software has been compromised

    Different Testing Methodologies
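    As a rough illustration of one performance aspect listed above, the sketch below
    (not from the original slides) measures response time by timing repeated calls to a
    stand-in operation; real performance and load testing is normally done with
    dedicated tools against the deployed system.

    # Minimal response-time measurement sketch; `operation_under_test` is a stand-in.
    import time

    def operation_under_test():
        time.sleep(0.01)  # placeholder for the real request or transaction

    def measure_response_time(runs=50):
        timings = []
        for _ in range(runs):
            start = time.perf_counter()
            operation_under_test()
            timings.append(time.perf_counter() - start)
        return sum(timings) / len(timings), max(timings)

    if __name__ == "__main__":
        average, worst = measure_response_time()
        print(f"average: {average * 1000:.1f} ms, worst: {worst * 1000:.1f} ms")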


    Acceptance Testing:

    Is to make sure the software works correctly for intended user(s) in his or her normal

    work environment (s).

    Alpha test: a version of the complete software is tested by the customer under the supervision
    of the developer at the developer's site. The developer observes the usage of the system,
    records usage problems and errors, and analyzes the system for bugs in order to resolve them.

    Beta test: a version of the complete software is tested by the customer (or selected personnel)
    at his or her own site, without the developer being present. The system is tested using real
    data in the real user environment, which cannot be controlled by the developer. All problems
    encountered by the users are reported back to the developer at regular intervals.

    User Acceptance test: the final testing process that occurs before a new system is
    accepted for operational use by the client. Its purpose is to get confirmation from the client,
    through trial or review of the object under test, that the system meets the requirement
    specifications

    Different Testing Methodologies


    Monkey testing: is to test an application with stochastic inputs without any specific

    tests in mind.

    Tests are not logical and there is no intent of learning the system.

    No test cases are used.

    Exploratory testing: is to test an application by exploring the application. Learning
    and testing go hand in hand.

    Tests are logical, with an intent of learning the system.

    No test cases are used.

    Ad hoc testing: is to test an application based on acquaintance with the
    application's functionality.

    Tests are logical, with familiarity with system functionality.

    No test cases are used.

    Different Testing Methodologies


    Regression testing

    Testing performed to ensure that the changes made to the
    application don't affect the unchanged parts of the application.

    Changes may be due to Bug fixes or enhancements.

    Retesting is just validating a bug fix and is not regression testing.

    Different Testing Methodologies


    Test Plan

    A test plan is a document describing the scope, approach,

    resources, and schedule of intended testing activities. It identifies

    test items, the features to be tested, the testing tasks, who will do

    each task, and any risks requiring contingency planning

    It is a systematic approach to testing a system such as a machine or

    software. The plan typically contains a detailed understanding of

    what the eventual workflow will be.


    Need for Test Plan

    The process of preparing a test plan is a useful way to think through

    the efforts needed to validate the acceptability of a software

    product.

    The completed document will help people outside the test group

    understand the 'why' and 'how' of product validation

    Software companies cannot afford to develop and test their systems

    on an ad-hoc basis. A well-defined test methodology will help to

    ensure that companies develop their products in the most efficient

    and cost-effective manner.


    Problems

    When a project does not identify its overall approach to testing, it
    gives rise to the problems listed below:

    No clearly defined roles and responsibilities

    No clear test objectives

    Ill-defined test documents

    No strong feedback loop


    Scope

    Scope clauses define what features will be tested. An aid to doing

    this is to prioritize them using a technique such as MoSCoW

    Test Items: The items of software, hardware, and combinations

    of these that will be tested.

    Features to Be Tested: The parts of the software specification to

    be tested.

    Features Not to Be Tested: The parts of the software

    specification to be excluded from testing.


    Resource

    Resource clauses give the overall view of the resources to deliver

    the tasks.

    Environmental Needs: What is needed in the way of testing

    software, hardware, offices etc.

    Responsibilities: Who has responsibility for delivering the various

    parts of the plan.

    Staffing And Training Needs: The people and skills needed to

    deliver the plan.


    Time

    Time clauses specify what tasks are to be undertaken to meet the

    quality objectives, and when they will occur.

    Testing Tasks: The tasks themselves, their dependencies, the

    elapsed time they will take, and the resource required.

    Schedule: When the tasks will take place.


    Quality

    Quality clauses define the standard required from the testing activities.

    Introduction: A high level view of the testing standard required, including what

    type of testing it is.

    Approach: The details of how the testing process will be followed.

    Item Pass/Fail Criteria: Defines the pass and failure criteria for an item being

    tested.

    Test Deliverables: Which test documents and other deliverables will be

    produced.


    Risk

    Risk clauses define in advance what could go wrong with a plan and the measures that will be taken to deal with these problems.

    Suspension Criteria And Resumption Requirements: This is a particular

    risk clause to define under what circumstances testing would stop and restart.

    Risks And Contingencies: This defines all other risk events, their likelihood,
    impact and the countermeasures to overcome them.


    Test Plan Template

    ANSI/IEEE Standard 829-1983 specifies the following test plan outline:

    Test Plan Identifier

    Introduction

    Test Items

    Features to be Tested

    Features Not to Be Tested

    Approach

    Item Pass/Fail Criteria

    Suspension Criteria and Resumption Requirements


    Test Deliverables

    Testing Tasks

    Environmental Needs

    Responsibilities

    Staffing and Training Needs

    Schedule

    Risks and Contingencies

    Approvals

    Sample Test plan document

    Test Plan Template


    To Conclude

    The role of a test plan is to guide all testing activities. It defines what is to be tested
    and what is to be left out, how the testing is to be performed (described at a
    general level) and by whom.

    It is therefore a managerial document, not a technical one;
    in essence, it is a project plan for testing.


    Test cases

    &

    Test Reports


    What is a test case?

    Why are test cases written?

    Contents in the Test Case Document

    Characteristics of a good test case

    A Sample Test case Document

    Sample Test cases

    Test report

    Importance of test reports

    Types of Test Reports

    Test results

    Sample Test Result document

    Test Incident report

    Test Summary Report

    Contents of a Summary Report

    Release Metrics

    Contents


    What is a test case?

    A set of test inputs, execution conditions, and expected results developed for a

    particular objective, such as to exercise a particular program path or to verify

    compliance with a specific requirement.

    Documentation specifying inputs, predicted results, and a set of execution conditions

    for a test item.

    The main goal of designing any test case is to find bugs in the software under test.

    Thus, it is important that the tester designs test cases that have the highest likelihood

    of finding the most errors with a minimum amount of time and effort.

    Note that the process of developing test cases can help find problems in the

    requirements or design of an application, since it requires completely thinking through

    the operation of the application. For this reason, it's useful to prepare test cases early

    in the development cycle if possible.


    Why are test cases written?

    To ensure coverage of requirements.

    Coverage of all functional requirements

    Coverage of all non-functional requirements

    Segregates the functional and non-functional requirements into
    different testing types.

    Provides a common understanding of requirements among the
    testers.

    Serves as a single point of reference during test execution.

    Provides a metric for the percentage of tests executed and streamlines the
    testing process.


    Contents in the Test Case Document

    Generally, a test case document contains the following sections:

    Test Case ID

    Test Case Title

    Procedure

    Expected Result

    Other information that can be included:

    Section (Application feature)

    Test Type (User interface, functionality)

    Priority (Importance of test case)

    Pre Conditions

    Requirement# etc
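    For illustration (not from the original slides), a single test case with the fields
    listed above might be captured as follows; in practice this usually lives in a
    spreadsheet or a test-management tool, and every value here is hypothetical.

    # One test case expressed with the sections described above.
    test_case = {
        "Test Case ID": "TC-LOGIN-001",
        "Test Case Title": "Login succeeds with valid credentials",
        "Section": "Login",                       # application feature
        "Test Type": "Functionality",
        "Priority": "High",
        "Pre Conditions": "A registered user account exists",
        "Procedure": [
            "Open the login page",
            "Enter a valid username and password",
            "Click the Login button",
        ],
        "Expected Result": "The user is taken to the home page",
        "Requirement#": "REQ-AUTH-01",            # hypothetical requirement reference
    }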


    Characteristics of a good test case

    Can be categorized under two groups

    Design and functionality related

    Accurate

    Reasonable (probability of catching a bug)

    Realistic (possibility of occurrence)

    Feasible (practically possible to execute)

    Unique (not redundant)

    Documentation and Management related

    Reviewable

    Traceable

    Reusable

    Well structured individually, well organized as a group

    Maintainable

    Eases logging of test results


    A Sample Test case Document


    Sample Test cases


    Test report

    The Software Test Report (STR) is a record of the qualification testing

    performed on a software system or subsystem, or other software-related

    item.

    A document describing the conduct and results of the testing carried out for a

    system or system component.

    A document produced at the end of the test process summarizing all testing

    activities and results is called a test report. It also contains an evaluation of

    the test process and lessons learned.

    Once testing is completed, testers generate metrics and make final reports on

    their test effort and whether or not the software tested is ready for release.


    Importance of test reports

    Test Report is important

    To measure efficiency of the QA team in finding bugs

    To help the acquirer assess the testing and its results

    To help decide about the release of the product/application

    To list the known issues for the Go-Live decision


    Types of Test Reports

    Software test reporting happens at two phases:

    Reporting while running the tests

    Test Log/Test results

    Test Incident Reports

    Reporting after the completion of testing

    Test Summary reports

    Release metrics


    Test results

    The test result document consists of the result of each test case execution. The
    contents of the Test Result document include:

    Case Summary

    Test Environment Details

    Test case ID

    Test Type

    Test Case Title

    Expected Result

    Actual Result

    Test result (Passed/Failed/Skipped)

    Bug ID

    Notes/Comments


    Sample Test Result document


    Communication & Documentation


    Communication through E Mail

    MS Word

    MS Excel

    MS Power Point

    MS Paint

    MS Visual Source Safe (VSS)

    Contents


    Communication through E Mail

    Introduction to Communication through E-mail

    Anatomy of an E-mail Message: Facets and Structure

    Communication through E-mail with Peers

    Communication through E-mail with Superiors

    Communication through E-mail with Client


    MS Word

    To make a text bold, italic, and underlined

    Select the text.

    Click B, I, and U respectively from the Formatting toolbar.

    To create a Hyperlink to a Bookmark

    Click Bookmark from the Insert menu to get the Bookmark window.

    Enter a name for the bookmark and click the Add button.

    Click Hyperlink from the Insert menu to get the Insert Hyperlink

    window.

    Select Place in This Document to get a list of all of the bookmarks in

    the current document.

    Select a bookmark and click OK in the Insert Hyperlink window.

    To insert a table using the Toolbar, click the Insert Table icon on the

    toolbar.


    To create a Table of Contents

    Apply Heading 1 style to the major headings in your document.

    Apply Heading 2 style to sub-headings, Heading 3 style to sub-sub-

    headings etc.

    Click where you want the Table of Contents to appear.

    Select Insert > Reference > Index and Tables.

    Click the Table of Contents tab.

    Click OK to get a Table of Contents.

    To highlight a word or a sentence

    Select that word or sentence.

    Click the Highlight icon from the Formatting toolbar.

    MS Word


    To insert a page break

    Click Break from the Insert menu. Break window appears.

    Select Page break from the Break window.

    Click OK.

    To insert a picture

    Select Picture from the Insert menu.

    Click Clip Art, or From File, or From Scanner or Camera, or New

    Drawing, or AutoShapes, or WordArt, or Organization Chart, or Chart,

    and follow the directions and prompts that are given by Microsoft Word.

    To check spelling of a text

    Select the text.

    Click the Spelling and Grammar icon from the Toolbar.

    MS Word


    MS Excel

    To find the sum of data

    Select the cells containing the data.

    Click the AutoSum icon on the Toolbar. The result of the summation

    appears in the cell adjacent to the selection.

    To insert a comment

    Select the cell where you want to insert the comment.

    Click Comment from the Insert menu. A comment box appears.

    Type your comment inside the comment box.

    To create formulas

    Click the cell where you want to enter the formula.

    Type = (an equal sign).

    Click the Function button.

    Select the formula you want, and follow the on-screen instructions.


    MS Excel

    To insert a chart

    Select the data you want to make your chart with.

    Click Chart on the Insert menu. Chart Wizard appears.

    Select the type of chart you want.

    Confirm or change your data range.

    Update the Chart Options.

    Select whether you want to put the chart in the current worksheet or in a

    new worksheet.


    MS Power Point

    After you open Microsoft PowerPoint, a screen pops up asking if you would

    like to create a New Presentation or Open An Existing Presentation.

    AutoContent Wizard creates a new presentation by prompting you for

    information about content, purpose, style, handouts, and output. The new

    presentation contains sample text that you can replace with your own

    information. Follow the directions and prompts that are given by Microsoft

    PowerPoint.

    Design Template creates a new presentation based on one of the

    PowerPoint design templates supplied by Microsoft. Use what is already

    supplied by Microsoft PowerPoint and change the information to your own.

    Blank Presentation creates a new, blank presentation using the default

    settings for text and colors.


    MS Paint

    To add a screen shot to a document

    Capture a screen or an active window.

    Open Paint.

    From File menu, click New. A new work area appears.

    From the Edit menu, click Paste. The image of the screen or the active

    window gets pasted on the new work area.

    Select the entire image or a portion of the image with the help of the

    Select tool.

    Click Copy or Cut from the Edit menu.

    Click Paste in the document where you want the entire image or the

    portion of the image to appear.

    You can highlight an error or a button with the help of Rectangle or Ellipse

    tool.


    MS Visual Source Safe (VSS)

    It is a Revision Control tool for managing multiple revisions of the same unit

    of information.

    It is most commonly used in software development to manage ongoing

    development of digital documents like application source code and other

    critical information that may be worked on by a team.

    VSS creates a "virtual library" of files.

    Users can read any of the files in the library at any time, but in order to

    change them, they must first check out the file.

    They are then allowed to modify the file and finally check it in.

    Only one user can check out a file at a time.


    Bugs And Bug Reporting


    Bug

    What is a bug?

    Cost of bug

    Importance of bug in testing life cycle

    Different states of a bug

    Bug reporting

    Bug Report

    Why Bug Reporting?

    Bug report structure

    Sample Bug report

    Ten tips for good bug report

    Bug Life Cycle

    Bug severity, likelihood, and priority

    Severity and Priority

    Summary

    Bug Tracking tool

    Contents


    BUG

    Why do we call a software defect a bug?

    The term came into use after a moth flew into an early computer and
    caused a system failure.

    The first computer bug was a moth found trapped between points at Relay #70,
    Panel F, of the Mark II Aiken Relay Calculator (an electromechanical relay computer)
    while it was being tested at Harvard University on 9 September 1947. The
    operators affixed the moth to the computer log, with the entry: "First
    actual case of bug being found". They put out the word that they had
    "debugged" the machine, thus introducing the term "debugging a computer
    program".


    What is a Bug?

    Bug (often called a Defect) is:

    an unwanted behavior of the system

    a mismatch to the functional requirements of the system

    unexpected behavior from product end-user point of view

    an unexpected item, a diversion from the expected result, a design which is not as

    per requirements, a text which is wrong, an incorrect value, a fault in data

    validation, a missing line etc

    A bug is the outcome of software testing, which is the main means of ensuring
    product stability


    Cost of fixing bugs

    The cost of fixing a defect found early is low compared with a defect found in later stages.

    [Chart: Cost of fixing defects by phase. Requirements: $1, Design: $3.50,
    Coding: $7.50, Testing: $51, Maintenance: $101]


    Importance of bug in testing life cycle

    A bug is the only source that represents the system's misbehavior

    Bugs/Defects have cost associated with them

    It is necessary to identify all the bugs before the product is launched

    The high priority bugs should be addressed first and fixed to release the

    product/application


    Different states of a bug

    New: A bug is reported and is yet to be assigned to developer

    Open: The test lead approves the bug

    Assigned: Assigned to a developer and a fix is in progress

    Need more info: When the developer needs information to reproduce the bug

    or to fix the bug

    Fixed: Bug is fixed and waiting for validation

    Closed: The bug is fixed, validated and closed

    Rejected: Bug is not genuine

    Deferred: Fix will be placed in future builds

    Duplicate: Two bugs mention the same concept

    Invalid: Not a valid bug

    Reopened: Bug still exists even after the bug is fixed by the developer
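    For illustration (not from the original slides), the states above can be sketched
    as an enumeration with a map of typical transitions; actual workflows vary by team
    and by bug-tracking tool.

    # Bug states and typical (illustrative, not exhaustive) transitions between them.
    from enum import Enum

    class BugState(Enum):
        NEW = "New"
        OPEN = "Open"
        ASSIGNED = "Assigned"
        NEED_MORE_INFO = "Need more info"
        FIXED = "Fixed"
        CLOSED = "Closed"
        REJECTED = "Rejected"
        DEFERRED = "Deferred"
        DUPLICATE = "Duplicate"
        INVALID = "Invalid"
        REOPENED = "Reopened"

    TRANSITIONS = {
        BugState.NEW: {BugState.OPEN, BugState.REJECTED, BugState.DUPLICATE, BugState.INVALID},
        BugState.OPEN: {BugState.ASSIGNED, BugState.DEFERRED},
        BugState.ASSIGNED: {BugState.NEED_MORE_INFO, BugState.FIXED},
        BugState.NEED_MORE_INFO: {BugState.ASSIGNED},
        BugState.FIXED: {BugState.CLOSED, BugState.REOPENED},
        BugState.REOPENED: {BugState.ASSIGNED},
    }

    def can_move(current, nxt):
        """Return True if `nxt` is a typical next state after `current`."""
        return nxt in TRANSITIONS.get(current, set())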


    Bug reporting

    The aim of a bug report is to enable the programmer to see the program

    failing in front of them, by giving them careful and detailed instructions on

    how to make it fail. If they can make it fail, they will try to gather extra

    information until they know the cause. If they can't make it fail, they will

    have to ask you to gather that information for them.

    A bug report should contain the actual facts

    The purpose of reporting the bug is to get the bug fixed.


    Bug Report

    To write a fully effective report you must:

    - Explain how to reproduce the problem

    - Analyze the error so you can describe it in a minimum number of steps.

    - Write a report that is complete and easy to understand.

    Write bug reports immediately; the longer you wait between finding the

    problem and reporting it, the more likely it is the description will be

    incomplete, the problem not reproducible, or simply forgotten.


    Why Bug Reporting?

    Preparing Bug reports is extremely important for the following reasons:

    To enable the programmer to see the program failure, by giving them
    careful and detailed instructions on how to make it fail.

    Analysis of the build stability can be done by bug reporting

    Will help in keeping track of the issues and helps in stabilizing the system


    Bug report structure

    Title of the bug - brief description

    Steps to reproduce

    Accounts used to generate the bug

    Release/build in which the bug is identified

    Reproducibility frequency

    Severity

    Environment details

    Screenshots
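    For illustration (not from the original slides), a report following this structure
    might look like the sketch below; every value is made up for the example.

    # A hypothetical bug report using the fields listed above.
    bug_report = {
        "Title": "Login page shows a blank screen after submitting valid credentials",
        "Steps to reproduce": [
            "1. Open the login page",
            "2. Enter a valid username and password",
            "3. Click the Login button",
        ],
        "Accounts used": "testuser01 (standard user)",
        "Release/Build": "2.3.1 build 4587",
        "Reproducibility": "3 out of 3 attempts",
        "Severity": "High",
        "Environment": "Windows 10, Chrome",
        "Screenshots": "login_blank_screen.png attached",
    }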


    Sample Bug report


    Ten tips for good bug report

    Structure: test carefully (Use deliberate, careful approach to testing)

    Reproduce: test it again (rule of thumb 3 times)

    Isolate: test it differently (Change variables that may alter symptom)

    Generalize: test it elsewhere

    Does the same failure occur in other modules or locations?

    Compare: review results of similar tests

    (Same test run against earlier versions)

    Summarize: relate test to customers (Put a short tag line on each report)

    Condense: trim unnecessary information (Eliminate extraneous words or steps)

    Disambiguate: use clear words (Goal: Clear, indisputable statements of fact)

    Neutralize: express problem impartially (Deliver bad news gently)

    Review: be sure (Peer review)


    Bug Life Cycle



    Bug severity, likelihood, and priority

    Every bug has two attributes

    Severity: The impact of a bug on a system or user

    Likelihood: The probability of occurrence / frequency

    Severity and Likelihood helps in deciding priority.

    Bug priority establishes the importance and order in which a bug should be

    fixed.

    The urgency of a bug fix is decided in bug triage meetings.

    This field is used by maintainers and release coordinators to prioritize

    work that still needs to be done.

    Priority changes with respect to the release date; a
    P4 issue may become P1 in the last build before release.


    Severity and Priority

    Submitter of defect chooses severity and likelihood

    May later correct if determined to be an exaggeration or in error

    Priority is assigned and changed w.r.t. time

    Product Owners / Managers / Leads may change the priority using their

    judgment.

    Priority is Business; Severity is Technical.

    Priority is Relative; Severity is Absolute.


    Summary

    Be ready to provide extra information if the programmer needs it. If they

    didn't need it, they wouldn't be asking for it. They aren't being deliberately

    awkward. Have version numbers at your fingertips, because they will probably

    be needed.

    Write clearly. Say what you mean, and make sure it can't be misinterpreted.

    Above all, be precise. Programmers like precision.


    Bug Tracking tools

    A tool that facilitates the recording and status tracking of incidents (or

    issues/bugs) found during testing is a bug tracking/incident tracking tool.

    They often have workflow-oriented facilities to track and control the

    allocation, correction and re-testing of incidents and provide reporting

    facilities.

    Different bug tracking tools :

    Bugzilla

    QTrack

    Teamtrack

    Test Director

    Fogbugz
