Automation Architecture 1

Transcript
Slide 1/54

    Training - Automation Architect

    Test Automation Strategy and Design

Slide 2/54

Agenda - Day 1

Assurance Services

    Automation Strategy, Architecture, Automation Assessment and Consulting

Test Automation Strategy Overview and Objectives

    Automation Feasibility Analysis, Automation Prioritization

    Feasibility Analysis

    Feasibility Study Methodology

    Technical Feasibility

    Automation Quotient

    Automation Scope Definition

    Analysis

    Scope Definition

    Design

    Data Driven Framework

    Keyword Driven framework

    Build Verification Test / Deployment Verification Test

    Pilot

    Activities to be done in Pilot

    Pilot Reporting

Slide 4/54

    Introduction from a Consulting Firm

Slide 5/54

    Automation Strategy

    Automation Objectives

Faster time to market

Increase in test effectiveness

Effort/cost savings

Decrease in tester dependency

    Automation Strategy Guidelines

Test automation is a full-time effort, not a sideline.

The test design and the test framework are totally separate entities.

The test framework should be application-independent.

The test framework must be easy to expand and maintain.

The test strategy/design should be framework-independent.

The test strategy/design should remove most testers from the complexities of the test framework.

Slide 6/54

Rationale behind the scope (Feasibility Analysis)

    Scope

    Test Types/ Activities

    Shakedown testing

    Regression

    Business process tests

    Data creation/ conditioning

    Other considerations

    Test case consolidation and prioritization

    Framework

    Data Driven

Keyword/Table driven

    Hybrid

    Design

    Script Design

    Data Sheet Design

    Integration Design

    Component based model in BPT

Others: Estimation, ROI, Maintenance, Execution


Slide 7/54

    Feasibility Analysis

Slide 8/54

    Feasibility Analysis

    Automation Need Evaluation

    Type of Automation

Activities/types of tests to be automated:

Shakedown tests (to determine test-readiness)

    Regression tests

    Data creation/ conditioning activity

    Tests to be run on multiple builds/ releases

IT strategy (choice of toolsets, application strategy, etc.)

    Future upgrades

Is there any major upgrade in the offing? (If yes, check whether the automation would need a major overhaul and whether the proposed tool will support the technology of the proposed upgrade.)

    Activity expected in future

Is the application likely to be retired or replaced, and therefore to see very little activity in the near future?

Slide 9/54

    Automation Feasibility Study Methodology

    Study Sources

    Questionnaire

    Demo

    Discussions

    Documents

    Analysis

Discussions: discussions with customer champions on the findings and recommendations

Automation quotient

- Availability of automated scripts
- Test coverage level
- Completeness of documents
- % of data-variant test conditions
- Control over input & output
- Repetitiveness in test conditions
- Effort distribution across test phases
- Frequency of releases
- Potential impact of patches/small fixes
- Effort and time for automation maintenance
- Stability of the GUI
- Churn rate of the regression test bed
- SME bottlenecks
- Type of defects found (explorative, functional, GUI)

Technical quotient

- Bounded tests
- Control over input & output
- Manual interventions
- Consistency of object properties
- Object identification
- Validation complexity (business rules, attribute validation, static or predictable results)
- Effort required for workarounds

Other factors: application enhancement roadmap, business criticality

Recommendations: solution & implementation approach, timeline & effort, estimated benefits

Slide 10/54

    Sample Feasibility Analysis Document

    Sample Document

Slide 11/54

Tool Evaluation - Factors to evaluate a test tool

    Technology stream

    Support for the current technology stream

    Support for the new technologies being planned as per IT strategy (if any)

    Total cost for the tool

    Cost of acquiring license

Cost of AMC (annual maintenance contract)

    Cost for add-ins required as per technology stream

Cost of any additional integration exercise

Comparison should be made against similar license types (e.g., a global license if offshore delivery is considered)

    Exit cost for leaving existing tools

    Sunk investment in direct assets (code etc.) that cannot be reused

Cost of exiting resources whose skill set is no longer required

Cost of changes invested in other tools (e.g., any custom integration with a test management tool)

Slide 12/54

Tool Evaluation - Factors to evaluate a test tool (contd.)

    The entry barrier

    Ease of training new resources

Ease of finding skilled resources in the market

Availability of functional libraries

    Ease of Integrating with other tools

    Strategic Vendor Rating

    Use "Gartner's Magic Quadrant" to see if the strategic direction of the vendor is in syncwith company needs.

User comfort level

Technical proficiency required to script

    Any special skill required to work on the tool

    Technical Support level

    How much support is required from the vendor?

    What kind of support is required and how often?

Potential to resolve current issues: if a tool is already in use, what issues are faced? Can the new tool handle them?

Comparison of the capabilities of market-leading test tools

Slide 13/54

    Tool Landscape

Slide 14/54

    Automation Scope Definition

Slide 15/54

    Scope Definition

    Regression Testing

Shakedown Testing / Build Verification Test / Smoke Test

    Business process tests

    Data creation/conditioning

    Effort Analysis

    Test design

    Test Data setup

    Test execution

    Test Reporting

    Test Case Analysis

    Type of Test Case

Number of data-variant test cases

Number of unique validations

    Defect Analysis

    % of User Interface defects

% of withdrawn defects due to test data issues

    % of Regression Defects

    % of Defects Slippage

    % of Functional Defects

Slide 16/54

Automation Analysis

[Diagram: a prioritization flow. A completeness analysis feeds complete, business-critical test cases into the automation analysis. Functions are then plotted on business criticality versus automatability: high-priority functions that are ready to automate are taken up for automation; items low on both axes are ignored; items with low automatability but high criticality go to alternate methods/selective automation; the remainder are evaluated further. Phasing is based on business context and availability, with gap analysis and completion as facilitation steps.]

Facilitation Analysis: analysis of test case completeness, test case review, and tool availability, accessibility and offshore-ability

Business Criticality Analysis: analysis of the day-in-life (DIL) factor, system criticality, potential legislative impact, and financial and consumer impact

Automatability Analysis: analysis of data variance, potential degree of automation, control of input and output, complexity of the case and setup requirements, potential reusability across phases, and potential savings/benefits

The analysis is performed at two levels: function-level analysis and test-case analysis.
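To make the decision matrix concrete, here is a minimal scoring sketch in Python. It is illustrative only: the deck does not prescribe numeric scores or thresholds, so the 0..1 normalization and cut-offs below are assumptions.

```python
def automation_decision(business_criticality, automatability):
    """Map two scores (assumed normalized to 0..1) to the slide's outcomes."""
    HIGH, LOW = 0.7, 0.3  # assumed cut-offs, not from the deck
    if business_criticality >= HIGH and automatability >= HIGH:
        return "Undertake automation (ready to automate)"
    if business_criticality <= LOW and automatability <= LOW:
        return "Ignore"
    if automatability <= LOW:
        return "Alternate methods / selective automation"
    return "Evaluate further"

for bc, a in [(0.9, 0.8), (0.9, 0.2), (0.2, 0.2), (0.5, 0.5)]:
    print((bc, a), "->", automation_decision(bc, a))
```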

Slide 17/54

    Design

Slide 18/54

Framework

Data-Driven

- Test input/output values are read from data files
- Objects and actions are stored in the script
- Easiest and quickest to implement
- Low potential for long-term success
- A built application is a precondition

Keyword or Table Driven

- Object, action, input data, and expected result are all typically externalized, in one record
- Hardest and most time-consuming data-driven approach to implement
- Greatest potential for long-term success
- Table creation can start before the application is built

Hybrid

- Data-driven with externalization of more than the data (objects/part of the actions)
- Keyword-driven with business keywords only (step tables/test tables)
- A combination of data-driven and keyword-driven (due to knowledge constraints)
- HP-Mercury QC BPT (Business Components model)

Slide 19/54

    Data Driven Script Design

Driver Script

Initializes the setup required for the run and calls the scripts in the desired order.

Main Script

Calls the functional scripts in order and executes the test case.

Functional Script

Common functionality or module used in many business functions.

Recovery Scenario Script

Run-time exception-handling scripts.

Report Script

Generates the script execution reports.

Utility Script

Utility function scripts (can also be a VBS file).

[Diagram: the driver script reads the data sheet (an Excel sheet or any other data source, presented as a data grid) and invokes the script library (main, recovery scenario, functional, report, and utility scripts) against the application under test.]
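The roles above can be summarized in a small, tool-agnostic sketch. The deck's tooling is Mercury QTP (VBScript); this Python version is an illustration only, and the file name, column names, and functional-script names are hypothetical.

```python
import csv

def login(row):
    """Functional script: a common module reused by many business functions."""
    assert row["user"], "user field must not be empty"

def create_order(row):
    """Functional script for the business function under test."""
    assert float(row["amount"]) > 0, "amount must be positive"

MAIN_FLOW = [login, create_order]  # main script: functional scripts in order

def driver(path="datasheet.csv"):
    """Driver script: initialize, run each data row through the main flow,
    recover from failures, and report the results."""
    results = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # assumed columns: test_id, user, amount
            try:
                for step in MAIN_FLOW:
                    step(row)
                results.append((row["test_id"], "PASS"))
            except Exception as exc:  # recovery scenario: log and continue
                results.append((row["test_id"], f"FAIL: {exc}"))
    for test_id, outcome in results:  # report script
        print(test_id, outcome)
```

The driver stays generic: adding a test case means adding a row to the data sheet, not touching the scripts.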

Slide 20/54

    Data Sheet Design

    Input

    Row heading for the screen

Mandatory and optional fields marked in different colors

    Field values in the list box

    Data Grid fields are directly mapped to the application objects

    Expected

Separate section/sheet; protect the cells from user input

Formulae to calculate the expected values

    Comparison

Either the script or the data sheet can be used for the comparison.
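A minimal sketch of the expected and comparison sections, assuming a hypothetical data sheet in which the expected value is derived by a formula from the input fields:

```python
def expected_total(row):
    """Formula-cell equivalent: expected = quantity * unit_price."""
    return float(row["quantity"]) * float(row["unit_price"])

def compare(row, actual_total):
    """Comparison section: PASS when the actual result matches the formula."""
    return "PASS" if abs(expected_total(row) - actual_total) < 1e-9 else "FAIL"

print(compare({"quantity": "3", "unit_price": "2.50"}, 7.50))  # PASS
```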

Slide 21/54

    Keyword/Table Driven Framework

[Diagram: architecture of the keyword/table-driven framework.]

Suite driver sheet and script: run flags per test (Run T1 - Y, Run T2 - Y, ..., Run TN - N)

Datasheet per test (e.g., T1): keywords for component, action, utility, and results

Object map: Component 1, Component 2, Component 3

Keyword interpreter

Test driver

Functional library: business functions, utility functions, component functions, result functions

The test tool drives the applications under test (AUT 1, AUT 2, AUT 3); results, along with manual testing results, are ported to QC through the QC API.

(AUT = Application Under Test; QC = HP Quality Center)
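The keyword interpreter at the heart of this architecture can be sketched in a few lines (illustrative Python; the keywords, step table, and action set are assumptions, not the framework's actual vocabulary):

```python
ACTIONS = {
    "enter": lambda obj, data: print(f"typing {data!r} into {obj}"),
    "click": lambda obj, data: print(f"clicking {obj}"),
    "verify": lambda obj, data: print(f"verifying {obj} == {data!r}"),
}

STEP_TABLE = [  # datasheet T1: object, action, input data / expected result
    ("username_box", "enter", "alice"),
    ("login_button", "click", None),
    ("welcome_label", "verify", "Welcome, alice"),
]

def interpret(steps):
    """Keyword interpreter: look each action up and apply it to the object."""
    for obj, action, data in steps:
        ACTIONS[action](obj, data)

interpret(STEP_TABLE)
```

Because the step tables are plain data, they can be written before the application itself is built, which is the long-term advantage noted on the framework comparison slide.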

Slide 22/54

BVT Automation Framework - Key Pointers

    Uses descriptive programming method

    XML and Excel based

    Up to 40%+ faster testing on builds

Extensible: reports, checks

Object checks are parameterized, giving easy maintenance on UI changes or test-requirement changes

    Catches defects early, saves rework

[Diagram: BVT framework flow. A tool-independent, XML-based object property list repository holds an optimized property list for the objects in the application under test. Expected object property result sheets are generated automatically for UI validation; a data flow sheet supplies additional checks; shakedown execution then produces customized reports whose granularity and format are user-defined. Delivered as ready-to-use scripts with customize-and-use guidelines.]
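A hedged sketch of the kind of XML-driven object-property check such a framework performs. The XML layout, property names, and the dictionary standing in for the live UI are assumptions, not the framework's actual schema.

```python
import xml.etree.ElementTree as ET

EXPECTED = """<screen name="login">
  <object name="username_box" enabled="true" visible="true"/>
  <object name="login_button" enabled="true" visible="true"/>
</screen>"""

def check_properties(xml_text, actual):
    """Compare expected object properties against the actual UI state."""
    failures = []
    for obj in ET.fromstring(xml_text).iter("object"):
        name = obj.get("name")
        for prop, want in obj.attrib.items():
            if prop == "name":
                continue
            got = actual.get(name, {}).get(prop)
            if got != want:
                failures.append(f"{name}.{prop}: expected {want}, got {got}")
    return failures

actual_ui = {"username_box": {"enabled": "true", "visible": "true"},
             "login_button": {"enabled": "false", "visible": "true"}}
print(check_properties(EXPECTED, actual_ui))  # flags login_button.enabled
```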

Slide 24/54

    Pilot

Slide 25/54

    Pilot Objectives and considerations

Aims of the pilot:

Define the automation framework for the main phase

Define guidelines for selecting test scenarios amenable to automation

Identify the scenarios/functions to be automated, based on the ROI indicators measured during the pilot phase of the project

Work out a detailed plan to automate the rest of the test cases (those that can be automated)

Investigate potential roadblocks to offshoring the automation project and come up with alternatives

    While scoping a Pilot, the following are key considerations:

Select a representative set of business scenarios using a systematic approach:

- Involving multiple human interaction points

- Multiple platform / online-batch jumps / interfaces

Account for the frequency of operation of a business case and the complexity of the test case

Slide 26/54

Pilot - Activities to be done in the Pilot

Functionality Identification

    Factors

Complexity

Coverage

Priority

    Test cases Identification

Coverage, including positive and negative cases

    Test Data Preparation

    Prepare Test Data for each test case

    Preconditioning the data

    Data sheet Preparation

    Identify the Fields needed for Functionality under test

    Prepare the Data Sheet with Identified fields as per the Data Sheet Design

Make an entry for all the test cases in the data sheet

Key in the test data for each test case in the corresponding fields

Slide 27/54

Pilot - Activities to be done in the Pilot (contd.)

    Automating the Functionality under Test

Define every feature of the functionality

Modularize the units of functionality under test

Create all the related steps of the functionality

Make the necessary modifications to the script to make it data-driven

    Unit Testing

    Execution

    Run the Script with the data available in the Data sheet

    Report the Execution Results as per the defined format

    Metrics to Management

    Benefits out of Automation

    Possible Risks and Mitigation

Slide 28/54

    Estimation

Slide 29/54

    Effort Estimation Criteria

    Categorization of Complexity

Number of objects

Type of objects

Type of actions

Number of steps

Number of verifications/syncs/outputs

Degree of reusability

Time for understanding

Time for coding (including visiting existing libraries for reusability)

Time for testing the scripts

Time for unique vs. reused scripts

Time for user documentation

Time for review

Time for fixes

Framework setup

Dependencies
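A minimal sketch of how these criteria might be combined into a complexity category and an effort figure. The weights, thresholds, and hours are illustrative assumptions; the deck gives no numbers.

```python
def complexity(num_objects, num_steps, num_verifications):
    """Bucket a script by a weighted size score (weights are assumptions)."""
    score = num_objects + num_steps + 2 * num_verifications
    if score < 20:
        return "simple"
    if score < 50:
        return "medium"
    return "complex"

# Hypothetical per-category effort in hours, covering understanding, coding,
# testing the scripts, documentation, review, and fixes:
EFFORT_HOURS = {"simple": 4, "medium": 10, "complex": 24}

category = complexity(num_objects=8, num_steps=15, num_verifications=5)
print(category, EFFORT_HOURS[category], "hours")  # medium 10 hours
```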

Slide 31/54

    ROI Calculation

Slide 32/54

    Payback from Automation

          Savings due to increased efficiency + Additional productivity benefits
Payback = -----------------------------------------------------------------------
          Tool cost (or AMC) + Maintenance cost + Build cost

Note 1: Tool cost can be ignored if the tool has already been procured and is unutilized.
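A worked example of the payback formula with hypothetical figures (all amounts are assumptions for illustration only):

```python
efficiency_savings = 120_000    # savings due to increased efficiency
productivity_benefits = 30_000  # additional productivity benefits
tool_cost = 60_000              # license cost (or AMC if already procured)
maintenance_cost = 25_000
build_cost = 40_000

payback = (efficiency_savings + productivity_benefits) / (
    tool_cost + maintenance_cost + build_cost
)
print(f"payback ratio: {payback:.2f}")  # 1.20
```

A ratio above 1 means the savings outweigh the combined tool, maintenance, and build costs.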

    Other popular metrics include:

    % of strategic applications automated

    Test Execution savings due to automation (incl. reduction in TTM)

    % of errors due to testing

    Other benefits include:

- Optimal utilization of tester time
- Increased motivation

Slide 33/54

Metrics - Design Phase (Test Automation Phase)

Metric: Automation quotient per test case (depth)

Objective: Measures the depth of automation within a test case.

What to collect: Number of test steps automated in a test case; total number of test steps in the test case.

Automation quotient = (Number of test steps automated in a test case) / (Total number of test steps in the test case)

When to collect: At the end of the design phase.

Using the metric: A low value indicates shallow automation; the selection of applications/modules for test automation should be re-examined.

Metric: % of reusable scripts for new enhancements or projects

Objective: Measures the reusability achieved.

What to collect: Number of reusable scripts used; total number of components created new.

When to collect: At the end of the design phase.

Using the metric: A high percentage indicates quicker time-to-build; automation can be used earlier in the Test Development Life Cycle (TDLC).
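Both metrics are simple ratios; a short sketch following the definitions above:

```python
def automation_quotient(steps_automated, total_steps):
    """Depth of automation within a test case."""
    return steps_automated / total_steps

def reuse_percentage(reusable_scripts_used, components_created_new):
    """% of reusable scripts relative to newly created components."""
    return 100 * reusable_scripts_used / components_created_new

print(automation_quotient(18, 24))  # 0.75: reasonably deep automation
print(reuse_percentage(30, 40))     # 75.0: quicker time-to-build
```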

Slide 34/54

    Sample Standards and Guidelines

Slide 35/54

    Standards and Guidelines

Need for Standards in Automation

Standards are needed to achieve the following:

    Readability

    Maintainability

    Clarity

    Uniformity

    Various Automation Standards

    Hardware Standards

    Software Standards

    Tool Setting Standards

    General Options and Test Settings

    Object Identification Settings

    Options Settings

    Editor Settings


Slide 36/54

Standards and Guidelines (contd.)

    Recording Standards

Recording standards govern how tests are recorded, according to the configured settings. These settings are tool-specific. For example:

Mercury QTP and WinRunner - recording settings can be set to keep/remove specific object properties before recording starts

Rational Functional Tester - once the object is recorded, its properties can be edited

    Coding Standards

Test script/function names

Version - ensure that versioning is maintained for the scripts

Author - ensure that the author's name is mentioned in the header of the script

Comments - comments about the script and each unit of the script

Descriptions/definitions - ensure that all variables are defined and descriptions are provided for functions

Parameters used - ensure that all parameters indicate their nature (in/out/in-out) and have appropriate comments where needed

Modularization - ensure that scripts are modularized based on functional units

Length of the script - ensure that the script is not too long


Slide 37/54

Standards and Guidelines (contd.)

Path hard-coding - eliminate hard-coded paths

Indentation - code is to be indented for easy readability and debugging

Defining array bounds - arrays are defined in utility functions and therefore need to be dynamic in nature; bounds should not be fixed, allowing them to process data of any size

Defining functions - anything abstracted at the unit level needs to be classified as a function and stored accordingly in the utility repository

Creating functions afresh - to increase efficiency and remove redundancy, a traceability matrix should be maintained to check whether a required function is already available before scripting afresh

Window flow control - wherever possible, window flows are to be defined using keywords; this enables maximum reusability (new scenarios can be added on the fly) and encourages reuse

Nested loops - keep nested loops to a minimum wherever possible

Synchronization - ensure that event loops are used for flow between screens

Reserved words - ensure that no reserved words are used
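An illustrative rendering of several of these standards in Python. The deck's scripts are tool-specific (e.g., QTP/VBScript); the names, environment variable, and checks below are hypothetical.

```python
# Script   : verify_login
# Version  : 1.0   (versioning maintained, per the standards)
# Author   : <author name in the header>
# Comments : one modular functional unit; no hard-coded paths; dynamic sizes
import os

# Path taken from configuration rather than hard-coded into the script:
DATA_DIR = os.environ.get("TEST_DATA_DIR", ".")

def verify_rows(rows):
    """Verify a batch of records.

    rows (in): a list of any size; bounds are not fixed, per the
    dynamic-arrays guideline.
    Returns (out): the number of failing records.
    """
    failures = 0
    for row in rows:  # a single flat loop; nested loops kept to a minimum
        if not row.get("user"):
            failures += 1
    return failures
```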

Slide 38/54

    Standards and Guidelines Contd

    Execution Standards

Ensure that the same environment is set up on all the machines to be used for execution

    Debugging Checklist

A debugging checklist should be maintained. The checkpoints to cover:

Did the code work before?

- No: follow standard debugging.

- Yes: check whether an identical OS is used.

  - If not, it is an OS-specific problem.

  - If yes, check whether the installed service packs are of the same version.

Slide 39/54

Data Sheet Standards and Guidelines (contd.)

    Usability

Ensure that guidelines are provided for each field in the data sheet; this makes the tester independent.

Ensure that the possible values are provided for selection in the data sheet

    Define each section clearly as Input, Output and Final Results

If possible, allow the tester to edit only the input section

    Readability

    Conventions to be followed for each section of data sheet

Naming and color-coding conventions for the variables

Slide 41/54

    Thank You

    Mercury Quick Test Pro Best practices

Slide 42/54

    Mercury Quick Test Pro Best practices

    Usability:

    Data Grid Design Highlights

    Row heading for the screen

    Mandatory and Optional with different colors

    Field values in the list box

    Data Grid fields are directly mapped to the application objects

    Document

Execution manual - describes the activities required to execute the scripts

    Reporting

Readable QTP results through report events

Reports written to a separate results workbook for porting into MQC

Error messages written into the data grid used


Slide 43/54

    Mercury Quick Test Pro Best practices

    Maintainability:

Object handling - minimize maintenance effort

Descriptive objects

The test data and application object properties are used as input to define dynamic objects. Descriptive objects are used to automate dynamic screens.

Example: Account Networks, Benefit Option Network selection

Retrofitting is done in the code.

Static objects

These objects are used to handle static screens/input. They are captured and stored in the test object repository.

Example: Client Screen, Doc Gen, Tie Contacts

Retrofitting is done in the object repository.
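A tool-agnostic sketch contrasting the two styles (QTP's actual descriptive-programming syntax is not shown; the object map and property names below are illustrative):

```python
# Static objects: captured once and stored in a test object repository.
OBJECT_REPOSITORY = {
    "client_screen.save_button": {"type": "button", "id": "btnSave"},
}

def find_static(logical_name):
    """Look a static object up by its logical name in the repository."""
    return OBJECT_REPOSITORY[logical_name]

def find_descriptive(objects, **properties):
    """Descriptive lookup: match objects by their properties at run time,
    so dynamic screens need no repository entry (retrofit happens in code)."""
    return [o for o in objects
            if all(o.get(k) == v for k, v in properties.items())]

screen = [{"type": "link", "text": "Account Network 1"},
          {"type": "link", "text": "Account Network 2"}]
print(find_descriptive(screen, type="link"))  # both dynamic links match
print(find_static("client_screen.save_button"))
```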


Slide 44/54

    Mercury Quick Test Pro Best practices

Maintainability (contd.):

    Coding Standard

A coding standards document makes the entire development cogent and comprehensible:

Descriptions (version, parameters used & global variables)

    Variable Definition

    Content & Length of the Script

    Comments

    Data Grid Standard

A document which makes the entire development cogent and comprehensible

    Script Matrix

    This links together the various scripts used per type

    Traceability Matrix

Traceability of test cases to automation scripts


Slide 45/54

    Mercury Quick Test Pro Best practices

    Integration:

    Results porting Tool

Ports executed results to the regression test set in MQC

API built using the QC Open Test Architecture

Saves manual effort

Reporting

Reports written to a separate results workbook for porting into MQC


Slide 46/54

    Mercury Quick Test Pro Best practices

    Performance:

Unmonitored runs

- Recovery scenarios

Execution speed

- Coding standards

- Combination of descriptive and static objects

- Logical synchronization (no Wait statements are used)

Scripts are stored locally, and a QC integration mechanism is built


Slide 47/54

    Mercury Quick Test Pro Best practices

Data Grid: [screenshot]

Slide 48/54

    Mercury Quick Test Pro Best practices

Reporting: [screenshot]

Slide 49/54

    Mercury Quick Test Pro Best practices

Descriptive objects: [screenshot]

Slide 50/54

    Mercury Quick Test Pro Best practices

Static object repository: [screenshot]

Slide 51/54

    Mercury Quick Test Pro Best practices

Script matrix: [screenshot]

Slide 52/54

    Mercury Quick Test Pro Best practices

Traceability matrix: [screenshot]

Slide 53/54

    Mercury Quick Test Pro Best practices

    Integration:

Executing cases from QC

Pros:
- Ease of execution by a non-technical professional
- Scheduling and selective runs can be done through QC
- Integrated view of results (manual and automated) is possible

Cons:
- The QC link can become a bottleneck
- Latency issues
- Uploading scripts into QC takes longer
- Version management?
- Potential space availability
- Maintainability of scripts

Executing cases outside QC and porting results as a batch

Pros:
- Fast turnaround times
- Integrated view of results (manual and automated) is still possible
- Easier to maintain the scripts
- Good medium for sharing between onsite and offshore

Cons:
- Cannot schedule tests from QC

Other considerations:
- Build a controller script which interfaces with the actual script and the data grid
- Attach the QTP results and the data grid with updated results to the test set

Slide 54/54

    Mercury Quick Test Pro Best practices

Integration (contd.): [screenshot]