
Validation, Verification, and Testing Plan Template

VALIDATION, VERIFICATION, AND TESTING PLAN

Project or System Name

U.S. Department of Housing and Urban Development

Month, Year

Revision Sheet

Release No.    Date       Revision Description
Rev. 0         5/30/00    Validation, Verification, and Testing Plan Template and Checklist
Rev. 1         4/12/02    Conversion to WORD 2000 format

Validation, Verification, and Testing Plan Authorization Memorandum

I have carefully assessed the Validation, Verification, and Testing Plan for the (System Name). This document has been completed in accordance with the requirements of the HUD System Development Methodology.

MANAGEMENT CERTIFICATION - Please check the appropriate statement.

______ The document is accepted.

______ The document is accepted pending the changes noted.

______ The document is not accepted.

We fully accept the changes as needed improvements and authorize initiation of work to proceed. Based on our authority and judgment, the continued operation of this system is authorized.

____________________________________________________

NAME

DATE

Project Leader

____________________________________________________

NAME

DATE

Operations Division Director

____________________________________________________

NAME

DATE

Program Area/Sponsor Representative

____________________________________________________

NAME

DATE

Program Area/Sponsor Director

VALIDATION, VERIFICATION, AND TESTING PLAN

TABLE OF CONTENTS

1.0 GENERAL INFORMATION
1.1 Purpose
1.2 Scope
1.3 System Overview
1.4 Project References
1.5 Acronyms and Abbreviations
1.6 Points of Contact
1.6.1 Information
1.6.2 Coordination

2.0 TEST EVALUATION
2.1 Requirements Traceability Matrix
2.2 Test Evaluation Criteria
2.3 User System Acceptance Criteria

3.0 TESTING SCHEDULE
3.1 Overall Test Schedule
3.2 Security
3.x [Testing Location Identifier]
3.x.1 Milestone Chart
3.x.2 Equipment Requirements
3.x.3 Software Requirements
3.x.4 Personnel Requirements
3.x.5 Deliverable Materials
3.x.6 Testing Tools
3.x.7 Site Supplied Materials

4.0 TESTING CHARACTERISTICS
4.1 Testing Conditions
4.2 Extent of Testing
4.3 Data Recording
4.4 Testing Constraints
4.5 Test Progression
4.6 Test Evaluation
4.6.1 Test Data Criteria
4.6.1.1 Tolerance
4.6.1.2 System Breaks
4.6.2 Test Data Reduction

5.0 TEST DESCRIPTION
5.x [Test Identifier]
5.x.1 System Functions
5.x.2 Test/Function Relationships
5.x.3 Means of Control
5.x.4 Test Data
5.x.4.1 Input Data
5.x.4.2 Input Commands
5.x.4.3 Output Data
5.x.4.4 Output Notification
5.x.5 Test Procedures
5.x.5.1 Procedures
5.x.5.2 Setup
5.x.5.3 Initialization
5.x.5.4 Preparation
5.x.5.5 Termination

1.0 GENERAL INFORMATION

NOTE TO AUTHOR: Highlighted, italicized text throughout this template is provided solely as background information to assist you in creating this document. Please delete all such text, as well as the instructions in each section, prior to submitting this document. ONLY YOUR PROJECT-SPECIFIC INFORMATION SHOULD APPEAR IN THE FINAL VERSION OF THIS DOCUMENT.

The Validation, Verification, and Testing Plan provides guidance for management and technical efforts throughout the test period. It establishes a comprehensive plan to communicate the nature and extent of testing necessary for a thorough evaluation of the system. This plan is used to coordinate the orderly scheduling of events by providing equipment specifications and organizational requirements, the test methodology to be employed, a list of the test materials to be delivered, and a schedule for user (tester) orientation and participation. Finally, it provides a written record of the required inputs, execution instructions, and expected results of the system test.

1.1 Purpose

Describe the purpose of the Validation, Verification, and Testing Plan.

1.2 Scope

Describe the scope of the Validation, Verification, and Testing Plan as it relates to the project.

1.3 System Overview

Provide a brief system overview description as a point of reference for the remainder of the document. In addition, include the following:

Responsible organization

System name or title

System code

System category

Major application: performs clearly defined functions for which there is a readily identifiable security consideration and need

General support system: provides general ADP or network support for a variety of users and applications

Operational status

Operational

Under development

Undergoing a major modification

System environment and special conditions

1.4 Project References

Provide a list of the references that were used in preparation of this document. Examples of references are:

Previously developed documents relating to the project

Documentation concerning related projects

HUD standard procedures documents

1.5 Acronyms and Abbreviations

Provide a list of the acronyms and abbreviations used in this document and the meaning of each.

1.6 Points of Contact

1.6.1 Information

Provide a list of the points of organizational contact (POCs) that may be needed by the document user for informational and troubleshooting purposes. Include type of contact, contact name, department, telephone number, and e-mail address (if applicable). Points of contact may include but are not limited to helpdesk POC, development/maintenance POC, and operations POC.

1.6.2 Coordination

Provide a list of organizations that require coordination between the project and its specific support function (e.g., installation coordination, security, etc.). Include a schedule for coordination activities.

2.0 TEST EVALUATION

2.1 Requirements Traceability Matrix

Prepare a functions/test matrix that lists all application functions on one axis and cross-references them to all tests included in the test plan.
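For illustration only, such a matrix can also be kept in machine-readable form and checked for coverage gaps. The following Python sketch uses invented function names and test identifiers; nothing in it is prescribed by the HUD methodology.

# Hypothetical functions/test matrix: each application function is mapped
# to the set of tests that exercise it (all names are illustrative).
traceability = {
    "Data entry validation": {"Test 5.1", "Test 5.3"},
    "Monthly report generation": {"Test 5.2"},
    "Online query response": {"Test 5.1", "Test 5.2"},
}

# A function mapped to an empty set is a coverage gap in the test plan.
untested = [func for func, tests in traceability.items() if not tests]
print("Functions lacking test coverage:", untested or "none")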

2.2 Test Evaluation Criteria

Decide the specific criteria that each segment of the system/subsystem must meet. Such criteria are described by the user of the system/subsystem and typically are a mix of functional and performance requirements, such as processing data within a certain time frame, producing a report, or responding to an online query within a certain amount of time.

2.3 User System Acceptance Criteria

Describe the minimum function and performance criteria that must be met for the system to be accepted as fit for use by the user or sponsoring organization.

3.0 TESTING SCHEDULE

3.1 Overall Test Schedule

Prepare a testing schedule to reflect the unit, integration, system acceptance, and release tests, as well as the time duration of each. This schedule should reflect the personnel involved in the test effort and the site location. In the test schedule, include the following information:

Documentation review

Test scripts

Data preparation

Test execution

Output review

System certification

System release

Return of test site to pretest condition

3.2 Security

Prepare a list of requirements necessary to ensure the integrity of the testing procedures, data, and site. Any special security considerations (e.g., passwords, classifications, security or monitoring software, or computer room badges) should be described in detail.

3.x [Testing Location Identifier]

This section provides a description of testing locations. Each location should be under a separate section header, 3.3 - 3.x. Identify the location at which the testing will be conducted, and the organizations participating in the test. List the tests to be performed at this location.

3.x.1 Milestone Chart

Provide a chart to depict the activities and events listed below. When preparing the chart, give consideration to all tests scheduled for this location. The activities and events will be presented in chronological order with supporting narrative, as necessary, and will depict, for example:

The overall on-site test period by calendar date, and portions of the period assigned to major portions of the test.

The pretest on-site period required for system test team orientation, familiarization, and for system debugging.

The period assigned for the collection of database values, input values, and other operational data required for system test.

The period assigned for user training, operator training, maintenance and control group training, and management orientation briefing.

The period assigned for preparation, review, and approval of the test analysis report.

3.x.2 Equipment Requirements

Provide a chart or listing of each item of equipment to be employed during the test period, including the quantity required and the period of usage of each. Include any communications and test data reduction equipment.

3.x.3 Software Requirements

Identify any software required in support of the testing when it is not a part of the system being tested. Include systems support, communications, and applications software, their recording and storage media, version number, and media type.

3.x.4 Personnel Requirements

Provide a listing of the personnel necessary to perform the test. For each person, the listing should provide the following information:

Name, title, current organization, grade (if known), and level of security background investigation

Description of the required tasks to be performed

Geographical location of the work to be performed

Time required (dates needed)

Whether the requirement is full time, part time, or as needed

Any special skills required (e.g., programming language, machine familiarity)

3.x.5 Deliverable Materials

Itemize all materials that will be delivered as part of the system test, including the quantity and full identification of each.

3.x.6 Testing Tools

Identify the testing tools to be used during the preparation for and execution of the test.

3.x.7 Site Supplied Materials

Describe any materials required to perform the test that need to be supplied at the test site. These materials could include desks, chairs, special equipment, office supplies, the database and its media, and other inputs and their media.

4.0 TESTING CHARACTERISTICS

4.1 Testing Conditions

Indicate whether the testing will use the normal input and database or whether some special test input is to be used.

4.2 Extent of Testing

Indicate the extent of the testing to be employed. Where limited testing is to be employed, the test requirements will be presented either as a percentage of some well-defined total quantity or as a number of samples of discrete operating conditions or values. Also, indicate the rationale for adopting limited testing.
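Where limited testing is expressed as a percentage of a well-defined total, the sample of test cases might be drawn as in the following sketch; the 10 percent figure and case identifiers are invented for illustration.

import random

# Hypothetical pool of discrete operating conditions to sample from.
ALL_CASES = [f"case_{i:03d}" for i in range(1, 201)]
SAMPLE_FRACTION = 0.10  # limited testing at 10% of the total, for illustration

random.seed(42)  # fixed seed so the selected sample is reproducible
sample = random.sample(ALL_CASES, int(len(ALL_CASES) * SAMPLE_FRACTION))
print(f"Testing {len(sample)} of {len(ALL_CASES)} cases, e.g.:", sample[:5])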

4.3 Data Recording

Indicate data recording requirements for the testing process, including data not normally recorded during system operation.

4.4 Testing Constraints

Indicate the anticipated limitations imposed on the testing because of system or test conditions (timing, interfaces, equipment, personnel).

4.5 Test Progression

For progressive or cumulative tests, explain how progression is made from one test to the next so that the cycle or activity for each test is completely performed.

4.6 Test Evaluation

4.6.1 Test Data Criteria

Describe the rules by which test results will be evaluated.

4.6.1.1 Tolerance

Discuss the range over which a data output value or a system performance parameter can vary and still be considered acceptable.
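As a minimal sketch, assuming a single numeric output with an invented expected value and tolerance (neither is prescribed by this template), a tolerance check can be written as:

import math

EXPECTED_PAYMENT = 1042.17  # hypothetical expected output value
ABS_TOLERANCE = 0.01        # hypothetical acceptable variation (one cent)

def within_tolerance(actual, expected=EXPECTED_PAYMENT, tol=ABS_TOLERANCE):
    """Return True if the actual output is acceptably close to expected."""
    return math.isclose(actual, expected, abs_tol=tol)

print(within_tolerance(1042.174))  # True: differs by less than one cent
print(within_tolerance(1042.30))   # False: outside the stated tolerance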

4.6.1.2 System Breaks

Specify the maximum number of interrupts, halts, or other system breaks that may occur because of non-test conditions.

4.6.2 Test Data Reduction

Describe the technique to be used for manipulation of the raw test data into a form suitable for evaluation, if applicable. The available techniques may include:

Manual collection and collation of system test output into test sequence order, followed by verification of the results.

Automatic inspection of test results as obtained by data recording means, using a test data reduction program, followed by manual inspection of selected test results that do not lend themselves to complete reduction by automatic means.

Automatic inspection of test results specifically recorded for manipulation by the test data reduction program. Test results, as recorded, include all items of test significance. The test data reduction program contains an image of correct data output for an item-by-item comparison of data, and provides a summary of the evaluated test as output.
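As an example, the automatic item-by-item comparison described in the third technique can be sketched as follows; the "item = value" file format and the file names are assumptions made for illustration only.

def load_items(path):
    """Parse recorded output into an {item: value} dictionary."""
    items = {}
    with open(path) as fh:
        for line in fh:
            if "=" in line:
                key, value = line.split("=", 1)
                items[key.strip()] = value.strip()
    return items

def reduce_and_compare(recorded_path, expected_path):
    """Compare recorded output, item by item, against the image of correct
    output, and return a summary suitable for the test analysis report."""
    recorded = load_items(recorded_path)
    expected = load_items(expected_path)
    mismatches = {item: (value, recorded.get(item))
                  for item, value in expected.items()
                  if recorded.get(item) != value}
    return {"items_checked": len(expected),
            "items_passed": len(expected) - len(mismatches),
            "mismatches": mismatches}

# Example usage (file names are hypothetical):
# print(reduce_and_compare("test_run_014.out", "expected_image.dat"))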

5.0 TEST DESCRIPTION

This section provides a description of the tests. Each test should be under a separate section header, 5.1 - 5.x.

5.x [Test Identifier]

Provide a test name and identifier here for reference in the remainder of the section. Describe the test to be performed.

5.x.1 System Functions

Provide a detailed list of the system and communications functions to be tested.

5.x.2 Test/Function Relationships

Provide a list of the tests that constitute the overall test activity. Include a test/function matrix summarizing the overall allocation of the system tests to the functions.

5.x.3 Means of Control

Indicate whether the test is to be controlled by manual, semiautomatic, or automatic means.

5.x.4 Test Data

Identify any security considerations in each of the following subsections.

5.x.4.1 Input Data

Describe the manner in which input data are controlled in order to:

Test the system with a minimum number of data types and values

Exercise the system with a range of bona fide data types and values that test for overload, saturation, and other worst-case effects

Exercise the system with bogus data and values that test for rejection of irregular input
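A minimal sketch of these three classes of input, assuming a single integer field with an invented valid range of 1 to 120 (the field, range, and acceptance rule are illustrative only):

MINIMUM_SET = [1, 60, 120]              # smallest set of bona fide values
WORST_CASE_SET = [1, 2, 119, 120]       # boundary values at and near the limits
BOGUS_SET = ["", "abc", -5, None, 3.5]  # irregular input the system must reject

def accepts(value):
    """Stand-in for the system under test: accept integers within range."""
    return isinstance(value, int) and 1 <= value <= 120

# The system should accept every bona fide value and reject every bogus one.
assert all(accepts(v) for v in MINIMUM_SET + WORST_CASE_SET)
assert not any(accepts(v) for v in BOGUS_SET)
print("Input data classes behave as expected.")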

5.x.4.2 Input Commands

Describe steps used to control initialization of the test; to halt or interrupt the test; to repeat unsuccessful or incomplete tests; to alternate modes of operation as required by the test; and to terminate the test. Include graphic representation if appropriate.

5.x.4.3 Output Data

Identify the media and location of the data produced by the tests. Describe the manner in which the output data are analyzed in order to: detect whether an output is produced; evaluate output as a basis for continuation of the test sequence; and evaluate the test output against the anticipated output to assess system performance.

5.x.4.4 Output Notification

Describe the manner in which output notifications (messages output by the system concerning status or limitations on internal performance) are controlled in order to:

Indicate readiness for the test

Provide indications of irregularities in input test data or test database because of normal or erroneous test procedures

Provide indications of irregularities in internal operations on test data because of normal or erroneous test procedures

Provide indications on the control, status, and results of the test as available from any auxiliary test software

5.x.5 Test Procedures

5.x.5.1 Procedures

Describe the step-by-step procedures to perform each test.

5.x.5.2 Setup

Describe or refer to standard operating procedures that describe the activities associated with setup of the computer facilities to conduct the test, including all routine machine activities.

5.x.5.3 Initialization

Itemize, in test sequence order, the activities associated with establishing the testing conditions, starting with the equipment in the setup condition. Initialization may include functions such as:

Readout of control function locations and critical data from indicators and storage locations for reference purposes

Queuing of data input values for the test

Queuing of test support software

Coordination of personnel actions associated with the test

5.x.5.4 Preparation

Describe, in sequence, any special operations such as:

Inspection of test conditions

Data dumps

Instructions for data recording

Modifications of the database

Interim evaluation of test results

5.x.5.5 Termination

Itemize, in test sequence order, the activities associated with termination of the test, such as:

Recording readouts and critical data from indicators for reference purposes

Termination of operation of time-sensitive test support software and test apparatus

Collection of system and operator records of test results
