Testing Strategies and Techniques
System Implementation Testing
Session 2
By: Rifiana Arief, SKom, MMSI
Outline
• What Testing Is
• Testing in the Development Process
• Types of Testing and Definitions
• Verification & Validation
• Purpose and Goal of Testing
• Who Tests Software
• Testing Techniques
• Testing Steps
• Testing Strategy
What's Wrong?
[Flowchart: A = 0; loop: A = A + 0.1; test "A = 2?"; if T, print A; if F, loop again.]
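The flaw in the flowchart above can be demonstrated with a short sketch (Python here; the variable name follows the flowchart). Because 0.1 has no exact binary floating-point representation, the accumulated sum never equals 2 exactly, so the exit test "A = 2?" never becomes true and the loop never terminates; the sketch caps the iterations to show this safely.

```python
a = 0.0
hit_exactly_two = False
for _ in range(100):       # the real flowchart would loop forever
    a += 0.1
    if a == 2.0:           # the flowchart's exact-equality exit test
        hit_exactly_two = True
        break

print(hit_exactly_two)     # False: after 20 additions a is 2.0000000000000004
print(a)                   # the sum drifts past 2 and keeps growing
```

The fix in real code is to loop a fixed number of times, or to compare against a tolerance, never to test floats for exact equality.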
What testing is
1) Common definition: "Testing is to execute a program with the purpose of finding defects."
2) Wider definition: "Testing is a technical investigation of a product, done to expose quality-related information."
Testing in Development Process
• Testing activities take place in all parts of software development
• From requirements elicitation to final shipment
• Testing is part of the development process
• Testing is part of the company's business process
Testing in Development Process
• Testing during implementation: tests that verify the software behaves as the designer intended.
• Testing after implementation: tests for conformance with requirements, reliability, and other non-functional requirements.
Most Common Software Problems
• Incorrect calculations
• Incorrect or ineffective data edits
• Incorrect matching and merging of data
• Data searches that yield incorrect results
• Incorrect processing of data relationships
• Incorrect coding/implementation of business rules
• Inadequate software performance
• Confusing or misleading data
• Poor software usability for end users
• Obsolete software
• Inconsistent processing
• Unreliable results or performance
• Inadequate support of business needs
• Incorrect or inadequate interfaces with other systems
• Inadequate performance and security controls
• Incorrect file handling
Types of testing and definitions
• Validation and Verification
– Validation
• correctness or suitability
• vertical experts confirm master results
– Verification
• confirm the software operates as required
• double-check that results match those previously validated; if not, re-validate them
[Figure: Rational Unified Process (RUP) grid — core workflows (Requirements, Analysis, Design, Development, Testing, Maintenance) plotted against phases (Inception, Elaboration, Construction, Transition).]
Testing can take place as part of each phase of development.
[Figure: the same RUP grid of core workflows against phases.]
Testing can take place as part of each core workflow involved in the development organization.
Verification & Validation
• Software V & V is defined as a systems engineering methodology to ensure that quality is built into the software during development.
• Software V & V is complementary to and supportive of quality assurance, project management, systems engineering, and development.
Verification & Validation versus Debugging
• Verification & Validation: a process that establishes the existence of defects in a system
• Debugging: a process that locates and corrects those defects
Verification versus Validation
• Software Verification Process: determines whether the software products of an activity fulfill the requirements or conditions imposed on them in previous activities.
• Software Validation Process: determines whether the requirements and the final, as-built system or software product fulfill its specific intended use.
Verification versus Validation
• Verification: "Are we building the system in the right way?" The system should conform to the specification: it does what you specified it should do.
• Validation: "Are we building the right system?" The system should do what the users really require.
Verification versus Validation
• Sometimes one of these words is used to mean both verification and validation:
– "verification" meaning verification and validation, or
– "validation" meaning verification and validation
The V & V Objectives
• There are two principal objectives:
– To discover and rectify defects in a system
– To assess whether or not the system is usable in an operational situation
The V & V Objectives
• Software V & V determines that the software performs its intended functions correctly.
• It ensures that the software performs no unintended functions.
• It measures and assesses the quality and reliability of the software.
The V & V Objectives
• As a software engineering discipline, software V & V also assesses, analyzes, and tests:
– how the software interfaces with system elements
– how it influences the performance of, and reacts to stimuli from, system elements
The V & V Process
• V & V is a whole life-cycle process.
• V & V should be applied at each stage in the software process.
[Figure: static and dynamic V&V. Static verification checks the correspondence between a program and its specification across the requirements specification, high-level design, formal specification, detailed design, and code ("Are we building the system in the right way?"). Dynamic validation is execution-based testing of the prototype and program ("Are we building the right system?").]
Static and Dynamic V&V
• Static verification is concerned with analysis of the static system representation to discover problems.
– Analysis of all documents produced that represent the system
– Can be applied during all stages of the software process
[Figure: V & V splits into static verification — inspecting artifacts to discover problems — and dynamic validation ("testing") — executing systems and observing product behaviour.]
Static and dynamic V & V complement each other.
[Figure: static V & V techniques are review, inspection, and walkthrough; dynamic techniques ("testing") are unit test, integration test, system test, and acceptance test.]

Static verification
• Review (desk checking)
– Code reading done by a single person
– Informal
– Less effective than a walkthrough or inspection
• Walkthrough
– The programmer(s) "walk through"/"execute" the code while invited participants ask questions and make comments
– Relatively informal
• Inspection
– Usually a checklist of common errors is used to compare the code against
Purpose and goal of testing are situation dependent
1. Find defects
2. Maximize bug count
3. Block premature product releases
4. Help managers make ship/no-ship decisions
5. Assess quality
6. Minimize technical support costs
7. Conform to regulations
8. Minimize safety-related lawsuit risk
9. Assess conformance to specification
10. Find safe scenarios for use of the product (find ways to get it to work, in spite of the bugs)
11. Verify correctness of the product
12. Assure quality
13. Testing cannot show the absence of errors, only their presence.
14. We test a program to find the existence of an error.
15. If we find no errors, then we have been unsuccessful.
16. If an error is found, debugging should occur.
Unsuitable objectives for testing
• Showing that a system is without errors
• Showing that a system does what it is supposed to do
Testing Levels
•Unit testing
•Integration testing
•System testing
•Acceptance testing
Unit testing
• The most ‘micro’ scale of testing.
• Tests done on particular functions or code modules.
• Requires knowledge of the internal program design and code.
• Done by Programmers (not by testers).
Srihari Techsoft
Unit testing
Objectives: To test the function of a program or unit of code, such as a program or module; to test internal logic; to verify internal design; to test path & condition coverage; to test exception conditions & error handling
When: After modules are coded
Input: Internal application design; master test plan; unit test plan
Output: Unit test report
Who: Developer
Methods: White box testing techniques; test coverage techniques
Tools: Debug; re-structure; code analyzers; path/statement coverage tools
Education: Testing methodology; effective use of tools
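As a concrete sketch of the unit-testing level described above (the unit `classify_triangle` and its test cases are hypothetical, chosen so that the tests exercise the internal logic, each path, and the error-handling condition):

```python
import unittest

def classify_triangle(a, b, c):
    """Hypothetical unit under test: classify a triangle by side lengths."""
    if a <= 0 or b <= 0 or c <= 0 or a + b <= c or a + c <= b or b + c <= a:
        raise ValueError("not a valid triangle")
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

class ClassifyTriangleTest(unittest.TestCase):
    # One test per path, plus the exception path, matching the
    # unit-testing objectives listed above.
    def test_equilateral(self):
        self.assertEqual(classify_triangle(2, 2, 2), "equilateral")

    def test_isosceles(self):
        self.assertEqual(classify_triangle(2, 2, 3), "isosceles")

    def test_scalene(self):
        self.assertEqual(classify_triangle(3, 4, 5), "scalene")

    def test_invalid_input_raises(self):
        with self.assertRaises(ValueError):
            classify_triangle(1, 2, 3)   # degenerate: 1 + 2 == 3

# Run the suite and keep the result object (the unit test report).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ClassifyTriangleTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The runner's summary plays the role of the "Unit Test Report" output row above, in miniature.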
Incremental integration testing
• Continuous testing of an application as new functionality is added.
• The application's functional parts must be independent enough to work separately before development is complete.
• Done by programmers or testers.
Integration Testing
– Testing of combined parts of an application to determine their functional correctness.
– ‘Parts’ can be
• code modules
• individual applications
• client/server applications on a network.
Types of Integration Testing
»Big Bang testing
»Top Down Integration testing
»Bottom Up Integration testing
Integration testing
Objectives: To technically verify proper interfacing between modules, and within sub-systems
When: After modules are unit tested
Input: Internal & external application design; master test plan; integration test plan
Output: Integration test report
Who: Developers
Methods: White and black box techniques; problem/configuration management
Tools: Debug; re-structure; code analyzers
Education: Testing methodology; effective use of tools
System Testing
Objectives: To verify that the system components perform control functions; to perform inter-system tests; to demonstrate that the system performs both functionally and operationally as specified; to perform appropriate types of tests relating to transaction flow, installation, reliability, regression, etc.
When: After integration testing
Input: Detailed requirements & external application design; master test plan; system test plan
Output: System test report
Who: Development team and users
Methods: Problem/configuration management
Tools: Recommended set of tools
Education: Testing methodology; effective use of tools
Systems Integration Testing
Objectives: To test the co-existence of products and applications that must perform together in a production-like operational environment (hardware, software, network); to ensure that the system functions together with all the components of its environment as a total system; to ensure that system releases can be deployed in the current environment
When: After system testing; often performed outside the project life-cycle
Input: Test strategy; master test plan; systems integration test plan
Output: Systems integration test report
Who: System testers
Methods: White and black box techniques; problem/configuration management
Tools: Recommended set of tools
Education: Testing methodology; effective use of tools
Acceptance Testing
Objectives: To verify that the system meets the user requirements
When: After system testing
Input: Business needs & detailed requirements; master test plan; user acceptance test plan
Output: User acceptance test report
Who: Users / end users
Methods: Black box techniques; problem/configuration management
Tools: Compare, keystroke capture & playback, regression testing
Education: Testing methodology; effective use of tools; product knowledge; business release strategy
Testing Technique
• Two views on software testing:
– White box testing
– Black box testing
Testing Technique
White box testing tests what the program does. Test sets are developed using knowledge of the algorithms, data structures, and control statements.
Testing Technique
Black box testing tests what the program is supposed to do. Test sets are developed and evaluated solely from the specification; there is no knowledge of the algorithms, data structures, or control statements.
White-box testing
Also known as:
• Structure-based (structural) testing
• Code-based testing
• Glass box testing
• Clear box testing
• Logic-driven testing
White-box testing
• White-box (or structural) testing:
– Uses knowledge of the program to derive test cases that provide more complete coverage
– Problem: what criteria to use?
White-box testing
... our goal is to ensure that all statements, decisions, conditions, and paths have been executed at least once ...
White-box testing
• The system is looked upon as an open box.
• Test cases are based on the internal structure of the system (the code).
• Exercising all paths of the code is theoretically desirable, but it is an impossible and, on its own, insufficient goal.
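A minimal sketch of white-box test derivation, assuming a hypothetical unit `grade` with two decisions: the test set below is chosen by reading the code's structure so that every branch executes at least once.

```python
def grade(score):
    """Hypothetical unit under test with two decisions (four branches)."""
    if score < 0 or score > 100:   # decision 1: input validity
        raise ValueError("score out of range")
    if score >= 60:                # decision 2: pass threshold
        return "pass"
    return "fail"

# Branch-coverage test set, derived from the code, not the spec:
assert grade(75) == "pass"         # decision 1 false, decision 2 true
assert grade(30) == "fail"         # decision 1 false, decision 2 false
try:
    grade(101)                     # decision 1 true (error path)
    raise AssertionError("expected ValueError")
except ValueError:
    pass
print("all branches exercised")
```

Three inputs here cover every branch, but not every *path* combination or input value, which is why full path coverage is unattainable in general.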
Black-box testing
• Also known as:
– Functional testing, because it tests all the functions
– Behavioral testing, because the program is tested against its expected behavior (described by the requirements and/or design)
Black-box testing
• The software is viewed as a black box that transforms inputs into outputs based on the specification of what the software is supposed to do.
[Figure: requirements, input, and events feed the black box; output comes out.]
Black-box testing
• Checks the conformity of the tested software against its established behaviour, and
• Detects errors generated by faults
– A software fault is a part of the software that does not conform to its definition in the development documents.
Black-box testing
– Functional tests examine the observable behavior of software as evidenced by its outputs without reference to internal functions.
– If the program consistently provides the desired features with acceptable performance, then specific source code features are irrelevant.
Black-box testing
• The software should be considered only from the standpoint of its input data and output data.
• Knowledge of its internal structure should not be used.
• It is very often impossible to test all input data, so a subset of the possible inputs must be selected.
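One common way to select that subset is equivalence partitioning with boundary values. A sketch, assuming a hypothetical specification "accept a score from 0 to 100; 60 or above passes" (the function `is_pass` stands in for the tested software and is never inspected, only its inputs and outputs):

```python
def is_pass(score):
    """Stand-in for the software under test; treated as a black box."""
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return score >= 60

# Equivalence classes derived from the specification alone,
# one representative per class plus its boundary values:
valid_fail = [0, 30, 59]    # valid scores below the pass threshold
valid_pass = [60, 85, 100]  # valid scores at or above the threshold
invalid = [-1, 101]         # inputs outside the specified range

assert all(not is_pass(s) for s in valid_fail)
assert all(is_pass(s) for s in valid_pass)
for s in invalid:
    try:
        is_pass(s)
        raise AssertionError("expected ValueError for %r" % s)
    except ValueError:
        pass
print("specification-derived test set passed")
```

Eight inputs stand in for the whole input space because, by the partitioning assumption, any member of a class should behave like its representative.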
Testing Steps
[Figure: the testing-steps pipeline. Unit-tested code units feed an integration test (guided by the design specifications) to produce integrated modules; a function test (against the system functional requirements) yields a functioning system; a performance test (against other software requirements) yields verified, validated software; an acceptance test (against the customer requirements specification) yields an accepted system; and an installation test (in the user environment) puts the system in use.]
Testing Steps: Acceptance Test
• The type of acceptance testing performed by the customer at the developer's site is usually called alpha testing.
[Figure: the customer tests the software at the developer's site.]
Testing Steps: Acceptance Test
• Beta testing is a type of acceptance testing involving a software product to be marketed for use by many users.
• Selected users receive the system first and report problems back to the developer.
• Users enjoy it: they usually receive large discounts and feel important.
• Developers like it: it exposes their product to real use and often reveals unanticipated errors.
[Figure: the customer tests the software at the customer's site.]
Testing Strategy
[Figure: testing strategies. Non-incremental: big bang. Incremental: top-down, bottom-up, and the sandwich compromise.]
Testing Strategy
• Big bang integration: all components are integrated together at once.
• Bottom-up integration: starts from the lower levels; no test stubs are necessary (but test drivers are).
• Top-down integration: starts from the higher levels; no test drivers are needed (but test stubs are).
• Sandwich testing: a combination of bottom-up and top-down, reducing the need for test stubs and drivers.
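The stubs and drivers mentioned above can be sketched as follows (all module and function names are hypothetical): top-down integration replaces a not-yet-integrated lower-level module with a stub that returns canned answers, while bottom-up integration exercises a finished lower-level module through a driver that calls it.

```python
# --- Lower-level module (hypothetical) --------------------------------
def tax_rate(region):
    rates = {"EU": 0.20, "US": 0.07}
    return rates[region]

# --- Upper-level module -----------------------------------------------
def invoice_total(net, region, rate_fn=tax_rate):
    # rate_fn is injected so an integration test can swap in a stub
    return round(net * (1 + rate_fn(region)), 2)

# Top-down: test invoice_total before tax_rate is integrated,
# using a STUB that returns a canned value with no real logic.
def tax_rate_stub(region):
    return 0.10

assert invoice_total(100.0, "EU", rate_fn=tax_rate_stub) == 110.0

# Bottom-up: test tax_rate before any caller exists,
# using a DRIVER that invokes it and checks the results.
def tax_rate_driver():
    assert tax_rate("EU") == 0.20
    assert tax_rate("US") == 0.07
    return True

assert tax_rate_driver()
```

Each strategy trades one kind of scaffolding for the other, which is the point of the sandwich compromise.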
2. What should you test?
• Quality and quality risks

Software Quality
As explained in the first session, the goal of software testing is to obtain software whose quality conforms to the design that has been made (quality of conformance). In other words, software testing is a way to determine the quality of a software product.
Defining Quality
• "Features [that] are decisive as to product performance and as to 'product satisfaction' ..."
• "Freedom from deficiencies ... [that] result in complaints, claims, returns, rework and other damage."
Defining Quality (cont.)
• Users and customers become the arbiters of quality when they experience product dissatisfaction, and then make complaints, return merchandise, or call technical support.
• Testing looks for situations in which a product fails to meet customers' or users' reasonable expectations in specific areas.
Software Quality Standards
The international standard used to evaluate software quality is ISO 9126, which defines the characteristics of quality software.
Software Quality Characteristics
The standard is divided into four parts which address, respectively, the following subjects: quality model; external metrics; internal metrics; and quality in use metrics.
The quality model established in the first part of the standard, ISO 9126-1, classifies software quality in a structured set of characteristics and sub-characteristics as follows:
Functionality - A set of attributes that bear on the existence of a set of functions and their specified properties. The functions are those that satisfy stated or implied needs.
– Suitability
– Accuracy
– Interoperability
– Compliance
– Security
• Reliability - A set of attributes that bear on the capability of software to maintain its level of performance under stated conditions for a stated period of time.
– Maturity
– Recoverability
– Fault tolerance
• Usability - A set of attributes that bear on the effort needed for use, and on the individual assessment of such use, by a stated or implied set of users.
– Learnability
– Understandability
– Operability
• Efficiency - A set of attributes that bear on the relationship between the level of performance of the software and the amount of resources used, under stated conditions.
– Time behaviour
– Resource behaviour
• Maintainability - A set of attributes that bear on the effort needed to make specified modifications.
– Stability
– Analyzability
– Changeability
– Testability
• Portability - A set of attributes that bear on the ability of software to be transferred from one environment to another.
– Installability
– Replaceability
– Adaptability
Who Tests Software?
• User
• Developer
• Independent tester
Who Tests Software?
• User
– Tests while using the software, though not deliberately
– An indirect test
Who Tests Software?
• Software developer
– Understands the system
– Tests gently
– Driven by delivery
• Independent tester
– Does not understand the system
– Will try to break it
– Driven by quality