1. Big bang model of testing – concentrates on each module …
2. Test plan captures different milestones.
3. A manual testing technique in which the author presents/tells the logic, code and program module to a group of persons/colleagues, for error detection but not correction – Walkthrough.
4. A technique in which a group of persons discuss to arrive at a creative and … solution.
5. Defect is deviation from standards.
6. Localisation testing: targets a particular locale/culture (not any locale/culture).
7. What are the testing activities involved in the design phase?
8. The testing that involves internal logic/code – white box.
9. Black box testing checks the requirements.
10. One more question on white box testing.
11. Root cause analysis is done in acceptance testing.
12. Testing team should have: domain knowledge, database knowledge, tools expertise – All.
13. Which is not done by a test manager – test execution.
14. Test case priority depends upon: customer priority, testability of functions, impacted functionality.
15. Non-statistical technique of testing: histogram, flowchart, scatter chart, run chart.
16. Generally, to check for memory overflow, stress testing can be used.
It is a statement of the overall approach of testing to meet the business and test objectives. It identifies the methods, techniques and tools to be used for testing.
A test strategy will typically cover the following aspects:
Definition of test objectives
Strategy to meet the specified objectives
Overall testing approach
Test environment
Test automation requirements
Metric plan
Risk identification, mitigation and contingency plan
Details of tools usage
Specific document templates used in testing
23. How to identify the test case?
Test case ID: it is a unique number given to a test case in order to be identified.
24. Test execution consists of the following activities to be performed:
Creation of test setup or testbed
Execution of test cases on the setup
Test methodology used
Collection of metrics
Defect tracking and reporting
Regression testing
25. There are two approaches in integration testing:
Top-down (stubs)
Bottom-up (drivers)
26. Integration testing:
Tests partial systems composed of integrated components.
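The two integration approaches above can be illustrated with a minimal sketch (the module and function names here are hypothetical, not from the source): in top-down integration a stub stands in for a not-yet-integrated lower module, while in bottom-up integration a driver takes the place of the missing higher-level caller.

```python
# Top-down integration: the high-level module (checkout) is real, while the
# lower-level tax module it depends on is replaced by a stub.
def tax_stub(amount):
    """Stub: returns a canned value instead of performing a real tax lookup."""
    return 5

def checkout(amount, tax_service):
    """Higher-level module under test; calls a lower-level tax service."""
    return amount + tax_service(amount)

# Bottom-up integration: the low-level module is real, and a simple
# driver takes the place of the not-yet-integrated higher-level caller.
def real_tax_service(amount):
    """Finished low-level module: 8% tax on whole-currency amounts."""
    return amount * 8 // 100

def driver():
    """Driver: feeds test inputs directly to the low-level module."""
    return real_tax_service(100)

print(checkout(100, tax_stub))   # top-down: stub stands in below
print(driver())                  # bottom-up: driver stands in above
```

Either way, a partial system is exercised before the full system exists, which is exactly what question 26 describes.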
1. Preparation for the review meeting is mandatory in which form of review?
a. Code Inspection  b. Code Walkthrough  c. Technical Review  d. None of the above  (Ans: a)
2. Author shall not act as __________ in Code Inspection.
a. Inspection Leader  b. Reader  c. Both a and b  d. None of the above  (Ans: c)
3. Code Inspection team can have minimum ______ no. of participants.
a. 3  b. 6  c. 2  d. 8  (Ans: a)
4. Who should not participate in the Code Inspection meeting?
a. Testing Team Leader  b. Program Manager  c. Both a and b  d. None of the above  (Ans: b)
5. What is true about Comparison Testing?
a. Run all versions in parallel with a real-time comparison of results.  b. Run different versions one by one with a real-time comparison of results.  c. Use same teams to develop independent versions of the software.  d. Test each version with different test data  (Ans: a)
6. Specification based testing is a
a. Testing technique  b. Test Strategy  c. Test case design  d. None of the above  (Ans: a)
7. Following is NOT a testing technique
a. Specification based  b. Code-based  c. Bread testing  d. Specific technique  (Ans: c)
8. Client/Server Testing is a
a. Specification based Test Technique  b. Code-based Test Technique  c. Thread testing Test Technique  d. Domain Specific test technique  (Ans: d)
9. Black box test design is also called as
a. Requirement-based test design  b. Functional test design  c. Both a and b  d. None of the above  (Ans: c)
10. The purpose of Requirement-based testing is only to
a. Find incorrect or missing functionality  b. Eradicate Performance errors  c. Find presence of Initialization and termination errors  d. All of the above  (Ans: d)
11. Requirement-based testing guarantees testing against
a. Positive testing  b. Negative testing  c. Use case testing  d. All of the above  (Ans: d)
12. Following is NOT a requirement-based test case design technique
a. Equivalent Partitioning  b. Branch Coverage  c. Boundary Value Analysis  d. Cause Effect Graphing  (Ans: b)
13. A possible programming element to determine equivalent input class
a. Range  b. Value  c. a or b  d. Only a
14. Dividing the input domain into classes of data to arrive at test cases is a
a. Code based technique  b. Requirement based technique  c. Both a and b  d. None of the above  (Ans: b)
15. When a programming element has valid value and valid length, to derive test cases we require ______ technique
a. Equivalence Class Range  b. Equivalence Class Value  c. Equivalence Class Set  d. Equivalence Class Boolean  (Ans: b)
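The range/value/set/boolean equivalence classes in questions 13–15 can be sketched concretely. A minimal illustration of range-based equivalence partitioning, with a hypothetical age field (the field and its bounds are assumptions, not from the source):

```python
# Hypothetical input field: an age that is valid in the range 18..60.
# Equivalence partitioning: the input domain splits into classes, and one
# representative value per class is enough for a test case.
def is_valid_age(age):
    return 18 <= age <= 60

# One representative per equivalence class (the values are arbitrary picks):
representatives = {
    "below range (invalid)": 10,
    "inside range (valid)": 30,
    "above range (invalid)": 70,
}

for label, value in representatives.items():
    print(label, "->", is_valid_age(value))
```

Three test cases then cover the whole domain, one per class, which is the point of question 14.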
16. Complementary technique of Boundary Value Analysis is
a. Equivalent Partitioning  b. Cause Effect Graphing  c. Branch coverage analysis  d. Cyclomatic Complexity  (Ans: a)
17. Possible tests for a range of values bounded by a and b are
a. (a-1), a, (a+1)  b. (b-1), b, (b+1)  c. a and b  d. a or b  (Ans: c)
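The six boundary tests listed in question 17 — (a-1), a, (a+1), (b-1), b, (b+1) — can be generated mechanically; a minimal sketch:

```python
def boundary_values(a, b):
    """Boundary value analysis for an input bounded below by a and above by b."""
    return [a - 1, a, a + 1, b - 1, b, b + 1]

# For the age field bounded by 18 and 60:
print(boundary_values(18, 60))  # -> [17, 18, 19, 59, 60, 61]
```

This complements equivalence partitioning (question 16): partitioning picks one value per class, while boundary value analysis probes the edges of each class where off-by-one defects cluster.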
18. Steps to arrive at test cases using Cause Effect Graphing Technique are
a. decision table rules are converted to test cases  b. A cause-effect graph is developed  c. Graph is converted to a decision table  d. Sequence of steps as a, b, c  (Ans: d)
19. The technique to test and arrive at test cases for Reliability of software is
a. Equivalent Partitioning  b. Cause Effect Graphing  c. Boundary Value analysis  d. Comparison tests  (Ans: d)
20. White box test case design is also called as
a. Code-based test case design  b. Functional Test case design  c. Both a and b  d. No equivalent name  (Ans: a)
21. To arrive at Code-based test design the tester is ideally expected to have
a. High level design  b. Low level design  c. Both a and b  d. Architecture of the Product design  (Ans: b)
22. The purpose of Code-based testing is only to
a. Find typographical errors  b. Find logical errors  c. Understand the programming objective to arrive at test cases  d. All of the above  (Ans: d)
23. Code-based testing guarantees testing against
a. Path coverage  b. Condition coverage  c. Syntactical and typographical errors  d. All of the above  (Ans: d)
24. A requirement to achieve Code-based testing is to have
a. Code walkthrough and Code inspection checklist  b. Traceability matrix  c. Both a and b  d. Only a  (Ans: c)
25. Following is not a code-based test case design technique
a. Statement coverage  b. Branch Coverage  c. Boundary Value Analysis  d. Condition coverage  (Ans: c)
26. Elementary statements of a program are executed at least once in
a. Statement coverage  b. Branch Coverage  c. Condition coverage  d. Cyclomatic Complexity  (Ans: a)
28. Every path/iteration in the control flow graph execution is covered in which of the following designs
a. Edge coverage  b. Path Coverage  c. Statement coverage  d. Branch coverage  (Ans: b)
29. A technique adopted for evaluating the complexity of programming logic is
a. Equivalent partitioning  b. Boundary value analysis  c. Comparison testing  d. Cyclomatic Complexity  (Ans: d)
30. For a Control flow graph, the Cyclomatic Complexity is defined to be
a. v(G) = e - n + 2  b. e = v(G) - n + 2  c. v(G) = e - n + 4  d. v(G) = n + 2 + e  (Ans: a)
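The formula in question 30, v(G) = e - n + 2 for a connected control-flow graph with e edges and n nodes, can be checked on a tiny graph (the graph encoding below is an illustrative assumption):

```python
def cyclomatic_complexity(edges, nodes):
    """v(G) = e - n + 2 for a single connected control-flow graph."""
    return len(edges) - len(nodes) + 2

# Control-flow graph of one if/else: entry -> cond; cond branches to
# 'then' and 'else'; both branches rejoin at exit.
nodes = ["entry", "cond", "then", "else", "exit"]
edges = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
         ("then", "exit"), ("else", "exit")]

print(cyclomatic_complexity(edges, nodes))  # -> 2, i.e. two independent paths
```

With 5 edges and 5 nodes, v(G) = 5 - 5 + 2 = 2, matching the two independent paths (the then-branch and the else-branch) that basis path testing would exercise.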
31. White Box testing and Black Box testing are
a. Phases of testing  b. Cycles of testing  c. Types of Testing  d. Methods of testing  (Ans: c)
32. One of the following appropriately represents White Box testing
a. Code level testing  b. Structural testing  c. Both a and b  d. None  (Ans: c)
33. White Box testing reports deviation from
a. Coding standards  b. Logic and Programming style  c. Complexity of Code  d. All of the above  (Ans: d)
34. Two categories of White box testing are
a. Basic Path testing and Cyclomatic Complexity  b. Cyclomatic Complexity and Equivalence partitioning  c. Basic Path Testing and Boundary Value  d. a and b  (Ans: a)
35. Edge coverage and path coverage are measures of
a. White Box Testing Adequacy criteria  b. Black Box Testing Adequacy criteria  c. Both a and b  d. None of the above  (Ans: a)
36. Boundary value analysis is included in
a. White Box Testing  b. Black Box Testing  c. Both a and b  d. None of the above  (Ans: b)
37. Answer to the tester's question, "Have I found all seeded errors?" is
a. White Box Testing  b. Black Box Testing  c. Both a and b  d. None of the above  (Ans: a)
38. Answer to tester's question "Have I applied all the inputs?" is
a. White Box Testing  b. Black Box Testing  c. Both a and b  d. None of the above  (Ans: b)
39. Test Preparation Checklist should contain one or more of the following
a. Correct number and version of the system under test  b. Test Input data readiness  c. Readiness of the test environment  d. All of the above  (Ans: d)
40. Testing Embedded System is to test/confirm
a. Only Software  b. Only hardware  c. Interfacing with hardware and software  d. All of the above  (Ans: d)
41. Unit testing with embedded system is achieved through
a. Code walkthrough  b. Using debugger's scripting language  c. Using test harness  d. Any of the above depending on complexity of the system  (Ans: d)
42. Which of the following is not a characteristic of Embedded System testing
a. Highly complex system  b. Non technical person can conduct tests  c. Needs Simulated environment  d. Difficult to test run time behavior  (Ans: b)
43. System testing of embedded software is to test
a. Actual Hardware functionality  b. Assume the Software is interacting as expected  c. One Instrument and one functionality at a time  d. all of the above  (Ans: d)
44. Characteristic of a Client/Server System is
a. Have multiple processors/Servers  b. Have one processor/Server  c. Universal Client with Same OS  d. all of the above  (Ans: a)
45. Key areas a tester needs to know while testing Internet and web applications are
a. Browser Compatibility  b. Functional correctness  c. Usability and Security  d. All of the above  (Ans: d)
46. Primary test concerns for Internet presence is to test applications'
a. Correctness  b. Usability  c. Browser Compatibility  d. All of the above  (Ans: d)
47. Following is a Type of Internet and Web Application Testing
a. Component and Integration testing  b. System testing  c. UAT  d. All of the above  (Ans: d)
48. Compatibility and Navigation testing are carried out during
a. Component testing  b. Integration testing  c. System testing  d. UAT  (Ans: b)
49. Which one of the following is an activity of Micro Process
a. Plan, Design, Develop, Review and Execute Test Cases  b. Plan and Execute Test cases  c. Review test cases  d. Execute test cases  (Ans: a)
50. Best method to conduct System level testing is
a. Requirement based testing  b. Integration testing  c. Use case based testing  d. Code based testing  (Ans: c)
51. Which attributes are used whenever a Bug/Defect is detected
a. Error & fault  b. Defect & Bug  c. Severity & Priority  d. Sporadic & Reproducible  (Ans: c)
52. Which of the following is relevant if a BUG is in its deferred state
a. Developer will look into it in next release.  b. Tester has identified a wrong bug.  c. Developer will not work on this bug.  d. Bug will be resolved immediately  (Ans: a)
53. A Bug is a Sporadic bug if it
a. Can be reproduced accurately.  b. Cannot be reproduced accurately.  c. is not qualified.  d. it is Fixed  (Ans: b)
54. Software test plan should consist of
a. Software Test Environment  b. Test identification  c. Test Schedules  d. All the above  (Ans: d)
55. System Overview briefly states the
a. Purpose and nature of the system  b. The software to which this document applies  c. Types or classes of tests that will be performed  d. a & b  (Ans: d)
56. System Overview summarizes history of
a. system development,  b. operation and maintenance.  c. None of the above  d. Both a & b  (Ans: d)
57. __________ shall summarize the purpose and contents of the Test Plan document and shall describe any security or privacy considerations associated with its use.
a. System overview  b. Referenced Documents  c. Document Overview  d. None of the above  (Ans: c)
58. …
d. Only a, c
59. Procedures for manual, automatic and semi-automatic techniques for recording test results are defined in
a. Test Levels  b. Test Classes  c. Test Progression  d. Data Recording, Reduction, Analysis  (Ans: d)
60. A listing or chart depicting the sites at which the testing will be scheduled and the time frames during which the testing will be conducted is described in
a. Test Environment  b. Test Scheduling  c. Test Identification  d. Test Levels  (Ans: b)
61. Code inspection and Code walkthrough fall under which type of testing
a. Dynamic testing  b. Static testing  c. System Testing  d. None of the Above  (Ans: b)
62. Reader in code inspection is the person who
a. Has generated the code  b. Runs through the code line by line  c. Suggests solution to the anomalies  d. Has sound technical knowledge  (Ans: b)
63. Person who runs through the code in Code Walkthrough is
a. Walkthrough leader  b. Reader  c. Team members  d. Moderator
64. Check list is used in code inspection to find
a. Faults  b. Gaze a developer  c. Deviation from coding standards  d. Both a & c  (Ans: d)
65. Work products that undergo reviews are
a. System build  b. Release notes  c. Source code  d. All of the above  (Ans: d)
66. A technical review is
a. Less formal  b. Fully Formal  c. Can be done without a sound technical person  d. Is helpful only for automation testing  (Ans: a)
67. During code inspection the
a. check list is required  b. Manager should participate.  c. Code is executed to find faults  d. Meeting could extend for a full day  (Ans: a)
68. As per GUI standards, Title names of the menu bar must begin with
a. Upper case letter followed by lower case letters  b. Underline the character that represents the hotkey  c. Only upper case  d. a & b  (Ans: d)
69. An author in code inspection cannot play the role of
a. Reader  b. Inspector  c. Recorder  d. All of the above  (Ans: d)
70. Requirements are collected by
a. System Analyst  b. Quality manager  c. Subject Matter Expert  d. a & c  (Ans: d)
71. Complete preparation on code prior to review is mandatory for
a. Code Inspection  b. Code Walk Through  c. Technical Review  d. All of the above  (Ans: a)
72. A good GUI
a. Must be complex  b. Grouped unrelated components together  c. Balance the display of items on screen  d. None of the above  (Ans: c)
73. Which of the following is the most appropriate feature of Manual testing
a. Easy to repeat  b. Can run unattended  c. Always reliable  d. Time consuming  (Ans: d)
74. Which of the following is the most appropriate feature of Automation testing
a. Reliable  b. Unable to simulate all scenarios  c. Cannot run unattended  d. Difficult to report  (Ans: a)
75. Software test automation refers to activities and efforts that intend to automate
a. Process  b. Tasks  c. Engineering Tasks  d. Both a & b  (Ans: c)
76. Select the best option for Manual Testing from the following
a. Always reliable  b. Not costly  c. Certain scenarios practically possible  d. None of the above  (Ans: d)
77. Select the best option for Automation Testing from the following
a. Not able to generate scenarios  b. Not reliable  c. Not Repeatable  d. Good Return on Investment if properly planned.  (Ans: d)
78. The main objective of the Automation testing process
a. To completely eliminate the manual testing process  b. Not to achieve better test coverage  c. To speed up the software testing process  d. None of the above  (Ans: c)
79. First phase in the Automation testing process life cycle is
a. Design Test Automation strategies and solutions  b. Select and evaluate available Test automation tools  c. Develop and implement test automation solutions  d. Plan for Software test automation.  (Ans: d)
80. White box testing tools test
a. Functionality of the software  b. Performance of the software  c. User acceptance criteria  d. Coding standards and memory leaks  (Ans: d)
81. Which of the following are White box testing tools
a. WinRunner  b. Astra QuickTest  c. QA Run  d. Rational Purify and Rational Quantify  (Ans: d)
82. Functionality Testing tools verify the
a. Code coverage  b. Parameterization  c. Code complexity  d. None of the above  (Ans: d)
83. Batch Testing in the Functionality testing tools is used for
a. Execute single test  b. Used to parameterize  c. Used to execute a group of test scripts.  d. Used to handle errors  (Ans: c)
84. Verification points are also called
a. Parameterization  b. Error Handling  c. Check points  d. None of the above  (Ans: c)
85. Main objective of Web Testing is one of the following
a. Test for the only functionality  b. Test for the broken links  c. Test for the memory leaks  d. Test for the code coverage  (Ans: b)
86. Which of the following is a Functionality test automation tool?
a. LoadRunner  b. Rational Purify  c. Rational Robot  d. None of the above  (Ans: c)
87. Test management tools address one of the following below
a. Functionality  b. Performance  c. Defects tracking and Requirements gathering  d. None of the above  (Ans: c)
88. Which of the following is a test management tool
a. WinRunner  b. QuickTest Professional  c. QA Run  d. Rational Test Manager  (Ans: d)
89. Response time and capacity both are
a. Directly proportional to each other  b. Equal to each other  c. Inversely proportional to each other  d. None of the above  (Ans: c)
90. Use cases mainly consist of
a. Transitions  b. Actors  c. Events  d. None of the above  (Ans: b)
91. Expand the word API
a. Automated programming interface  b. Application programming interface  c. Automated performance interface  d. None of the above  (Ans: b)
92. Expand the word IDE
a. International Development Environment  b. Integrated Development Environment  c. International Delivery Environment  d. None of the above  (Ans: b)
93. Performance of web sites is tested using which technique
a. Manual testing  b. Automation testing  c. Unit Testing  d. Integration Testing  (Ans: b)
94. Web based applications are developed under which ____________ language
a. Structured programming language  b. Object Oriented Programming language  c. Scripting languages  d. None of the above  (Ans: b)
95. Scalability comes under which category of testing
a. Functional  b. Non-Functional  c. Regression  d. Unit Testing  (Ans: b)
96. "Verify system from user perspective" – this statement is appropriate to which type of Testing
a. Field Test  b. Regression Test  c. Performance Test  d. Acceptance Test  (Ans: d)
97. Which of the ones mentioned below can be the correct definition for Field Test
a. Verify that the systems work in the actual user environment.  b. Verification from the user's perspective  c. Both a & c  d. The purpose is to verify the system meets the performance requirements.  (Ans: a)
98. "Pilot system should work during a problem" is an expectation made in
a. Interface Test  b. Performance Test  c. Acceptance Test  d. Field Test  (Ans: d)
99. Scope of the Project is mentioned in
a. Test Environment  b. Traceability Matrix  c. Test Plan  d. Test Schedule  (Ans: c)
100. Identification of numbers, titles, abbreviations, version numbers and release numbers is mentioned in
a. Test schedule  b. Scope  c. Test Environment  d. Test Plan  (Ans: b)
101. Which document covers the Unit requirements in all applicable Software Requirements Specifications
a. Test Case  b. Test Identification  c. Traceability Matrix  d. Test Design  (Ans: c)
102. Preparation, review and approval of the Software Test Report (STR) is done during
a. Test Environment  b. Test Scheduling  c. Test Identification  d. Test Cases Design  (Ans: b)
108. Automated testing tools generate ….. scripts
a. Non-Reusable scripts  b. Re-usable scripts  c. Reliable scripts  d. Both b & c  (Ans: d)
109. Capacity under the performance testing deals with
a. Amount of time  b. Concurrent users  c. Load  d. Performance  (Ans: b)
110. Throughput under the performance deals with
a. Amount of time  b. Concurrent users  c. Load  d. Performance  (Ans: a)
111. Response time under the performance testing deals with
a. Time taken for the server to respond to the client request.  b. Time taken for the client to respond to the server.  c. All of the above  d. None of the above  (Ans: a)
112. Automation Framework should be ______ of Tools
a. Dependent  b. Independent  c. Both of the above  d. None of the above  (Ans: b)
113. WinRunner uses which type of scripting language
a. Perl  b. TCL  c. VB  d. TSL  (Ans: d)
114. Who will be contacted/used as Subject Matter Experts
a. Domain Expert  b. Project manager  (Ans: a)
119. Software Configuration includes
a. Software Requirements Specification, a Design Specification and source code.  b. Software Requirements Specification, a Test Plan and source code.  c. Test Plan, a Design Specification and source code.  d. Software Requirements Specification, a Design Specification and Test Plan  (Ans: a)
120. A test configuration includes
a. Test Plan and Procedures, Design Specification and testing tools.  b. Test Plan and Procedures, test cases and testing tools  c. Test Plan and Procedures, test cases and source code  d. Test Plan and Procedures & test cases only  (Ans: b)
121. It is difficult to schedule the Test Flow because
a. It is difficult to predict the time required to analyze the requirements  b. It is difficult to predict the time to debug the code  c. It is difficult to predict the time required for the test plan  d. It is difficult to predict the time required to report the Bugs  (Ans: b)
122. The details of how the testing team will evaluate the work products, systems and testing activities and results are mentioned in
a. Test Exit Criteria  b. Test Cases  c. Test Strategy  d. Test Environment  (Ans: c)
123. The approach to all testing types and phases and the activities for which they are responsible are mentioned separately in
a. Test Plan  b. Test Strategy  c. Traceability Matrix  d. Test Cases  (Ans: b)
124. Test Strategy for Maintenance includes a greater focus on
a. Functional Testing  b. Regression Testing  c. Performance Testing  d. Field Testing  (Ans: b)
125. The main inputs for Test Strategy are
a. Priority & criticality  b. Test Participation  c. Test Environments  d. Verification of Un-testable Requirements.  (Ans: a)
126. Test Staffing, Testing of COTS, Poor Requirements are some of the examples for
a. Test strategy  b. Test related issues  c. Test Plan  d. Test Design  (Ans: b)
127. Specification based testing is also called
a. White box testing  b. Black box testing  c. Code Based Testing  d. Static Testing  (Ans: b)
128. Which are the ones clubbed under Testing Techniques
a. Usage based Testing  b. Test issues  c. Test Strategy  d. Test Identification  (Ans: a)
129. …
a. Test Identification  b. Study Business Requirements, Arrive at High Level  c. Decide Automation Requirements  d. Requirements of Test Approach, Identify the features to be tested, Arrive at High Level  (Ans: d)
140. Study Business Requirements, Arrive at Environmental Requirements are the __________ followed in Test Design Specification
a. Approach  b. Contents  c. Purpose  d. Coverage  (Ans: a)
141. Code based testing can derive the test cases to ensure
a. All the bugs have been detected  b. All independent paths are exercised at least once.  c. All the requirements have been met  d. All the standards have been met  (Ans: b)
142. Approach of the project is mentioned under
a. Test Scope  b. Test Environment  c. Test Plan
At this step, the designed test automation solutions are developed and tested as quality tools and facilities. The key in this step is to make sure that the developed tools are reliable and reusable, with good documentation.
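A reusable, documented automation facility of the kind described above might look like a small harness that any test script can import; the names below are illustrative, not a specific tool's API:

```python
import time

def run_test(name, test_func):
    """Reusable harness: run one test callable, record outcome and duration."""
    start = time.time()
    try:
        test_func()
        status = "PASS"
    except AssertionError:
        status = "FAIL"
    return {"name": name, "status": status, "seconds": time.time() - start}

def sample_check():
    """A trivial test body; real scripts would exercise the system under test."""
    assert 2 + 2 == 4

print(run_test("sample_check", sample_check)["status"])  # -> PASS
```

Because the harness owns the result recording, individual test scripts stay short and uniform, which is what makes the facility reusable across the automation suite.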
4. Actions/keywords are generated in
a. Keyword driven  b. Record and playback  c. …
9. Who builds the model to capture required behaviour and logical variations of data and …