
Performance Testing Web 2.0 Applications—in an Agile World

May 22, 2015


Agile methodologies bring new complexities and challenges to traditional performance engineering practices, especially with Web 2.0 technologies that implement more and more functionality on the client side. Mohit Verma presents a Scrum-based performance testing lifecycle for Web 2.0 applications. Mohit explains when performance engineers need to participate in the project, discusses how important it is for the performance engineer to understand the technical architecture, and explores the importance of testing early to identify design issues. Find out how to create the non-functional requirements that are critical for building accurate and robust performance test scenarios. Learn how to implement practices for continuous collaboration between test engineers and developers to help identify performance bottlenecks early. Learn about the tools available today to help you address the testing and tuning of your Web 2.0 applications. Leave with a new appreciation and new approaches to ensure that your Web 2.0-based applications are ready for prime time from day one.
Transcript
Page 1: Performance Testing Web 2.0 Applications—in an Agile World

W17 Performance Testing

5/1/2013 3:00:00 PM

Performance Testing Web 2.0

Applications - in an Agile World

Presented by:

Mohit Verma

Tufts Health Plan

Brought to you by:

340 Corporate Way, Suite 300, Orange Park, FL 32073

888-268-8770 ∙ 904-278-0524 ∙ [email protected] ∙ www.sqe.com

Page 2: Performance Testing Web 2.0 Applications—in an Agile World

Mohit Verma

At Tufts Health Plan, Mohit Verma is the lead software performance architect, in charge of performance testing of enterprise applications. The environment at Tufts includes web, mobile, EDI, SOA, legacy, and proprietary applications. Mohit has been working with open source performance and functional test tools for the past fourteen years. He has implemented performance test solutions for numerous complex, high-end projects. Previously, Mohit was a performance engineer at John Hancock Financial Services, leading the performance testing of enterprise applications across all business units. Mohit is a member of the IEEE, the Computer Measurement Group, and the Workshop on Performance and Reliability.

Page 3: Performance Testing Web 2.0 Applications—in an Agile World


Performance Engineering a Web 2.0 Application in an Agile World: STAREAST 2013

Mohit Verma

Tufts Health Plan

Lead Performance Architect


Agenda

Web 1.0 vs Web 2.0

Project Description

Technical Environment

Performance Test Strategy

Test Results

Application Improvements/Tuning

Agile vs Waterfall

Learnings/Review

Page 4: Performance Testing Web 2.0 Applications—in an Agile World


Web 1.0 vs Web 2.0

Web 1.0 – typically static, server-driven content; the user has little control for optimizing content to individual needs

Web 2.0 – user-driven content; the user has many tools to drive content through AJAX/JavaScript, Dojo, YUI toolkits, Trinidad (MyFaces), etc.

Blogs, wikis

SecondLife, Facebook, Twitter (social networking)


Project Description

Since we are a Managed Care Provider (HMO), our applications typically support Health Care Providers (hospitals, etc.), Employers, and our Members. This project supported the Customer Service Reps and had 5 sprints:

- Sprint 1 - Member & Group functionality – 50 users

- Sprint 2 - Provider access functionality – 50 users

- Sprint 3 - Benefit functionality & updates to Member and Provider – 150 users

- Sprint 4 - Referrals/Claims – no increase in users

- Sprint 5 - Ad hoc SRs – no increase in users

Page 5: Performance Testing Web 2.0 Applications—in an Agile World


Technical Environment

Web Application Servers: WebLogic, WebSphere, JBoss, AquaLogic infrastructure

Security: CA SiteMinder

Web Server: Apache

Middleware: Tibco BusinessWorks and EDIFECS

Siebel, Lawson, Actuate, Cognos

Midrange: HP Itanium

Hardware: Linux, HP Itanium, and Windows environments

Databases: Oracle and SQL Server

EDI interfaces: HIPAA 5010

Performance Test Tools: LoadRunner, custom-built scripts (VB), open source


Performance Testing Strategy

Attend daily Scrum meetings to identify performance-sensitive transactions early

Pair with the Lead Application Architect

Achieve a 5-second response time 90% of the time for all application screens (SLA)

Proactively test the new functionality via web service calls (Member, Benefits, and Claims) prior to design completion during the sprints (a sketch of this kind of test follows this list)

Test the Member, Provider, Referral, Benefits, and Claims screens prior to deployment (sprint-based)

Conduct testing in a production-like environment/configuration if possible

Execute stress and endurance tests for the web profile (ALL WEB APPS) to identify the breaking point and confirm stability of the web environment under load

Execute and monitor sociability tests (Web + EDI)
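The slides do not include the LoadRunner or VB scripts themselves, but the idea of exercising a new web service with concurrent callers before the screens exist can be sketched in plain Java. The 200 callers mirror the POC tests described later; the endpoint URL and query parameter are illustrative placeholders, not the actual Tufts Health Plan service.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    // Minimal sketch: hit a (hypothetical) Benefits web service with concurrent
    // callers and record per-call response times, the way the POC tests exercised
    // the services before the screens were built.
    public class WebServiceLoadSketch {
        public static void main(String[] args) throws Exception {
            int users = 200;  // concurrent callers, as in the 200-user POC tests
            String endpoint = "http://test-host/cmt/benefitDetails?memberId=12345";  // placeholder URL
            HttpClient client = HttpClient.newHttpClient();
            List<Long> timesMs = Collections.synchronizedList(new ArrayList<>());

            ExecutorService pool = Executors.newFixedThreadPool(users);
            for (int i = 0; i < users; i++) {
                pool.submit(() -> {
                    try {
                        HttpRequest req = HttpRequest.newBuilder(URI.create(endpoint)).GET().build();
                        long start = System.nanoTime();
                        client.send(req, HttpResponse.BodyHandlers.ofString());
                        timesMs.add((System.nanoTime() - start) / 1_000_000);
                    } catch (Exception e) {
                        // in a real test a failed call would be counted as an error
                    }
                });
            }
            pool.shutdown();
            pool.awaitTermination(5, TimeUnit.MINUTES);

            timesMs.sort(null);  // sorted response times; slowest call is last
            System.out.println("calls=" + timesMs.size() + ", slowest(ms)="
                    + (timesMs.isEmpty() ? "n/a" : timesMs.get(timesMs.size() - 1)));
        }
    }

A real run would ramp users up gradually, pace and repeat their requests, and compare the 90th percentile against the 3-second web-service SLA; LoadRunner provides that ramp-up, pacing, and reporting, which this sketch omits.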

Page 6: Performance Testing Web 2.0 Applications—in an Agile World


Tests Executed

POC Tests:

• Benefits Web Service (200 Concurrent Users)

• Claims Web Service (200 Concurrent Users)

• Member Web Service (200 Concurrent Users)

Sprint 1:

• 50 User Test for Member & Group functionality

Sprint 2:

• 110 User Test for Member & Group & Provider

Sprint 3:

• 250 User Test for Member, Group, Provider & Benefits + ALL WEB

Sprint 4:

• 250 User Test for Member, Group, Provider, Referrals, Benefits & Claims + ALL WEB

Sprint 5:

• 250 User Test for Member, Group, Provider, Referrals, Benefits & Claims + ALL WEB


POC Test Results (1 of 2) – Benefits & Claims Web Service

Claims transactions in red were candidates for tuning to meet the 3-second SLA for the Benefit and Claims web services.

200 User Test - 5/10/11 - Benefit Detail Web Service (response times in seconds)

Transaction Name   SLA     Minimum  Average  Maximum  90 Percent  Pass
CMT_BenDetails     3 secs  0.032    0.058    4.163    0.07        1,990

Claims Web Service POC Test Results, 05/26/11 - 200 user test, May 26, 2011 (response times in seconds)

Transaction Name            SLA     Minimum  Average  Maximum  90 Percent  Pass
CMT_ClaimsByClaimsNumber    3 secs  0.097    0.233    1.407    0.348       464
CMT_ClaimsByLargeProvider   3 secs  0.282    4.37     9.594    7.114       46
CMT_ClaimsByMemberID        3 secs  0.112    4.593    28.11    9.354       977
CMT_ClaimsBysmallProvider   3 secs  0.088    0.403    3.097    0.796       661
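The "90 Percent" column in these tables is the 90th-percentile response time: the value that 90% of the sampled transactions completed within. A minimal sketch of how such a figure can be derived from raw timings follows (nearest-rank method; LoadRunner's exact percentile calculation may differ slightly).

    import java.util.Arrays;

    // Sketch: nearest-rank 90th percentile over response-time samples (seconds).
    public class Percentile90 {
        static double percentile90(double[] samples) {
            double[] sorted = samples.clone();
            Arrays.sort(sorted);
            int rank = (int) Math.ceil(0.90 * sorted.length);  // nearest-rank position
            return sorted[rank - 1];
        }

        public static void main(String[] args) {
            // ten illustrative samples; 2.40 is the 9th of 10 sorted values
            double[] times = {0.12, 0.35, 0.40, 0.52, 0.61, 0.75, 0.88, 1.10, 2.40, 9.35};
            System.out.println(percentile90(times));  // prints 2.4
        }
    }

A single slow outlier (the 9.35 above) inflates the maximum but barely moves the 90th percentile, which is why the tables report both.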


Page 7: Performance Testing Web 2.0 Applications—in an Agile World


POC Test Results (2 of 2) – Claims Web Service

Claims Web Service: Sept 29 - Nov 21 test results; response times in seconds for the 90th percentile; 200 concurrent users

Claims Web Service Test, Sept 29 - Oct 7, 2011 - searching independent claims by providers

Test scenarios (one per column below):

(1) Large provider 15 days, small provider 12 months - old database server

(2) Large provider 15 days, small provider 12 months - new database server

(3) Large provider 15 days, small provider 6 months - new database server

(4) Large provider 15 days, small provider 12 months - new database server, with new claims query returning 10 rows at a time

(5) Large provider 30 days, small provider 12 months - new database server, with new claims query returning 10 rows at a time

(6) Large provider 30 days, small provider 12 months - new database server, with new claims query returning 10 rows at a time, Trinidad table detailing turned off

Transaction Name        (1)     (2)     (3)     (4)     (5)     (6)
ClaimsByClaimsNumber    0.4     0.28    0.28    2.1     2.2     0.2
ClaimsByLargeProvider   11.6    7.4     4.5     4.2     13.8    5.3
ClaimsByMemberID        3.6     2.2     2.2     2.7     3.7     1.7
ClaimsBysmallProvider   15.3    6.29    3.17    3.3     9.2     2.9
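Several of the scenario columns above involve a new claims query that returns 10 rows at a time instead of the full result set. The slides do not show the actual SQL or data-access code; the JDBC fragment below is only a sketch of the general technique of capping a claims search, with made-up table and column names.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Sketch: cap the rows returned by a provider claims search so a large
    // provider's claims do not flood the service. Names are illustrative only.
    public class ClaimsPageQuery {
        static void firstPage(Connection conn, String providerId) throws Exception {
            String sql = "SELECT claim_id, member_id, service_date "
                       + "FROM claims WHERE provider_id = ? AND service_date >= SYSDATE - 15 "
                       + "ORDER BY service_date DESC";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, providerId);
                ps.setMaxRows(10);    // the driver discards anything past 10 rows
                ps.setFetchSize(10);  // fetch from the server in small batches
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getString("claim_id"));
                    }
                }
            }
        }
    }

Together with the 15-day/30-day date-range limits and the small-vs-large provider distinction listed later in the improvements summary, bounding the result set was one of the changes aimed at the ClaimsByLargeProvider and ClaimsBysmallProvider searches.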

Application - 250 Web User Performance Test Results

90 percent times are 90th-percentile times in seconds, for the server-side component only. Number of concurrent users: 250 in every test. Test period: May 29 - July 15, 2011.

Changes made to the environment (CMT managed servers) for each test date:

29-Jun: 4 nodes for CMT, JVM 2 GB, Services; CMT JVM modified (permsize); no Benefits

7-Jul: 4 nodes for CMT, JVM 2 GB; CMT JVM permsize modified; different UserIDs

11-Jul: 4 nodes for CMT, JVM 2 GB; CMT JVM permsize modified; different UserIDs; upped memory for Tibco service

12-Jul: 4 nodes for CMT, JVM 2 GB; CMT JVM permsize modified; different UserIDs; upped memory for Tibco service

14-Jul: 4 nodes for CMT, JVM 2 GB; CMT JVM permsize modified; different UserIDs; upped memory for services; Tibco threads doubled

15-Jul: 4 nodes for CMT, JVM 2 GB; CMT JVM permsize modified; different UserIDs; upped memory for Tibco; Tibco threads tripled for Tibco services

Transaction Name                       29-Jun   7-Jul    11-Jul   12-Jul   14-Jul   15-Jul
ClickOnCoveringProviders               4.434    9.877    5.362    5.489    3.315    3.082
ClickOnDependentVerification           5.34     4.599    4.176    5.385    4.916    3.39
ClickonGroupBenefits                   –        7.527    4.824    4.32     4.134    4.027
ClickonGroupDeductibleOOP              –        3.399    2.821    3.117    2.767    2.226
ClickOnGroupHistory                    4.035    6.53     5.547    5.797    7.513    6.613
ClickOnGroupNameSearchButton           4.83     6.82     5.26     6.696    5.809    4.573
ClickonGroupPharmacyBenefits           –        3.399    2.902    3.035    2.759    2.204
ClickOnMemberHistory                   5.037    11.096   7.571    7.008    5.866    5.304
ClickonMemberBenefits                  –        17.06    9.339    9.112    8.705    8.457
ClickonMemberDeductibleOOP             –        9.526    7.064    9.377    8.372    6.694
ClickonMemberPharmacyBenefits          2.447    5.674    5.413    6.324    7.981    4.496
ClickOnMembersOnPlan                   7.916    3.673    2.581    2.881    3.088    2.346
ClickOnMemLastNameSearch               6.61     10.426   9.935    11.391   8.663    6.496
ClickOnMemSearchButton                 5.108    12.738   7.302    6.899    6.481    5.647
ClickOnNPISearch                       2.423    11.692   5.985    5.975    5.023    5.23
ClickOnPrivileging                     6.449    2.043    1.816    2.55     1.504    1.26
ClickOnProviderIdsearch                5.921    15.197   7.846    7.181    4.886    4.823
ClickOnProviderNameSearch              9.457    11.483   6.847    7.397    6.423    5.981
ClickOnSubscriberSearchButton          2.814    17.511   7.931    8.955    7.223    7.724
GroupProfileByGroupIDSearch            3.703    6.894    5.172    5.372    5.174    4.322
MemberProfileBySSNSearch               4.022    8.299    6.027    5.636    6.462    4.902
SelectGroupIDFromGroupNameSearch       2.828    4.001    3.606    3.543    3.491    2.697
SelectMemberIDFromSubscriberSearch     4.748    5.225    8.939    3.684    2.784    3.578
SelectMemIDFromLastNameSearchResults   6.103    6.048    3.354    4.007    2.983    2.847
SelectProviderIdFromNameResults        2.403    2.613    2.032    2.128    1.628    1.84
SelectProviderIdFromNPIResults         2.752    3.29     2.431    2.734    2.214    1.884

(– : these Benefits screens had only five recorded values; the 29-Jun run is described above as excluding Benefits, so the missing value is shown in that column.)
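The environment-change notes above mention a 2 GB JVM heap and a modified permsize for the CMT managed servers, but not the actual sizes. As an illustration only, the kind of WebLogic/JVM startup options involved would look like the line below; the permsize values are assumptions, not the settings Tufts used.

    JAVA_OPTIONS="-Xms2g -Xmx2g -XX:PermSize=256m -XX:MaxPermSize=512m"

The Tibco memory and thread-pool increases mentioned in the later columns are separate Tibco engine settings and are not shown in the slides.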

Page 8: Performance Testing Web 2.0 Applications—in an Agile World


Server Side Troubleshooting


Client Side Troubleshooting (1)


Page 9: Performance Testing Web 2.0 Applications—in an Agile World


Browser-Based Troubleshooting


CPU Monitoring – Unix Servers


Page 10: Performance Testing Web 2.0 Applications—in an Agile World


Response Time Graph


Additional Resources/Tools for Client and Server Tuning

Manual testers/UAT to support Web 2.0 functionality

Use UFT/QTP to emulate real users

Dynatrace AJAX Edition – free version

HttpWatch

JConsole

Desktop engineer


Page 11: Performance Testing Web 2.0 Applications—in an Agile World


Client-Side Tuning – Speed of the Web


Client Side Tuning - Browser Test Results - 7/7/11

IE 7 vs IE 8 with antivirus options (per-screen times; "difference" rows compare against the configuration named)

Browser / antivirus                   Member        Members on Plan  Member History  Group History  Dependent Verification  Coordination of Benefits
IE 7, antivirus 8.5                   6.499242424   3.6              7.6             9.0            4.1                     4.1
IE 7, antivirus 8.8 with exceptions   6.7           3.8              5.8             7.5            3.5                     3.5
  difference vs IE 7 / AV 8.5         0.211868687   0.136616162      -1.815656566    -1.490909091   -0.531818182            -0.519696970
IE 8, antivirus 8.5                   7.1           3.3              8.3             10.5           4.0                     4.0
  difference vs IE 7 / AV 8.5         0.613888889   -0.311868687     0.707575758     1.537373737    -0.026767677            -0.089393939
  difference vs IE 7 / AV 8.8         0.402020202   -0.448484848     2.523232323     3.028282828    0.505050505             0.430303030
IE 8, antivirus 8.8 with exceptions   6.133333333   2.873333333      4.584444444     9.937777778    3.413333333             3.664444444
  difference vs IE 7 / AV 8.5         -0.365909091  -0.745606061     -3.053434343    0.980202020    -0.651818182            -0.388585859
  difference vs IE 7 / AV 8.8         -0.577777778  -0.882222222     -1.237777778    2.471111111    -0.120000000            0.131111111
IE 8, no antivirus software           5.7           2.5              5.5             6.6            2.7                     3.0
  difference vs IE 7 / AV 8.5         -0.832575758  -1.152272727     -2.104545455    -2.357575758   -1.398484848            -1.053030303
  difference vs IE 7 / AV 8.8         -1.044444444  -1.288888889     -0.288888889    -0.866666667   -0.866666667            -0.533333333


Page 12: Performance Testing Web 2.0 Applications—in an Agile World


Application Performance Improvements - Summary

- Use Hashtable instead of HashMap for thread-safe transactions (a sketch follows the connection-limit table below)

- Increasing threads and JVM heap for many components

- Claim Search modifications – limiting result set size

  - Limiting searches by date range

  - Differentiating small vs. large providers

  - Limiting searches by row count

- Antivirus modifications

  - Add URL exceptions to user desktops

- Adding more virtual servers – web and application servers

- Trinidad (MyFaces) toolkit – Datatable column detailing turned off

- Fetching email asynchronously via JMS

- Move from IE 7 (Windows XP) to IE 8

- F5 switch: caching, compression

Maximum concurrent connections per server:

Version                           HTTP 1.0 server (broadband connection)   HTTP 1.1 server (broadband connection)
Internet Explorer 7 and earlier   4                                        2
Internet Explorer 8               6                                        6
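The first bullet on this slide, switching from HashMap to Hashtable for thread-safe transactions, comes down to a standard JDK distinction: java.util.HashMap is not safe for concurrent writers, while every Hashtable operation is synchronized. The sketch below only illustrates that distinction; the application's own classes are not shown in the slides.

    import java.util.HashMap;
    import java.util.Hashtable;
    import java.util.Map;

    // Sketch: a lookup map shared by concurrent request threads.
    public class SharedCacheSketch {
        // NOT thread-safe: concurrent puts can lose entries or corrupt the map under load.
        static final Map<String, String> unsafeCache = new HashMap<>();

        // Every method on Hashtable is synchronized, so concurrent threads see a
        // consistent map (at the cost of contention on a single lock).
        static final Map<String, String> safeCache = new Hashtable<>();

        public static void main(String[] args) throws InterruptedException {
            Runnable writer = () -> {
                for (int i = 0; i < 10_000; i++) {
                    safeCache.put(Thread.currentThread().getName() + "-" + i, "claim");
                }
            };
            Thread t1 = new Thread(writer), t2 = new Thread(writer);
            t1.start(); t2.start();
            t1.join(); t2.join();
            System.out.println(safeCache.size());  // reliably 20000; the HashMap version would not be
        }
    }

On a current JVM, java.util.concurrent.ConcurrentHashMap is the usual thread-safe replacement, but the slide's underlying point is the same: any structure shared across request threads has to be thread-safe.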

Agile vs Waterfall


Agile pros:

- Early testing - defect identification
- Design validation
- Component-based (fine grained)
- Business buy-in, quicker time to market
- Better end product
- Typically more scalable, as testing is done at the component level
- Decreased project risk
- More collaboration and information sharing

Agile cons:

- Expensive
- Design may change rapidly with requirements, rendering rework
- Needs more app/business team involvement
- Unstable code; could run into functional issues
- Rescripting required, which could be very expensive in rapidly changing environments (builds, etc.)
- Ad hoc project management
- Usually more test cycles needed over the development of the app/product
- Test environment may not be prod-like

Waterfall pros:

- Management typically understands it or is more familiar with it
- Well-formed SDLC
- Stable code, used after QA has gone through one test cycle
- Scripts more stable; usually less rescripting required
- Test cycles are typically limited in alignment with project timelines
- Test environment is typically prod-like

Waterfall cons:

- Traditional performance requirements may be addressed too late
- Bad design, if not found early, could severely impact the application and cause rework and delays
- Application-based (coarse grained)
- Scalability may be dependent on modifying underlying components
- More bureaucracy typically involved

Page 13: Performance Testing Web 2.0 Applications—in an Agile World


Learnings/Summary

Web 2.0 requires more manual/automated testing in concert with Performance tests

Agile Methodology provides performance benefits in exchange for additional cost

Identify Non-Functional requirements early in order to test for optimal design

Server Side Tuning is still vital

Client Side Tuning requires more developer involvement


References

- http://msdn.microsoft.com/en-us/library/cc304129(v=vs.85).aspx

- http://developer.yahoo.com/performance/

- http://blog.dynatrace.com/

- High Performance JavaScript (Build Faster Web Application Interfaces), Nicholas C. Zakas
