JSMeter: Characterizing the Behavior of JavaScript Web Applications
Paruj Ratanaworabhan, Kasetsart University, Thailand
Ben Livshits and Ben Zorn, Microsoft Research, Redmond
In collaboration with David Simmons, Corneliu Barsan, and Allen Wirfs-Brock
Goals of JSMeter Project
• Instrument JavaScript execution and measure behavior
• Compare behavior of JavaScript benchmarks against real sites
• Consider how benchmarks can mislead design decisions
How We Measured JavaScript
• Source-level instrumentation of \ie\jscript\*.cpp yields a custom jscript.dll
• Website visits with the instrumented DLL produce custom trace files
• Offline analyzers process the custom trace files
[Chart: breakdown of string operations, 0% to 100%: Constant, Other Str Ops, Concat Op]
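The measurement pipeline (instrumented engine writes trace records, offline analyzers summarize them) can be sketched in miniature. This is purely illustrative: the real JSMeter instrumentation lives in C++ inside \ie\jscript\*.cpp, and the trace format, `emit`, and `analyze` names here are invented for the sketch.

```javascript
// Hypothetical trace format: one record per interpreter-level event.
// The real system appends records to a custom trace file instead.
const trace = [];

function emit(kind, detail) {
  trace.push({ kind, detail });
}

// Records an instrumented engine might emit during a page visit.
emit("func-enter", "onClick");
emit("alloc", { type: "string", bytes: 24 });
emit("alloc", { type: "object", bytes: 48 });
emit("func-exit", "onClick");

// Offline analyzer: reduce the raw trace to a high-level summary.
function analyze(records) {
  const summary = { calls: 0, allocBytes: 0 };
  for (const r of records) {
    if (r.kind === "func-enter") summary.calls += 1;
    if (r.kind === "alloc") summary.allocBytes += r.detail.bytes;
  }
  return summary;
}
```

The record-then-analyze split is the point: the engine only logs, and all aggregation happens offline, so the instrumentation overhead during the visit stays small.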
Visiting the Real Sites
• Getting past page-load performance
• Attempted to use each site in a "normal" way:

amazon: Search for a book, add to shopping cart, sign in, and sign out
bing: Type in a search query and also look for images and news
bingmap: Search for directions from one city to another
cnn: Read front-page news
ebay: Search for a notebook, bid, sign in, and sign out
economist: Read front-page news, view comments
facebook: Log in, visit friends' pages, browse through photos and comments
gmail: Sign in, check inbox, delete a mail, and sign out
google: Type in a search query and also look for images and news
googlemap: Search for directions from one city to another
hotmail: Sign in, check inbox, delete a mail, and sign out
Event Handlers
• Example handlers: onabort, onclick, etc.
• Very different from batch processing of benchmarks
• Handler responsiveness critical to user experience
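The contrast between handler-driven real sites and batch-style benchmarks can be shown in miniature. This sketch is not from the talk; the handler names and the toy dispatch loop are invented (in a browser the registration would be `element.onclick = ...`), but the structural difference is the point.

```javascript
// Batch style, typical of benchmarks: one long computation, no events.
function batchSum(n) {
  let total = 0;
  for (let i = 0; i < n; i++) total += i;
  return total;
}

// Event-driven style, typical of real sites: many small handlers that
// must return quickly so the page stays responsive. We simulate the
// browser's dispatch mechanism with a plain handler table.
const handlers = {};
function on(event, fn) { handlers[event] = fn; }
function dispatch(event) {
  return handlers[event] ? handlers[event]() : undefined;
}

let clicks = 0;
on("click", () => { clicks += 1; return clicks; }); // short unit of work

dispatch("click");
dispatch("click");
```

A benchmark spends its time inside one call like `batchSum`; a real site spends it across thousands of short `dispatch` calls, which is why handler latency, not loop throughput, dominates the user experience.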
Code|Objects|Events
Total Events Handled
[Chart: total events handled per workload. Real sites: amazon, bing, bingmap, cnn, ebay, economist, facebook, gmail, google, googlemap, hotmail. V8 benchmarks: richards, deltablue, crypto, raytrace, earley, regexp, splay. Y-axis: Total Events Handled, 0 to 7,000.]
The V8 benchmarks handle almost no events.
Code|Objects|Events
Median Bytecodes / Event Handled
[Chart: median bytecodes executed per event handled for the real sites (amazon, bing, bingmap, cnn, ebay, economist, facebook, gmail, google, googlemap, hotmail). Y-axis: 0 to 500, with off-scale values called out at 506 and 2137.]
Sure, this is all good, but…
• Everyone knows benchmarks are unrepresentative
• How much difference does it make, anyway?
• Wouldn't any benchmarks have similar issues?
Cold-code Experiment
• Observation
  – Real web apps have lots of code (much of it cold)
  – Benchmarks do not
• Question: What happens if the benchmarks have more code?
  – We added extra, unused code to 7 SunSpider benchmarks
  – We measured the impact on benchmark performance
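A rough sketch of what "extra, unused code" means. The slides don't show the exact padding JSMeter used, so this is only an assumed shape: dead functions that the engine must parse (here forced via the `Function` constructor) but that the benchmark never calls, which is what makes them cold.

```javascript
// Generate numFunctions trivial functions whose source must be parsed
// and compiled, but which the benchmark itself never invokes.
function addColdCode(numFunctions) {
  const fns = [];
  for (let i = 0; i < numFunctions; i++) {
    // new Function forces a real parse of the generated source.
    fns.push(new Function("x", "return x * " + i + " + 1;"));
  }
  return fns;
}

// The benchmark itself stays tiny and hot, as in SunSpider.
function hotLoop(n) {
  let acc = 0;
  for (let i = 0; i < n; i++) acc = (acc + i) % 9973;
  return acc;
}

const coldFns = addColdCode(100); // padding: parsed, never called
const result = hotLoop(1000);     // the only code that actually runs
```

Scaling the padding from 0K to 2M of source and timing `hotLoop` is the shape of the experiment: any slowdown comes purely from the engine having to cope with code it never executes.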
Performance Impact of Cold Code
[Chart: SunSpider execution time (msec) for 3d-raytrace, access-nbody, bitops-nsieve, controlflow, crypto-aes, math-cordic, and string-tagcloud with 0K, 200K, 400K, 800K, 1M, and 2M of added cold code. Y-axis: 0 to 900 msec. Two panels: Chrome 3.0.195.38 and IE 8 (8.0.601.18865).]
Cold code has a non-uniform impact on execution time.
Cold code makes SunSpider on Chrome up to 4.5x slower.
Impact of Benchmarks
• What gets emphasis
  – Making tight loops fast
  – Optimizing small amounts of code
• Important issues ignored
  – Garbage collection (especially of strings)
  – Managing large amounts of code
  – Optimizing event handling
  – Considering JavaScript context between page loads
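One of the ignored issues, garbage collection of strings, is easy to provoke. This is a generic illustration rather than code from the talk: repeated `+=` concatenation in a loop creates many short-lived intermediate strings for the collector to reclaim, while accumulating pieces and joining once allocates far fewer.

```javascript
// Concatenation in a loop: each += builds a new intermediate string,
// so n iterations leave roughly n dead strings behind for the GC.
function buildByConcat(n) {
  let s = "";
  for (let i = 0; i < n; i++) s += i + ",";
  return s;
}

// Join once: the pieces live in an array and only the final string
// is materialized, so there is far less string garbage.
function buildByJoin(n) {
  const parts = [];
  for (let i = 0; i < n; i++) parts.push(i + ",");
  return parts.join("");
}
```

Real sites do this kind of string assembly constantly (building HTML, query strings, JSON), which is why string GC matters for them even though loop-heavy benchmarks barely exercise it.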
Conclusions
• JSMeter is an instrumentation framework
  – Used to measure and compare JavaScript applications
  – High-level views of behavior promote understanding
• Benchmarks differ significantly from real sites
  – Misleads designers, skews implementations
• Next steps
  – Develop and promote better benchmarks
  – Design and evaluate better JavaScript runtimes
  – Promote better performance tools for JavaScript developers
Additional Resources
• Project: http://research.microsoft.com/en-us/projects/jsmeter/
• Video: "Project JSMeter: JavaScript Performance Analysis in the Real World" - MSDN Channel 9 interview with Erik Meijer, Ben Livshits, and Ben Zorn
• Paper: "JSMeter: Comparing the Behavior of JavaScript Benchmarks with Real Web Applications", Paruj Ratanaworabhan, Benjamin Livshits, and Benjamin G. Zorn, USENIX Conference on Web Application Development (WebApps '10), June 2010.