Ajax Performance

Presentation by Ryan Breen, VP of Technology at Gomez (think of it as "go! mez").

Firebug
Firefox shows you the time elements are parsed, not the time it actually took to load them.
– Elements shouldn’t take longer and longer to load (but the parse-time display makes it look that way).
– Net visualization is nice, but certainly not good enough.

Web Inspector (WebKit)
– Safari’s clone of Firebug; same issues, though.

IBM Page Detailer
– The professional of the group: not browser-specific, and it hooks into the network stack (just like Fiddler) to gather per-file load times.
– Breaks it down into connect time, discussion time, and download time.
We’re talking about where we are spending our download time.
These are all freely available tools so far.

Server-based profiling applications exist, and he even listed a few: Gomez owns one, and Pingdom owns one. But they got the attention of dung beetles, so off to Firebug.

The JS function-call profiler in Firebug is world class; it IS the be-all and end-all of JS debugging in Firefox.
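For the record, you can drive that profiler from code. A minimal sketch using Firebug’s console.profile() API (renderGrid and its workload are made up for illustration):

    // Profile one hot path instead of the whole page.
    function renderGrid(rows) {
      var total = 0;
      for (var i = 0; i < rows; i++) {
        total += Math.sqrt(i); // stand-in for DOM-heavy work
      }
      return total;
    }

    // console only exists when Firebug is open, so guard the calls.
    if (window.console && console.profile) {
      console.profile("grid render"); // start collecting call timings
      renderGrid(100000);
      console.profileEnd();           // report lands in Firebug's Console tab
    }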

jsLex instruments code to test it using an Ant task. Ryan wasn’t too familiar with jsLex, so we didn’t get into it, but if you have seen the videos, it’s very slick. It’s starting to seem common that instrumentation is used to give us something many people want, yet we just don’t have hooks for. (Yo! Maybe we need some hooks!) JSCoverage (http://siliconforks.com/jscoverage/) being the other one I can think of.
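To make "instrumentation" concrete, here’s a hand-rolled sketch; it’s not jsLex (which rewrites your source at build time via Ant), just the idea of wrapping a function so every call records its elapsed time:

    // Instrumentation by hand: swap a function for a timing wrapper.
    var timings = {};

    function instrument(obj, name) {
      var original = obj[name];
      obj[name] = function () {
        var start = new Date().getTime();
        try {
          return original.apply(this, arguments);
        } finally {
          var elapsed = new Date().getTime() - start;
          timings[name] = (timings[name] || 0) + elapsed;
        }
      };
    }

    // Usage, with a hypothetical app.loadFeed:
    //   instrument(app, "loadFeed");
    //   ... later, inspect timings.loadFeed for total time spent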

Pure JS solutions:
Dojo.Profile and Firebug Lite.
– Less data collected, and only in the specific areas you instrument.
– Cross-browser, though! (A minimal sketch of the approach follows.)
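The pure-JS approach boils down to timestamping with nothing but Date, which works everywhere; the timer names below are made up:

    // Cross-browser timing: you only get data where you explicitly mark.
    var timers = {};

    function startTimer(name) {
      timers[name] = new Date().getTime();
    }

    function stopTimer(name) {
      var elapsed = new Date().getTime() - timers[name];
      // Firebug Lite gives you console.log in non-Firefox browsers too.
      if (window.console && console.log) {
        console.log(name + ": " + elapsed + "ms");
      }
      return elapsed;
    }

    startTimer("render");
    for (var i = 0; i < 100000; i++) {} // stand-in for real work
    stopTimer("render");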

The largest opportunity for optimization exists at the network layer.
The goal of performance optimization is to hide latency from the end user.

Fewer requests means less latency.
Make fewer requests: connection initiation and the "discussion" with the server are the most expensive costs in this area.

CSS sprites: fewer web requests.
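The trick: pack many small images into one file, so N icon requests become one. A minimal sketch (the file name and offsets are hypothetical):

    /* One download (icons.png) instead of one request per icon.
       Background offsets select the right region of the sprite. */
    .icon        { width: 16px; height: 16px; background: url(icons.png) no-repeat; }
    .icon-home   { background-position: 0 0; }
    .icon-search { background-position: -16px 0; }
    .icon-mail   { background-position: -32px 0; }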

If your connection persistence is broken, fixing it will give you an automatic perf gain, possibly as large as 50%. This could be like removing one olive from the jar and saving your company thousands of dollars with one little idea. If this is happening, you could earn yourself a nice little bonus by getting it addressed.
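An easy check: look at response headers in Fiddler or Firebug’s Net tab. With HTTP/1.1 persistence working you should see something like the following, rather than Connection: close on every response (the values here are illustrative):

    HTTP/1.1 200 OK
    Connection: keep-alive
    Keep-Alive: timeout=5, max=100
    Content-Type: image/png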

Multiple connections per domain: browsers natively only download 2 items at a time per server, but with DNS wildcards for subdomains you can up that. If you’re downloading 30 items over 2 connections, you’ll have to wait through 15 rounds; with 6 connections going, you only wait through 5. Download times have matured quite a bit in the age of broadband, but each connection still costs as much to execute, invoke, and discuss with the server. Careful: this can break caching, because the same asset can end up with different URLs. It’s always 6 total for IE, so you can’t just up it to 30; even if you did, you’d run the risk of overloading your own web server with normal web traffic.
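One way to shard assets across subdomains while keeping URLs stable (so caching survives) is to hash on the path; the domain names here are hypothetical:

    // Domain sharding sketch: spread assets across img0..img3.example.com.
    // Hashing on the path means a given asset ALWAYS maps to the same host;
    // otherwise you'd bust the browser cache with varying URLs.
    var SHARDS = 4;

    function shardUrl(path) {
      var hash = 0;
      for (var i = 0; i < path.length; i++) {
        hash = (hash * 31 + path.charCodeAt(i)) % SHARDS;
      }
      return "http://img" + hash + ".example.com" + path;
    }

    // shardUrl("/icons/home.png") returns the same host every time.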

Flash, Flex, applets, ActiveX, Silverlight: just because you guys are plugins is no excuse for not profiling your performance.

Website of Note:
http://ajaxperformance.com/ (Nice)

There are no performance tools we know of that trace the client’s hardware, so you know an image took 2 seconds to download and .5 seconds to load, but you don’t know that it pegged the user’s CPU/GPU. It sounds like a great opportunity for someone to write an extension to track CPU and memory usage problems.

What about GZip latency? Does the user notice? We used to need hardware (BIG-IP) just to do that extra work, but now CPUs are strong enough to do it themselves and the times are negligible.

This isn’t as clean as me fumbling around trying to remember a bunch of stuff, but I think it’s more informative. Anyone want to give a shout about which they prefer? Please?

Off to another keynote.

TTFN


2 Responses to “TAE: Day 1 Ajax Performance”

  1. Ed Says:

    The link to JSCoverage is broken: should be http://siliconforks.com/jscoverage/

  2. Justise Says:

    Hey, thanks Ed, that’s kind of embarrassing. It’s fixed now, of course.
