I want to measure the user-perceived experience of web page loading. It is not sufficient to measure the time between receiving the request and sending the response, because that only captures server response time. The user's experience also depends on network latency and bandwidth, in addition to server response time. Any ideas?

The web page loads in reasonable time in our lab, so the problem must have something to do with latency and bandwidth. Can this investigation be done with some coding, or do I need sophisticated tools? I came across some websites that measure the upload/download speed of the user's machine, and I am wondering how those sites work.
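(On the last point: speed-test sites roughly work by downloading a resource of known size and dividing size by elapsed time. A minimal sketch, where the test-file URL and the reporting endpoint are made-up placeholders:)

```javascript
// Pure helper: bytes over milliseconds -> kilobits per second.
function throughputKbps(bytes, elapsedMs) {
  return (bytes * 8) / elapsedMs; // bits per ms == kilobits per second
}

// Browser-only part: fetch a test file of known size and time the transfer.
// "/speedtest/1MB.bin" is a hypothetical URL; real sites use several
// file sizes and repeat the measurement to smooth out noise.
if (typeof XMLHttpRequest !== "undefined") {
  var start = Date.now();
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/speedtest/1MB.bin?nocache=" + start, true);
  xhr.onload = function () {
    var kbps = throughputKbps(xhr.responseText.length, Date.now() - start);
    console.log("Approximate download speed: " + kbps + " kbit/s");
  };
  xhr.send();
}
```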

A: 

YSlow is a good tool that may help you measure performance, mostly with respect to the page's content.

LarsOn
This won't measure load time on other users' machines.
Josh K
A: 

You could place a JavaScript snippet at the top and bottom of the page and then send the results to your server with an AJAX call.
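A minimal sketch of that idea, assuming a hypothetical `/log-load-time` endpoint on your server: one `<script>` at the very top of `<head>` records a start time, and a handler reports the elapsed time once the page has loaded.

```javascript
var pageStart = Date.now(); // first <script> in <head>

// Pure helper so the arithmetic is explicit.
function elapsedMs(startMs, endMs) {
  return endMs - startMs;
}

if (typeof window !== "undefined") {
  window.onload = function () {
    var ms = elapsedMs(pageStart, Date.now());
    // Report back with a plain AJAX call; the endpoint name is an assumption.
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "/log-load-time", true);
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.send("ms=" + ms);
  };
}
```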

Josh K
Would this account for slow-loading images or other scripts?
p.campbell
@p.campbell: No, that would only show how long the DOM took to load. To measure the actual data transfer time you would have to modify your server to log when it started sending the page (and all its dependencies) and when it completed.
Josh K
A: 

The new window.performance API addresses this need. It is supported by IE9 and recent WebKit browsers; I'm not sure about the others.

http://webtimingdemo.appspot.com/

Check that for a starting point.
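A sketch of reading the Navigation Timing API (`window.performance.timing`); the three-way breakdown below is my own interpretation, but the field names come from the spec:

```javascript
function summarizeTiming(t) {
  return {
    network: t.responseEnd - t.fetchStart,     // DNS, connect, transfer
    processing: t.domComplete - t.responseEnd, // parse and build the DOM
    total: t.loadEventEnd - t.navigationStart  // user-perceived load time
  };
}

if (typeof window !== "undefined" && window.performance && window.performance.timing) {
  window.addEventListener("load", function () {
    // loadEventEnd is only populated after the load event finishes,
    // so read the values on the next tick.
    setTimeout(function () {
      console.log(summarizeTiming(window.performance.timing));
    }, 0);
  });
}
```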

Rich Bradshaw
A: 

If you want to see the effect latency (and bandwidth) has on your page loading time within your lab, try using a network emulator. There are several open source and commercial network emulators; Google for "wan emulation". This is usually the simplest and most effective solution, since it lets you quickly find what's slowing your site down.

However, sometimes the elements slowing your site down are not under your control, such as widgets, ads, and third-party content. Moreover, if you are using a CDN, that may have a large impact as well. You therefore cannot check everything inside your lab, and that's why companies like Gomez and Keynote exist: they have a large network of robots across the world that constantly monitor your web page's load time.

You may want to try http://www.webpagetest.org/, a free tool that checks (and tries to analyze) your web page's load time from a limited number of locations around the world.

Finally, if you want to monitor your true users' experience (not just robots), you may want to check out Yahoo's Boomerang, a free tool that also supports the Web Timing API. Gomez has a similar commercial solution.

r0u1i
A: 

If you can run a test at a representative end-user site, then the Firebug plugin for Firefox is quite useful - the "Net" panel shows a time-series chart of the progress of the page load.

It makes it easy to see what's happening in parallel and what's serialised.

caf