Since it is 'perceived load time', not 'actual load time', that is arguably the most important metric, it is difficult to measure precisely and consistently: it depends on perception, which varies with the user and with the nature of the page in question.

For example, I'll frequently open an informational page and be happily reading the content long before the page has fully loaded. Equally, on many sites where my username and password are stored by the browser, the page appears to have loaded several seconds before those stored credentials are automatically populated; clearly the page wasn't fully loaded when it appeared to be.

My point is that the moment at which I can get on with what I want to do depends partly on the nature of the page in question; I don't see how you can automatically determine the point at which a page could be considered usable.

If you need a consistently measurable metric, you can stick with what you have. If you want a more accurate metric (the point at which a page can be considered usable), it will probably require human judgement.
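One middle ground, if you're willing to supply that human judgement once per page, is to have a person pick the element that marks the page as "usable" and let a script measure how long it takes to appear. A rough sketch in Python with Selenium follows; the URL, the CSS selector and the use of Chrome are all assumptions for illustration, not a recommendation:

```python
# Rough sketch: a human decides which element means "usable" (the
# hypothetical "#login-form" selector below), and the script measures
# how long that element takes to appear. Assumes Selenium plus a local
# Chrome/chromedriver install; adjust the URL and selector to your page.
import time

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    start = time.time()
    driver.get("http://example.com/login")  # hypothetical page

    # "#login-form" is a placeholder for whatever element a human judged
    # to mean "I can now do what I came here to do".
    WebDriverWait(driver, 30).until(
        EC.presence_of_element_located((By.CSS_SELECTOR, "#login-form"))
    )
    print("Time to usable element: %.2f s" % (time.time() - start))
finally:
    driver.quit()
```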


There are two good tools I know of for measuring website performance:

YSlow by Yahoo and Page Speed by Google.
These tools will give you a good overview of where your page is spending its time and some hints on how to do better.

There are also some good blogs about page performance:

High Scalability

High Performance Web Sites

In these blogs you might find some new perspectives and ideas about website performance.

EDIT: Here is an article which discusses performance.

EDIT2:
It seems this is now becoming even more important, as Google now counts site speed as a ranking factor: http://searchengineland.com/google-now-counts-site-speed-as-ranking-factor-39708

EDIT3:
Here is a page with numbers correlating speed and business metrics from Google, Bing, Yahoo, Mozilla and some others.


Sounds to me like you need a script that downloads the web page but skips anything linked from outside the base URL. That would give you a load time for the page that actually means something with regard to optimisation. I don't know of such a script offhand, though.
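A rough sketch of what such a script might look like in Python, using the requests and BeautifulSoup libraries (the library choices, tag list and example URL are assumptions; it only catches resources referenced directly in the HTML, so anything loaded by JavaScript won't be counted):

```python
# Rough sketch: time the download of a page plus only those resources
# hosted on the same host as the page, ignoring third-party content.
# Assumes resources are referenced via <img src>, <script src> and
# <link href>; resources injected by JavaScript will not be seen.
import time
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup


def same_host_load_time(page_url):
    start = time.time()

    # Fetch the HTML itself.
    html = requests.get(page_url).text

    # Collect candidate resource URLs from common tags/attributes.
    soup = BeautifulSoup(html, "html.parser")
    candidates = [tag.get("src") or tag.get("href")
                  for tag in soup.find_all(["img", "script", "link"])]

    base_host = urlparse(page_url).netloc
    for ref in candidates:
        if not ref:
            continue
        resource_url = urljoin(page_url, ref)
        # Skip anything linked to something outside the base URL.
        if urlparse(resource_url).netloc != base_host:
            continue
        requests.get(resource_url)

    return time.time() - start


if __name__ == "__main__":
    print("Same-host load time: %.2f s" % same_host_load_time("http://example.com/"))
```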


It's not a black and white question, IMO. For you, on a fast PC with a broadband connection and a modern web browser, waiting for some secondary element to load may not be a big deal in terms of "perceived page load time". The difference is marginal.

But for the guy at a corporate branch office with a bonded T1 shared by 100 people, running Internet Explorer through a centralized proxy that pushes every page load through a security appliance (McAfee, WebRoot, Finjan, etc.), things are different. The gap between when "the page looks loaded" and when it is actually loaded could be several seconds, which is a big deal. Some security appliances won't even deliver the page until everything has finished loading.

You should be demanding that your developers or vendors deliver quality services. If it's taking 5 seconds to load a web advertisement, there is no ad in front of your visitor's eyeballs for those 5 seconds.