How can I measure the response and loading time of a webpage?

One thing you need to take into account is the cache. Make sure you are measuring the time to download from the server, not from the cache, which means you will need to ensure that client-side caching is turned off.
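For example, if you are timing requests from .NET with HttpWebRequest, you can ask it to bypass any client-side cache via its CachePolicy property. A minimal sketch (here `address` is a placeholder for the URL you are testing):

```csharp
using System.Net;
using System.Net.Cache;

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(address);

// NoCacheNoStore forces the request to go to the server and
// prevents the response from being cached afterwards
request.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.NoCacheNoStore);
```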

Also be mindful of server-side caching. Suppose you download the page at 9:00 AM and it takes 15 seconds, then you download it at 9:05 and it takes 3 seconds, and finally at 10:00 it takes 15 seconds again.

What might be happening is that at 9:00 the server had to fully render the page, since there was nothing in its cache. At 9:05 the page was still in the cache, so it did not need to be rendered again. Finally, by 10:00 the cache had been cleared, so the page needed to be rendered by the server once more.

I highly recommend that you check out the YSlow add-on for Firefox, which will give you a detailed analysis of the time taken to download each of the items on the page.


If you just want to record how long it takes to get the basic page source, you can wrap an HttpWebRequest in a stopwatch. E.g.

// Requires: using System.Diagnostics; using System.IO; using System.Net;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(address);

Stopwatch timer = new Stopwatch();
timer.Start();

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    // GetResponse returns once the headers arrive; read the body so the
    // full download of the page source is included in the timing
    reader.ReadToEnd();
}

timer.Stop();

TimeSpan timeTaken = timer.Elapsed;

However, this will not take into account time to download extra content, such as images.

[edit] As an alternative to this, you may be able to use the WebBrowser control and measure the time between calling .Navigate() and the DocumentCompleted event firing. I think this will also include the download and rendering time of extra content. However, I haven't used the WebBrowser control a huge amount, so I don't know whether you have to clear out a cache if you are repeatedly requesting the same page.
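An untested sketch of that idea, assuming a WinForms form with a WebBrowser control named `browser` and `address` as the URL being measured:

```csharp
using System;
using System.Diagnostics;
using System.Windows.Forms;

Stopwatch timer = Stopwatch.StartNew();

browser.DocumentCompleted += (sender, e) =>
{
    // DocumentCompleted can fire once per frame on pages with iframes;
    // only stop the timer when the top-level document has finished
    if (e.Url == browser.Url)
    {
        timer.Stop();
        Console.WriteLine("Loaded in " + timer.Elapsed);
    }
};

browser.Navigate(address);
```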


Depending on how frequently you need to do it, maybe you can try using Selenium (an automated testing tool for web applications); since it uses a real web browser internally, you will get a pretty close measure. I think it would not be too difficult to use the Selenium API from a .NET application (you can even use Selenium in unit tests).
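A rough sketch of what this could look like, assuming the Selenium WebDriver .NET bindings with Firefox (GoToUrl normally blocks until the browser reports that the page has loaded, which is what makes the timing meaningful):

```csharp
using System;
using System.Diagnostics;
using OpenQA.Selenium;
using OpenQA.Selenium.Firefox;

class PageLoadTimer
{
    static void Main()
    {
        // Each run starts a fresh browser instance, so there is no
        // leftover client-side cache between measurements
        using (IWebDriver driver = new FirefoxDriver())
        {
            Stopwatch timer = Stopwatch.StartNew();
            driver.Navigate().GoToUrl("http://example.com/");
            timer.Stop();

            Console.WriteLine("Loaded in " + timer.Elapsed);
        }
    }
}
```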

Measuring this kind of thing is tricky because web browsers have some particularities in how they download all the page's elements (JS, CSS, images, iframes, etc.). These particularities are explained in the excellent book High Performance Web Sites (http://www.amazon.com/High-Performance-Web-Sites-Essential/dp/0596529309/).

A homemade solution would probably be too complex to code, or would fail to account for some of those particularities (measuring the time spent downloading the HTML alone is not good enough).