Hi,
I'm currently measuring the time it takes to load a web page from a C# program.
The Visual Studio solution contains a console application and an ASP.NET website with a single page, hosted on the ASP.NET Development Server. The console application queries the web page like this:
bool isSuccess;
Stopwatch timeSpentToDownloadPage = Stopwatch.StartNew();
WebRequest request = WebRequest.Create(new Uri("http://localhost:12345/Test.aspx", UriKind.Absolute));
request.Timeout = 200;
using (WebResponse response = request.GetResponse())
using (Stream responseStream = response.GetResponseStream())
using (StreamReader sr = new StreamReader(responseStream))
{
    string responseText = sr.ReadToEnd().Trim();
    isSuccess = (responseText == "Hello World");
}
timeSpentToDownloadPage.Stop();
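To check whether the elapsed time is truly zero or simply rounds down below one millisecond, I can report sub-millisecond timing with Stopwatch.ElapsedTicks and Stopwatch.Frequency (both standard members of System.Diagnostics.Stopwatch). A minimal sketch:

```csharp
using System;
using System.Diagnostics;

class TimerResolutionCheck
{
    static void Main()
    {
        Stopwatch sw = Stopwatch.StartNew();
        // ... place the request/response code shown above here ...
        sw.Stop();

        // ElapsedMilliseconds truncates; the raw tick count keeps the fraction.
        double microseconds = sw.ElapsedTicks * 1000000.0 / Stopwatch.Frequency;
        Console.WriteLine("Elapsed: {0} ms ({1:F1} us)",
            sw.ElapsedMilliseconds, microseconds);
    }
}
```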
The web page has nothing special: the code-behind just writes a response on load, and the .aspx markup contains no ASP.NET controls:
protected void Page_Load(object sender, EventArgs e)
{
    Response.Write("Hello World");
}
Every time I run this, the stopwatch reports 0 ms for all the work (querying the server, getting the response, etc.).
How is that possible? Is there something wrong with what I'm doing? I expected at least 10-20 ms to execute the client-side code, plus roughly 100 ms to:
- [client side] Resolve the page to query from the URI (no DNS lookup is needed for localhost, so this should be fast),
- [client side] Send the request,
- [server side] Initialize the ASP.NET engine,
- [server side] Process the request,
- [server side] Find and read the .aspx file and execute the compiled code (reading the file alone could cost several milliseconds),
- [server side] Build the response, including headers, then send it,
- [client side] Receive the response and process it (trim).
So why is it so extremely fast? Is there some tricky cache that skips all or most of these steps and returns "Hello World" to the client?
If there is a "hidden" cache, where is it, and how can I disable it so I can measure the "real" time spent?
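For what it's worth, one way I thought of to make the per-request cost visible is to repeat the request many times and average, discarding the first call (which pays the one-time ASP.NET compilation and JIT cost). A rough sketch, assuming the same Test.aspx URL as above:

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Net;

class AverageTiming
{
    const string Url = "http://localhost:12345/Test.aspx"; // same page as above

    static string Fetch()
    {
        WebRequest request = WebRequest.Create(Url);
        using (WebResponse response = request.GetResponse())
        using (StreamReader sr = new StreamReader(response.GetResponseStream()))
        {
            return sr.ReadToEnd().Trim();
        }
    }

    static void Main()
    {
        Fetch(); // warm-up: the first request pays compilation/JIT cost

        const int iterations = 100;
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            Fetch();
        }
        sw.Stop();

        Console.WriteLine("Average per request: {0:F3} ms",
            sw.Elapsed.TotalMilliseconds / iterations);
    }
}
```

Averaging over many iterations would at least show whether each round trip costs, say, 0.3 ms rather than a literal zero.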