We have an ASP.Net app that behaves strangely under IIS6. The app itself is a straightforward ASP.Net 2.0 WebForms deal, nothing too weird going on in there (there are a couple of HTTP modules in the pipeline, but I wouldn't consider those weird :) ). The thing I don't understand is the page execution times, or, more specifically, the difference between the time reported by ASP.Net tracing (trace.axd) and the time observed by the client (Fiddler). When the app runs on the developer's box (WinXP, IIS5.1), the times reported by ASP.Net and Fiddler are very close:
    Page exec time:              0.0919834
    Fiddler Total Sequence time: 0.1560980
I can understand 60ms being spent getting 5KB worth of data from IIS to Fiddler (both of which run on the same machine, BTW). Now, when we move the code to the server (Win2k3, IIS6), the picture changes dramatically:
    Page exec time:              0.1702014
    Fiddler Total Sequence time: 0.5156283
This is the same page, and Fiddler is again running on the same machine as the code. Why does it suddenly take ~350 ms to deliver the same 5 KB?
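For what it's worth, here's roughly how I'd try to narrow down where the time goes -- a throwaway HTTP module (the class name and the X-Server-Elapsed-Ms header below are just placeholders, not something from our actual app) that stamps BeginRequest/EndRequest and reports the total server-side time, so it can be lined up against both the trace.axd figure and the Fiddler figure:

    using System;
    using System.Diagnostics;
    using System.Web;

    // Sketch only: measures the whole ASP.Net pipeline, not just the Page,
    // and surfaces the result as a response header visible in Fiddler.
    public class RequestTimingModule : IHttpModule
    {
        public void Init(HttpApplication app)
        {
            app.BeginRequest += delegate(object sender, EventArgs e)
            {
                ((HttpApplication)sender).Context.Items["reqStart"] = Stopwatch.GetTimestamp();
            };

            app.EndRequest += delegate(object sender, EventArgs e)
            {
                HttpContext ctx = ((HttpApplication)sender).Context;
                object start = ctx.Items["reqStart"];
                if (start != null)
                {
                    double ms = (Stopwatch.GetTimestamp() - (long)start) * 1000.0 / Stopwatch.Frequency;
                    // Assumes the response hasn't been flushed yet at EndRequest.
                    ctx.Response.AppendHeader("X-Server-Elapsed-Ms", ms.ToString("F1"));
                }
            };
        }

        public void Dispose() { }
    }

Registered via <httpModules> in web.config on both boxes, the idea is: if the header value tracks the trace.axd number rather than the Fiddler number, the extra ~350 ms is being spent outside the ASP.Net pipeline (IIS/http.sys, auth handshakes, network stack) rather than in our code.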
PS. On both machines the results are obtained by accessing the URL via the actual machine's hostname, e.g. http://machinename/app/page.aspx (as opposed to http://localhost/app/page.aspx).
PPS. Configuration-wise, the setups of the dev box and the server are as close as we could make them -- both use the exact same web.config. Both hit the database (SQL Server) with integrated auth, so the app runs under a domain account. The app uses forms authentication and does NOT impersonate (i.e. it always runs under that same account). The way this is configured differs between IIS5 and IIS6: on IIS5 the account is specified in the <processModel> tag in machine.config, while on IIS6 it's the AppPool identity setting. The setup seems pretty typical for both environments, and I can't imagine it causing 350 ms delays...
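For reference, the relevant bits look roughly like this (snippets are abridged, and the login URL and account name are placeholders, not our real values):

    <!-- web.config: identical on both boxes -->
    <system.web>
      <authentication mode="Forms">
        <forms loginUrl="Login.aspx" />
      </authentication>
      <identity impersonate="false" />
    </system.web>

    <!-- machine.config on the IIS5 dev box only -->
    <processModel userName="DOMAIN\AppAccount" password="..." />

On the IIS6 server the same domain account is set as the AppPool identity in IIS Manager instead, so there's no <processModel> override there.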