Over the last few weeks I've been subject to a sudden and significant performance deterioration when browsing locally hosted ASP.NET 3.5 MVC web applications (C#). Load times for a given page average 20 seconds regardless of content; start-up usually takes over a minute. These applications run fast on production and even test systems (the test system is comparable to my development environment).

I am running IIS 6.0, VS2008, Vista Ultimate, SQL2005, .NET 3.5, MVC 1.0, and we use VisualSVN 1.7.

My SQL DB is local and IPv6 does not seem to be the cause. I browse in Firefox and IE8 outside of Debug mode using loopback, machine name, and 'localhost' and get the exact same results every time (hence DNS doesn't seem to be the issue either).

Below are screen shots of my dotTrace output.

http://www.glowfoto.com/static_image/28-100108L/3123/jpg/06/2010/img4/glowfoto

This issue has made it near impossible to debug/test any web app. Any suggestions very much appreciated!

A: 

Try creating a new, default MVC2 application in a new web folder. Build and browse it. If your load times are okay with the new app, then there's something up with your application. If not, it's outside of the context of the app and you should start looking at IIS config, extensions, hardware, network, etc.

In your app, back up your web config and start with a new, default web.config. That should disable any extensions or handlers you've installed. If that fixes your load times, start adding stuff from the old web.config into the new one in small blocks until the problem reappears, and in that way isolate the offending item.
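As a sketch of what that stripped-down starting point might look like (the contents here are illustrative, not a drop-in replacement; section names are the standard ASP.NET ones):

```xml
<?xml version="1.0"?>
<configuration>
  <system.web>
    <compilation debug="true" />
    <!-- Re-add httpModules, httpHandlers, customErrors, etc. from the old
         web.config one block at a time until the slowdown reappears -->
  </system.web>
</configuration>
```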

I call this "binary search" debugging. It's tedious, but actually works pretty quickly and will most likely identify the problem when we get stuck in one of those "BUT IT SHOULD WORK!!!" modes.

Update Just a thought: to rule out IIS config, try running the site under Cassini/built-in dev server.

David Lively
Thank you for your suggestion, David. The new MVC project loaded up relatively quickly. However, starting with a default web.config on the project in question resulted in the same slowness, even with only the minimal sections re-added.
alan
Huh. Well, at least this tells you that there is something in your app causing the issue. Anything new in your global.asax.cs? Also, your IIS folder is local, right?
David Lively
I've got to figure out how to run Cassini rather than IIS, but I'll give it a try. No, nothing new in my Global.asax. Yes, IIS folder runs directly from my project Bin.
alan
Quick question: I just realized that in the project properties, on the Web tab, I ask it to use the Visual Studio Development Server, and I have not specified an IIS server. Is it safe to say that IIS is not the cause if I experience the same slowness in debug? IIS is local.
alan
Both Debug and Locally browsed apps experience the same slowness.
alan
@Alan, since you've got an app that works correctly and one that doesn't, I'd start moving parts over from one to the other. I realize that is a huge PITA in any sizeable application, but at least you'll be able to make progress. Also, are you sure you're not eating exceptions that might be letting you know what's wrong? Database unavailable, etc.?
David Lively
Thank you so much for keeping up on this thread, I was afraid I'd been forgotten. I created a second (less desirable) development environment on a spare PC and the app works perfectly with code that is identical to that of my original (slow) dev environment. I'm not sure where to go from here; I've eliminated the code as the culprit.
alan
Stupid question: you said a couple of comments back that this happens when running under both Cassini and IIS. When you browse the site on the first (broken) dev machine, are you hitting the port that IIS is using or the one Cassini is using?
David Lively
I am not entirely sure if I'm using Cassini; I was using "Use Visual Studio Development Server" on the "Web" tab of the project properties, which auto-assigns a port dynamically.
alan
I am not sure if this bypasses the IIS hosting. An example URL when using the VS Dev Server is https://localhost:23423/, and when browsed through IIS it would be https://localhost/.
alan
If you're browsing a different port than the one used by the dev server, then you're most likely going through IIS. In that case VS won't catch any exceptions that are happening in your app, and IIS config becomes a significant factor in whether or not your application will work. Browse through localhost:23423 instead.
David Lively
I attempted to browse localhost:23423 and received a page-not-found error; this makes sense, since IIS should only be listening on ports 443 and 80.
alan
A: 

You could download Fiddler to measure how long each request takes and get some concrete numbers.

http://blogs.msdn.com/b/tess/archive/2009/11/06/recap-of-oredev-and-some-net-debugging-videos.aspx

This video might help...

Thanks for the suggestion. Is that different from the dotTrace output I posted? It provides time measurements for calls. I just need to know before I devote the time to downloading, installing, logging and posting the results (I have many projects on my plate).
alan
You may not get a whole lot of difference with Fiddler. However, in the video you can see the web app slow down; they take a stack dump and are able to analyze the threads to see why the app has slowed. I would check that approach out.
+1  A: 

Surely the big red flag on that profiler output is the fact that AddDirectory is called 408 times and AddExistingFile is called 66,914 times?

Can you just confirm that there's not just a shed load of directories and files underneath your MVC app's root folder? Because it looks like the framework is busying itself trying to work out what files it needs to build (or add watches to) on startup.

[I am not au fait with MVC and so maybe this is not what is happening but 67k calls to a function with a name like "AddExistingFile" does smell wrong].
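If you want to quantify that, a rough sketch of a console check (the path is a placeholder for your app's root folder; this is just a diagnostic aid, not part of the framework) could look like this:

```csharp
using System;
using System.IO;
using System.Linq;

class FileCensus
{
    static void Main()
    {
        // Placeholder path - point this at your MVC app's root folder.
        string root = @"C:\inetpub\wwwroot\MyMvcApp";

        // SearchOption.AllDirectories recurses the whole tree (available on .NET 3.5).
        string[] files = Directory.GetFiles(root, "*", SearchOption.AllDirectories);
        Console.WriteLine("Total files: {0}", files.Length);

        // Show the ten most crowded directories.
        var crowded = files
            .GroupBy(f => Path.GetDirectoryName(f))
            .OrderByDescending(g => g.Count())
            .Take(10);

        foreach (var dir in crowded)
            Console.WriteLine("{0,8} files in {1}", dir.Count(), dir.Key);
    }
}
```

If the total comes anywhere near the 66,914 calls in the profiler trace, that would strongly suggest the framework is walking your app's file tree on startup.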

Chris F
There are in fact 53,308 files and 1,684 folders under the root directory of the solution; for the specific project, 27,029 files and 445 folders. I have been unloading all other projects in the solution while working with the one in question, but they all run just as slow.
alan
I think it may be time to consider splitting the project up. Group classes into logical units and refactor them into separate class libraries. Remember, even if you change only one file in the project, every single file must be recompiled. If you split the project up, only the files in the project that changed will get rebuilt. I had one solution's typical build time go from over a minute build time to 2 seconds when I did that. Also, if you have any DBMLs I always put those in their own project due to their large size.
Ryan
@Ryan Unfortunately, it's not my call to split them up; I'm only a junior developer. However, I achieve the same result with "Unload Project" (Visual Studio 2008), which I can use to exclude irrelevant projects from the build.
alan
@alan You could achieve the same result by defining your own build configurations. They're in the Build > Configuration Manager menu.
Ryan
+1  A: 

I've learnt that it's usually a "smell" when things fail near a power of two ...

Given

Over the last few weeks I've been subject to a sudden and significant performance deterioration

and

AddExistingFile is called 66,914 times

I'm wondering if the poor performance kicked in at about the time the number of files exceeded 65,535 ...
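For what it's worth, 65,535 is 2^16 - 1, the largest value a 16-bit unsigned integer can hold. A purely hypothetical illustration (nothing in the trace says the framework actually uses a ushort here) of why counts near that boundary smell suspicious:

```csharp
using System;

class PowerOfTwoSmell
{
    static void Main()
    {
        ushort count = ushort.MaxValue;  // 65,535 = 2^16 - 1
        count++;                         // default unchecked arithmetic wraps around
        Console.WriteLine(count);        // prints 0
    }
}
```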

Other possibilities to consider ...

  • Are all 66,914 files in the same directory? If so, that's a lot of directory blocks to access ... try a hard drive defrag. In fact, it's even more directory blocks if they're distributed across a bunch of directories.

  • Are you storing all the files in the same list? Are you presetting the capacity of that list, or allowing it to "grow" naturally and slowly?

  • Are you scanning for files depth first or breadth first? Caching by the OS will favor the performance of depth first.

Update 14/7

Clarification of Are you storing all the files in the same list?

Naive code like this first example doesn't perform well because it needs to reallocate storage space as the list grows.

var myList = new List<int>();   // capacity starts at 0 and doubles as needed
for (int i = 0; i < 10000; i++)
{
    myList.Add(i);              // each capacity increase copies the backing array
}

It's more efficient, if you know it, to initialize the list with a specific capacity to avoid the reallocation overhead:

var myList = new List<int>(10000);  // Capacity is 10000
for (int i=0; i<10000; i++)
{
    myList.Add(i);
}

Update 15/7

Comment by OP:

These web apps are not programmatically probing files on my hard disk, at least not by my hand. If there is any recursive file scanning, it's by VS 2008.

It's not Visual Studio that's doing the file scanning - it is your web application. This can clearly be seen in the first profiler trace you posted - the call to System.Web.Hosting.HostingEnvironment.Initialize() is taking 49 seconds, largely because of 66,914 calls to AddExistingFile(). In particular, the read of the property CreationTimeUTC is taking almost all the time.

This scanning won't be random - it's either the result of your configuration of the application, or the files are in your web applications file tree. Find those files and you'll know the reason for your performance problems.

Bevan
+1 There is probably a problem with the way the system is designed if you need to access each file in a directory several times when you load a page. Also, he will hit this problem in production as the number of files increases.
Shiraz Bhaiji
@Bevan Can you suggest a high-quality defragger? I used the default Windows one to no avail. How do you mean "storing all files in the same list"? These web apps are not programmatically probing files on my hard disk, at least not by my hand. If there is any recursive file scanning, it's by VS 2008.
alan
@Shiraz I have a comparable test environment and second development environment, both of which do not experience this issue.
alan
I've used [MyDefrag](http://www.mydefrag.com/index.html) to pretty good effect.
Bevan
I used MyDefrag; no improvement in performance. Any suggestions on how to locate where these 66k+ calls are coming from? I'm confused; it must be some assembly added recently. However, I was unable to pinpoint which assembly it could be.
alan
[Process Monitor](http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx) from SysInternals will show you file system activity, should give you a pointer in the right direction.
Bevan
If I'm being stubborn let me know, but I think the file scanning is a symptom and not a cause. I run this very same app with the very same configuration on no less than 4 other machines; only one of which is a server in the truest sense of the term. The rest of the systems are desktop development environments and test systems.
alan
File scanning may well be a symptom - but it's your own performance analysis that shows that the majority of the startup time is being taken by the call to `HostingEnvironment.Initialize()`. If you don't believe the results, measure again - and again.
Bevan
A: 

The solution was to format and do a clean install of Vista, SQL Server 2005, Visual Studio 2008, IIS6 and the whole lot. I am now able to debug, without consequence, the very same web app(s) I was experiencing the problems with initially. This leads me to believe the problem lay within one of the installations above and must have been aggravated by a software update or by the addition of other software.

alan