I am working with a legacy ASP Classic solution which is load balanced (via external hardware) and has an IIS site whose home directory is a UNC path. I've been told that the following issues with this setup currently exist:

  1. When using a UNC path as the home directory, there is an "index" somewhere in IIS which "caches" up to a certain number of files of certain types, and once the limit, which defaults to 50, has been reached, subsequent requests for pages not in the cache return 404.
  2. When using a UNC path as the home directory, when the IIS site starts, the aforementioned "cache" starts filling, which bogs IIS down until the cache is full, meaning that huge sites (15,000 .asp files) are unavailable for up to 30 minutes after the IIS site starts.
  3. When using a UNC path as the home directory, if more than a certain number of simultaneous requests are made to the site, Windows will hit the "Network BIOS command limit per server", and all requests above the limit have to wait until IIS "closes the session" to the server. I am told the limit is 100 files and is not configurable.

Now, all this sounds a bit weird. If I set up a new Windows 2003 server with default settings, and use it to host an ASP Classic application with 15,000 .asp files, using a share on a server as the home directory for the IIS site, will I actually run into these problems? And if so, is there a way to counter them without changing the architecture?

(To clarify, the load balancing only matters because it is the reason the files are on a share on another server. If no load balancing were needed, the files could be on the local disk.)

+1  A: 

For issue 3 you can change the Network BIOS command limit. It's a pretty easy registry fix: http://support.microsoft.com/kb/810886/en-us
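
On the web server (the machine that maps the UNC path), the value in question is MaxCmds under the workstation service parameters, if I remember the KB article right. A rough sketch of the change as a script (2048 is just an example value; verify the names and ranges against the article before applying, and reboot afterwards):

    ' Sketch: raise the SMB redirector's outstanding-command limit (MaxCmds)
    ' on the web server, as described in KB 810886. Run as an administrator.
    Const CMDS_KEY = "HKLM\SYSTEM\CurrentControlSet\Services\LanmanWorkstation\Parameters\MaxCmds"
    Dim shell : Set shell = CreateObject("WScript.Shell")

    On Error Resume Next
    WScript.Echo "Current MaxCmds: " & shell.RegRead(CMDS_KEY)   ' an error here means the value is absent (default applies)
    On Error GoTo 0

    shell.RegWrite CMDS_KEY, 2048, "REG_DWORD"                   ' example value only; tune to your load
    WScript.Echo "MaxCmds set to 2048; reboot for the change to take effect."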

I have run into that particular issue myself.

Gthompson83
+2  A: 

I'm not sure about your direct question on the interaction between IIS and UNC, but on a busy site (anything busy enough to require load balancing) I would suggest you consider something other than a file share.

An ASP page loaded by IIS across the network (i.e. from a file share) suffers a performance penalty (latency).

I would suggest using something like robocopy to keep all load balanced servers in sync with a central master. In other words, deploy to a single master server (or single master location), then robocopy the files to each slave in the load balancer's pool.
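
Something along these lines, run from the deployment box, is all it takes (the server names, share names and robocopy switches below are placeholders, not a recommendation):

    ' Sketch: mirror the master copy of the site out to each web server in the pool.
    ' Server names, paths and robocopy switches are illustrative only.
    Option Explicit
    Dim shell : Set shell = CreateObject("WScript.Shell")
    Dim master : master = "\\deploy01\wwwroot-master"
    Dim servers : servers = Array("web01", "web02", "web03")
    Dim node, cmd, rc
    For Each node In servers
        ' /MIR mirrors the tree; /R:2 /W:5 keeps retries short so one dead node doesn't stall the run
        cmd = "robocopy """ & master & """ ""\\" & node & "\wwwroot$"" /MIR /R:2 /W:5"
        rc = shell.Run("cmd /c " & cmd, 0, True)   ' robocopy exit codes below 8 indicate success
        WScript.Echo node & ": robocopy exit code " & rc
    Next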

This will not only remove the weird UNC issues you describe, but should also give you a pretty heavy performance boost by removing the network hit when loading ASP pages.

therealhoff
Yeah, I agree. Actually, I'd prefer DFS over RoboCopy - more configuration, but fewer sleepless nights. As usual, though, external factors dictate this particular set-up.
bzlm
+2  A: 

Yes, it is possible, but yes, it can cause problems.

When ASP.NET compiles ASPX, ASCX, and other content pages into assemblies, it creates a lot of FileSystemWatchers to monitor the dependencies between them, so that it can recompile when files change. These eat up NetBIOS resources.

Additionally, every File.Exists or Directory.Exists call, or any other kind of IO against the site's serving path, increases the demands on the NetBIOS limits as well.
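
The question is about Classic ASP rather than ASP.NET, but the same point applies there: any FileSystemObject call in a page goes back out to the UNC share and occupies one of those command slots while it runs, on top of IIS reading the .asp file itself. A contrived Classic ASP fragment, just to illustrate the kind of per-request IO being described (the include name is made up):

    <%
    ' Every request that runs this touches the share again, and each such
    ' operation counts against the same SMB/NetBIOS command limit.
    Dim fso : Set fso = Server.CreateObject("Scripting.FileSystemObject")
    If fso.FileExists(Server.MapPath("banner.inc")) Then
        Response.Write "banner available"
    End If
    %>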

It is possible to raise the NetBIOS limits above their defaults through the registry, up to a point.
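
Concretely, there are two halves to it: MaxCmds on the web server (the SMB client) and MaxMpxCt on the file server hosting the share; if I recall correctly, the KB article linked in the first answer covers both. A sketch of the file-server half (the value is an example; check the article for the exact names and limits before applying):

    ' Sketch: raise the SMB server's per-client concurrent-command limit (MaxMpxCt)
    ' on the machine hosting the content share. Restart the Server service or reboot afterwards.
    Const MPX_KEY = "HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters\MaxMpxCt"
    Dim shell : Set shell = CreateObject("WScript.Shell")
    shell.RegWrite MPX_KEY, 2048, "REG_DWORD"   ' example value; should be at least as large as the clients' MaxCmds
    WScript.Echo "MaxMpxCt set; restart the Server service for it to take effect."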

For a small site, with relatively few directories and files, you could very successfully run off a UNC share, because after startup ASP.NET continues to run from its compiled assemblies. However, the more directories and files you add, the more likely problems are to crop up.

We tried running a mammoth site (hundreds of directories and ASPX/ASCX files) and it would run fine for a few minutes, until enough URLs had been accessed that the NetBIOS limits were reached, and then every subsequent page view resulted in an exception. We ended up having to use a robocopy-based publishing solution.

In the end, you have to test to see whether your site is small enough and your NetBIOS settings are high enough to run effectively. I would suggest using a spider on a test site so that you can be sure that everything that could be compiled or accessed is hit at least once.
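
Instead of a full spider, since the content is just a tree of .asp files on the share, a crude warm-up script that requests each file once does the same job. A rough sketch (the share path, test host name and URL mapping are made up for the example):

    ' Sketch: request every .asp file under the share once, so templates get
    ' compiled/cached and any change-notification handles are set up before real traffic.
    Option Explicit
    Dim fso  : Set fso  = CreateObject("Scripting.FileSystemObject")
    Dim http : Set http = CreateObject("MSXML2.ServerXMLHTTP")
    Dim rootPath : rootPath = "\\fileserver\wwwroot"      ' hypothetical content share
    Dim rootUrl  : rootUrl  = "http://testsite.local"     ' hypothetical test site

    WarmFolder fso.GetFolder(rootPath)

    Sub WarmFolder(folder)
        Dim f, subFolder, url
        For Each f In folder.Files
            If LCase(fso.GetExtensionName(f.Path)) = "asp" Then
                ' Map \\fileserver\wwwroot\dir\page.asp -> http://testsite.local/dir/page.asp
                url = rootUrl & Replace(Mid(f.Path, Len(rootPath) + 1), "\", "/")
                http.Open "GET", url, False
                http.Send
                WScript.Echo http.Status & "  " & url
            End If
        Next
        For Each subFolder In folder.SubFolders
            WarmFolder subFolder
        Next
    End Sub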

David
In the end, everything runs smoothly after adjusting the NetBIOS limits etc. The start-up latency for FileSystemWatchers and ASP compilation turned out to be tolerable.
bzlm