tags:
views: 328
answers: 5
Hey,

We've got a process that redeploys ASP.NET websites; the deployment code is itself an ASP.NET application. The current method, which has worked for quite a while, is simply to loop over all the files in one folder and copy them over the top of the files in the webroot.

The problem that's arisen is that occasionally files end up being in use and hence can't be copied over. In the past this was intermittent to the point that it didn't matter, but on some of our higher-traffic sites it now happens the majority of the time.

I'm wondering if anyone has a workaround or alternative approach to this that I haven't thought of. Currently my ideas are:

  1. Simply retry each file until it works (a rough sketch of what I mean is below this list). That's going to cause errors for a short time, though, which isn't really good.
  2. Deploy to a new folder and update IIS's webroot to the new folder. I'm not sure how to do this short of running the application as an administrator and running batch files, which is very untidy.
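
For reference, option 1 would look roughly like this - just a sketch, with sourceDir/webRoot and the retry/delay values made up:

    // Rough sketch of option 1: copy each file, retrying briefly when it's locked.
    // sourceDir, webRoot and the retry/delay values are only placeholders.
    using System.IO;
    using System.Threading;

    static class RetryDeploy
    {
        public static void CopyWithRetry(string sourceDir, string webRoot)
        {
            foreach (var sourcePath in Directory.GetFiles(sourceDir, "*", SearchOption.AllDirectories))
            {
                var relative = sourcePath.Substring(sourceDir.Length).TrimStart(Path.DirectorySeparatorChar);
                var targetPath = Path.Combine(webRoot, relative);
                Directory.CreateDirectory(Path.GetDirectoryName(targetPath));

                for (var attempt = 1; ; attempt++)
                {
                    try
                    {
                        File.Copy(sourcePath, targetPath, true);   // overwrite the existing file
                        break;
                    }
                    catch (IOException)
                    {
                        if (attempt >= 10) throw;   // give up after ~5 seconds of retrying
                        Thread.Sleep(500);          // presumably still locked by IIS - wait and try again
                    }
                }
            }
        }
    }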

Does anyone know what the best way to do this is, or if it's possible to do #2 without running the publishing application as a user who has admin access? (I'm willing to grant it special privileges, but I'd prefer to stop short of administrator.)

Edit
Clarification of infrastructure... We have 2 IIS 7 webservers in an NLB running their webroots off a shared NAS (To be more clear, they're using the exact same webroot on the NAS). We do a lot of deploys, to the point where any approach we can't automate really won't be viable.

A: 

You could also try to modify the timestamp of web.config in the root folder before attempting to copy the files. This will unload the application and free used files.
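
Something along these lines should do it - a minimal sketch, where webRoot is whatever folder your deploy code already points at:

    // Sketch: touching web.config makes ASP.NET unload the app domain,
    // which releases the files it had in use before the copy starts.
    using System;
    using System.IO;

    static class AppUnloader
    {
        public static void UnloadApplication(string webRoot)
        {
            var configPath = Path.Combine(webRoot, "web.config");
            File.SetLastWriteTimeUtc(configPath, DateTime.UtcNow);
            // ...then run your existing copy-over-the-top loop.
        }
    }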

Darin Dimitrov
There's still the race condition between the copy process and IIS recycling the worker process. So yeah, it might work, but it might not either.
Franci Penov
I'd have thought that would make it more likely to be accessing the files if a new request came in, because it'd have to reload everything? Unless it's banking on getting in before the next request, and if that's the case I really don't think that's likely on the sites I'm talking about. One in particular runs APIs that get hit extremely often.
Tim Schneider
+7  A: 

What you need to do is temporarily stop IIS from processing any incoming requests for that app, so you can copy the new files and then start it again. This will lead to a short downtime for your clients, but unless your website is mission critical, that shouldn't be too big of a problem.

ASP.NET has a feature that targets exactly this scenario. Basically, it boils down to temporarily creating a file named App_Offline.htm in the root of your webapp. Once the file is there, IIS will take down the worker process for your app and unload any files in use. Once you've copied over your files, you can delete App_Offline.htm and IIS will happily start churning again.

Note that while that file is there, IIS will serve its content as a response to any requests to your webapp. So be careful what you put in the file. :-)
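
A rough sketch of the deploy step - the paths and the message are just placeholders for your own settings:

    // Sketch: drop App_Offline.htm, copy the new files, then remove it again.
    // sourceDir and webRoot are placeholders for your own deploy settings.
    using System.IO;

    static class AppOfflineDeploy
    {
        public static void Deploy(string sourceDir, string webRoot)
        {
            var offlinePath = Path.Combine(webRoot, "App_Offline.htm");

            // While this file exists, IIS serves its content for every request to the app.
            File.WriteAllText(offlinePath, "<html><body>Down for maintenance - back in a moment.</body></html>");

            try
            {
                foreach (var sourcePath in Directory.GetFiles(sourceDir, "*", SearchOption.AllDirectories))
                {
                    var relative = sourcePath.Substring(sourceDir.Length).TrimStart(Path.DirectorySeparatorChar);
                    var targetPath = Path.Combine(webRoot, relative);
                    Directory.CreateDirectory(Path.GetDirectoryName(targetPath));
                    File.Copy(sourcePath, targetPath, true);
                }
            }
            finally
            {
                // Deleting the file lets IIS spin the app back up on the next request.
                File.Delete(offlinePath);
            }
        }
    }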

Franci Penov
Hey, good advice - we have this feature in place but not every site uses it. The site that's the biggest issue does use it, though, and it's not really solving the problem... do we have to wait after putting this file in, or does it not work on virtual application directories? It seems to take the site down, so I'd have thought it'd release handles too, but IIS seems to feel free to hold them for a few minutes after...
Tim Schneider
Along these lines, we have multiple web servers in a farm and drop them out individually for updates. The process is: 1. pull the server from the farm, 2. update its code, 3. run iisreset, 4. put it back in rotation, 5. move on to the next server.
Chris Lively
Intriguing idea, but we only have two webservers and they both run off a shared NAS for their webroot (Redundancy rather than performance - the NAS is redundant also), so the files are in use by both the webservers. Plus with the number of sites we deploy (20 in the last week - though only 1 of those actually takes enough traffic to get this issue) it'd be a huge overhead without a tool that managed the process.
Tim Schneider
@fyjham - that will be why the App_Offline's not working for you - as both servers are using the same file system, if you have one up and running, it will potentially always have a lock on some files.
Zhaph - Ben Duguid
But surely the app offline being on the shared filesystem should affect both webservers?
Tim Schneider
Won't let me edit to add more details... the one App_Offline takes both servers offline, but they STILL hold locks on the file system even with App_Offline up (at least for a short time - they tend to let go some time in the next few minutes).
Tim Schneider
Hey, this didn't solve the issue but it was the best answer and the bounty's about to run out so I'll select it. If anyone has any idea why it's not working under my setup though I'd love to know.
Tim Schneider
A: 

Unless you're manually opening a handle to a file on your web server, IIS won't keep locks on your files.

Try shutting down other services that might be locking your files. Some examples of common services that do just that:

  • Windows Search
  • Google Desktop Search
  • Windows Backup
  • any other anti-virus or indexing software
Fábio Batista
Hi, this is running on a server. We don't have any of that stuff other than anti-virus, and I've already confirmed that it's IIS that holds the locks on the files. The files in question are typically DLLs and the DLL in question changes, and I can't see any reason anyone would be opening explicit file handles to the DLLs (if it were images or text files I might look into that, but for DLLs it seems unlikely).
Tim Schneider
+2  A: 

Another solution is IIS Programmatic Administration.

You can copy your new/updated web app to an alternative directory, then switch the IIS root of your webapp to that alternative directory. Then it doesn't matter if files are locked in the original root. This is a good solution for website availability.

However it requires some permission tuning...

You can do it via ADSI or WMI for IIS 6 or Microsoft.Web.Administration for IIS 7.

Regarding your option 2, note that WMI doesn't require administrator privileges the way ADSI does. You can configure rights per object - check your WMI console (mmc).
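
For IIS 7, a minimal sketch with Microsoft.Web.Administration could look like this (the site name and target folder are placeholders, and the account running it needs rights to the IIS configuration):

    // Sketch: repoint the site's root virtual directory at a freshly deployed folder.
    // siteName and newPhysicalPath are placeholders for your own values.
    using Microsoft.Web.Administration;

    static class WebRootSwitcher
    {
        public static void SwitchWebRoot(string siteName, string newPhysicalPath)
        {
            using (var manager = new ServerManager())
            {
                var rootVDir = manager.Sites[siteName].Applications["/"].VirtualDirectories["/"];
                rootVDir.PhysicalPath = newPhysicalPath;
                manager.CommitChanges();
            }
        }
    }

You'd call it after the copy to the new folder finishes, e.g. SwitchWebRoot("MyWebsite", @"D:\Deploys\MyWebsite_v2") - again, those names are just examples.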

JoeBilly
+1  A: 

Since you're already load balancing between 2 web servers, you can:

  1. In the load balancer, take web server A offline, so only web server B is in use.
  2. Deploy the updated site to web server A.
  3. (As a bonus, you can do an extra test pass on web server A before it goes into production.)
  4. In the load balancer, take B offline and put A online, so only web server A is in use.
  5. Deploy the updated site to web server B.
  6. (As a bonus, you can do an extra test pass on web server B before it goes into production.)
  7. In the load balancer, put B back online. Now both web servers are upgraded and back in use in production.
C. Dragon 76
Hey, I've clarified my question a bit more - when I said they used a shared NAS I meant they have the same webroot (meaning if we turn off one webserver the other will still use its files). Plus, unless there's a way to automate this process it's still not really viable (we do as many as 20 website deploys in a week - not of the same sites, of course - and it's just too much overhead unless it's inevitable). Gave it an up-vote though because it is a pretty good answer; it's just the specifics of our setup that make it not work.
Tim Schneider
I see. What's your reasoning for hosting the web application files on a NAS?
C. Dragon 76