I am developing (as a solo web developer) a rather large web-based system which needs to run at various different locations. Unfortunately, because some clients are on dialup, we have had to deploy a copy at each site rather than run a central server for them all. Each client is part of our VPN, and those on dialup/ISDN get dialled on demand by our Cisco router. All clients are accessible within a matter of seconds.

I was wondering what the best way would be to release an update to all these clients at once. Automation would be great, as there are 23+ locations to deploy the system to, each of which is in very regular use. Because of this, when deploying I need to display an 'updating' page so that users don't try to access the system while the update is only partially complete.

Any thoughts on what would be the best solution?

EDIT: Found FileSyncTask which allows me to rsync with Phing. Going to use that.

A: 

Hi,

At the company I work for, we build huge web-based systems in both Java and PHP. For every system we have separate development and production environments. The company has over 200 developers, so I guess you can imagine the size of the products we develop.

What we have done is use ANT and RPM build archives to create deployment packages. This is done quite easily. I haven't done this part myself, but it might be worth looking into for you. Because we use Linux systems we can deploy RPM packages easily, and the setup scripts within an RPM package can make sure everything ends up in the correct place. You also get proper version handling and a more formal release process.

Hope this helped you.

Br, Paul

Paul Peelen
That is a nice idea with the RPM packages.
Surim
It sure is. For the maintenance-page issue I don't have too much experience. We have implemented this in our code with a DB flag, but it is never used except in emergencies. We want our products to always be available, so we use load balancers. These help both with high load, by dividing traffic across different machines, and with upgrades. In my personal use I have also run load balancers locally on the same machine; the only reason then is to be able to update one instance at a time. Visitors can use the other instance(s) during the upgrade, so your product is "never" down.
Paul Peelen
You could use symbolic links. That way the change from the old system to the new system will be instant. The problem still lies with the DB. Load balancing is a nice idea, but I just see it as more trouble than it is worth in this situation. You would need to make sure both databases get altered correctly, etc.
Surim
Changes to the database are not always certain, and if there are database updates they should always be backward compatible. What happens if you have to roll back the update you have made? Symlinks in all their glory are great, but for larger projects they can get very messy, certainly when you use a release strategy other than just copying files over. The application and the database should be seen as two different products, each independent of the other. That way you can always update safely. The fewer dependencies, the safer, faster and more reliable the process is.
Paul Peelen
A: 

There are two parts to this; let's deal with the simple one first:

I need to display an 'updating' page

If you need to disable the entire site while maintaining transactional integrity, and publish a message to users from the server being updated, then the only practical way to do this is via an auto-prepend file. This needs to be configured in advance (note: I believe the auto_prepend_file directive can be set in a .htaccess file without having to restart the webserver for a new PHP config):

<?php

   // Prepended to every request: if the lock file exists,
   // serve the maintenance page instead of the requested script.
   if (file_exists($_SERVER['DOCUMENT_ROOT'] . '/maintenance.php')) {
       include_once($_SERVER['DOCUMENT_ROOT'] . '/maintenance.php');
       exit;
   }

Then just drop maintenance.php into your webroot and that file will be served instead of the expected page. Note that it should probably call session_start() and auto-refresh so the user's session does not expire. You might want to extend the above to allow a grace period in which POSTs are still processed, e.g. by adding a second PHP file.

In terms of deploying to remote sites, I'd recommend using rsync over ssh for copying content files - which should be invoked via a controlling script which:

  1. Applies the lock file(s) as shown above
  2. runs rsync to replicate files
  3. runs any database deployment script
  4. removes the lock file(s)
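The four steps above could be wrapped in a controlling script along these lines. This is only a sketch: the host names, webroot path and migrate.php script are assumptions, and by default it just prints the commands (set DRY_RUN=0 to actually execute them):

```shell
#!/bin/sh
# Sketch of a controlling script for the four deployment steps.
# Host names, paths and the migrate script are assumptions.
# DRY_RUN=1 (the default here) prints the commands instead of running them.
set -e

WEBROOT=/var/www/site
BUILD=./build/
DRY_RUN=${DRY_RUN:-1}

run() { [ "$DRY_RUN" = 1 ] && echo "$@" || "$@"; }

deploy_to() {
    host=$1
    # 1. apply the lock file so visitors see the maintenance page
    run ssh "$host" "touch $WEBROOT/maintenance.php"
    # 2. replicate files (exclude the lock so --delete doesn't remove it)
    run rsync -az --delete --exclude maintenance.php "$BUILD" "$host:$WEBROOT/"
    # 3. run any database deployment script on the remote side
    run ssh "$host" "php $WEBROOT/scripts/migrate.php"
    # 4. remove the lock file
    run ssh "$host" "rm -f $WEBROOT/maintenance.php"
}

for host in client1.vpn client2.vpn; do   # one entry per location
    deploy_to "$host"
done
```

Note the --exclude on the rsync step: without it, --delete would remove the lock file you just created in step 1.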

If each site has a different setup then I'd recommend either managing the site-specific parts via a hierarchy of include paths, or even maintaining a complete image of each site locally.

C.

symcbean
I've got a .htaccess file which redirects all traffic to a specified URL
Surim
A: 

There's also a case here for maintaining a "master" code repository (in SVN, CVS or maybe Git). This isn't your standard "keep revisions of your code in the repo and allow rollbacks"... this repo holds your current production code only. Once an update is ready, you check the working updated code into the master repo. All of your servers check the repo on a scheduled basis to see if it has changed, downloading new code when a change is found. That check process could even turn on the maintenance.php file (that symcbean suggested) before starting the repo download and remove the file once the download is complete.
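A minimal sketch of that scheduled check, assuming SVN and a cron job on each client (the repository URL and paths are placeholders):

```shell
#!/bin/sh
# Hypothetical update check, run from cron on each client machine.
# Syncs (behind the maintenance lock) only when the repo head has moved.
set -e

WEBROOT=/var/www/site
REPO=https://svn.example.com/site/production

# Pull the revision number out of `svn info` output.
rev_from_info() { awk '/^Revision:/ {print $2}'; }

check_and_update() {
    local_rev=$(svn info "$WEBROOT" | rev_from_info)
    remote_rev=$(svn info "$REPO" | rev_from_info)
    if [ "$local_rev" != "$remote_rev" ]; then
        touch "$WEBROOT/maintenance.php"   # turn the maintenance page on
        svn update "$WEBROOT"              # pull the new production code
        rm -f "$WEBROOT/maintenance.php"   # back in service
    fi
}

# Invoked from cron, e.g.:
#   */5 * * * * /usr/local/bin/check_update.sh
```

Comparing revision numbers first keeps the maintenance page off on the (most common) runs where nothing has changed.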

Rudu