What tools and procedures would you recommend, or use yourself, to help streamline the following scenario? (I know it's a long one, but any help is appreciated.)
I work in a team that has an ecommerce app that we develop at our company. It's a reasonably standard LAMP application that we have been developing on and off for about three years. We develop the application on a testing domain, where we add new features and fix bugs etc. Our bug tracking and feature development are all managed within a hosted Subversion solution (unfuddle.com). As bugs are reported we make the fixes on the testing domain and then commit the changes to svn once we are happy the bug has been fixed. We follow the same procedure for new features.
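Day to day, the loop is essentially this (commands simplified):

```
# fix applied directly on the testing domain, verified there, then:
svn status                       # review which files changed
svn diff                         # sanity-check the fix
svn commit -m "Fix: <ticket>"    # everything goes straight to trunk
```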
It is worth pointing out the general architecture of our system and application across our servers. Each time a new feature is developed we roll the update out to all sites using our application (always on servers we control). Each site using our system uses exactly the same files for about 95% of the codebase. Within each site we have a couple of folders containing files bespoke to that site: css files, images etc. Other than that, the differences between sites are defined by configuration settings within each site's database.
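To make that concrete, each vhost looks roughly like this (paths and names are illustrative, not our real ones):

```
/var/www/site-a/          # one vhost per site
    index.php             # ~95% of the codebase, identical on every site
    includes/
    admin/
    css/                  # bespoke to this site
    images/               # bespoke to this site
/var/www/site-b/          # same shared code; its own css/ and images/
```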
This brings us to the deployment itself. As and when we are ready to roll out an update, we run a command on the server that hosts the testing site. This performs a copy (cp -fru /testsite/ /othersite/) and goes through each vhost, force-updating files based on modified date. Each additional server we host on has a vhost that we rsync the production codebase to, and we then repeat the copy procedure on all sites on that server. During this process we move out the files we don't want overwritten, moving them back once the copy has completed. Our rollout script performs a number of other functions, such as applying SQL commands to alter each database, adding fields / new tables etc.
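In case it helps to see it, here is a stripped-down sketch of what our rollout script effectively does (paths, site names and the SQL file are illustrative):

```
#!/bin/sh
# 1. update every site on this server from the testing site
for site in /var/www/site-a /var/www/site-b; do
    mv "$site/css" /tmp/css.bak                        # move bespoke files out of the way
    cp -fru /var/www/testsite/. "$site/"               # force-update by modified date
    rm -rf "$site/css" && mv /tmp/css.bak "$site/css"  # move them back
done

# 2. push the production codebase to the vhost on each additional
#    server, where the same copy procedure is then repeated
rsync -az /var/www/testsite/ web2:/var/www/production/

# 3. apply schema changes to each site's database
mysql site_a_db < alters/update-xyz.sql
```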
We have become increasingly concerned that our process is not stable enough or fault-tolerant, and that it is a bit of a brute-force method. We're also aware that we are not making the best use of Subversion: we're in a position where working on a new feature would prevent us from rolling out an important bug fix, because we are not making use of branches or tags. It also seems wrong that we have so much replication of files across our servers. Nor can we easily roll back what we have just rolled out. We do perform a diff before each rollout, so we have a list of the files that will change and know afterwards what was changed, but the process of rolling back would still be problematic.

In terms of the database, I've started looking into dbdeploy as a potential solution. What we really want, though, is some general guidance on how we can improve our file management and deployment. Ideally we want the file management to be more closely linked to our repository, so that a rollout / rollback would be more connected to svn. Something like using the export command to make sure the site files are the same as the repo files. It would also be good if the solution stopped the file replication across our servers.
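For example, the sort of thing we're imagining (an untested sketch; the URL and paths are hypothetical):

```
# export a tagged release so the deployed files exactly match the repo
svn export http://svn.example.com/repo/tags/1.4.2 /var/releases/1.4.2

# point the vhost docroot at the new release; rollback would just be
# re-pointing the symlink at the previous export
ln -sfn /var/releases/1.4.2 /var/www/site-a/current
ln -sfn /var/releases/1.4.1 /var/www/site-a/current   # rollback
```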
Ignoring our current methods, it would be really good to hear how other people approach the same problem.
To summarise:
What is the best way to keep files across multiple servers in sync with svn?
How should we prevent file replication? Symlinks / something else?
How should we structure our repo so we can develop new features and still fix old ones? (Presumably something like the layout sketched below?)
How should we trigger rollouts/rollbacks?
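On the repo structure question, is the standard Subversion layout what people would suggest here? i.e. something like:

```
repo/
    trunk/            # new feature development
    branches/
        1.4-fixes/    # bug fixes to whatever is currently live
    tags/
        1.4.0/        # immutable snapshot per rollout (rollback target)
        1.4.1/
```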
Thanks in advance.