views:

287

answers:

4

Hello,

currently my work-flow is as follows:

Locally, I maintain a git repo for each website I am working on. When the time comes to publish something, I compress the folder, upload that single file to the production server via SSH, decompress it, test the changes, move them to the live folder, and get rid of the .git folder.
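As a sketch of that compress-and-upload step (paths and the hostname are placeholders; the scp line is commented out so the sketch is safe to run locally):

```shell
#!/bin/sh
set -e
cd "$(mktemp -d)"                  # stand-in for the local machine
mkdir -p mysite/.git               # stand-in for the working copy
echo '<h1>Hi</h1>' > mysite/index.html
# Archive the site, leaving the .git folder out entirely so it
# never reaches the server in the first place.
tar -czf mysite.tar.gz --exclude='.git' mysite
tar -tzf mysite.tar.gz             # .git does not appear in the listing
# scp mysite.tar.gz user@example.com:/var/www/staging/  # then decompress, test, go live
```

Excluding .git at archive time avoids having to delete it on the server afterwards.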

I was wondering whether keeping a git repo on the live server is a good idea. It seems so at first, but it could be problematic if a change doesn't look the same on the production server as on the local development machine... this could start a fire... What about creating a bare repo in some folder on the production server and cloning from it into the public folder, so that I push updates from the local machine to the bare repo and pull from the bare repo in the public folder of the production server? Could anyone please provide some feedback?
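A minimal local sketch of that bare-repo idea (all paths here are placeholders; on a real server the bare repo would live somewhere like /srv/git/site.git and the second clone would be the public folder):

```shell
#!/bin/sh
set -e
cd "$(mktemp -d)"                        # stand-in for the server's filesystem
git init --bare -q hub.git               # the bare repo you push to
git clone -q hub.git work 2>/dev/null    # stand-in for the local dev machine
cd work
echo '<h1>Hello</h1>' > index.html
git add index.html
git -c user.name=dev -c user.email=dev@example.com commit -qm 'first page'
git push -q origin HEAD:master           # local machine -> bare repo
cd ..
git clone -q -b master hub.git public    # the public folder pulls from the bare repo
ls public                                # index.html is now "deployed"
```

Subsequent updates are just a push to the bare repo followed by `git pull` in the public folder.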

Later I read about Capistrano (http://capify.org), but I have no experience with this software...

In your experience what is the best practice/methodology to accomplish a website deployment/updates?

Thanks in advance for your feedback.

A: 

I never thought about keeping a copy of the repository on the server. After reading your question, I thought it might be cool... However, updating files directly in the live environment without testing is not a great idea.

You should always update a secondary environment matching exactly the live one (webserver + DB version, if any) and test there. If everything goes well, then put the live site under maintenance, update files, and go live again.

So I wouldn't make the live site a copy of the repository, but you could do so with the test env. You'll save SSH + compressing time, plus you can check out any specific revision you'd like to test.

Seb
That's actually how I used to work: two servers, one the live/production area and the other the sandbox. But currently I only have one server. On the sandbox I used to keep a copy of the repos and update them on a regular basis. I guess I'll create a sandbox folder and start from there, thanks.
+2  A: 

I don't think that our method can be called best practice, but it has served us well.

We have several large databases for our application (20GB+), so maintaining local copies on each developer's computer has never really been an option, and even though we don't develop against the live database, we do need to develop against a database that is as close to the real thing as possible.

As a consequence we use a central web server as well, and keep a development branch of our subversion trunk on it. Generally we don't work on the same part of the system at once, but when we do need to do that, or someone is making a lot of substantial changes, we branch the trunk and create a new vhost on the dev server.

We also have a checkout of the code on the production servers, so after we're finished testing we simply do an svn update on the production servers. We've implemented a script that executes the update command on all servers using SSH. This is extremely convenient, since our code base is large and takes a long time to upload. Subversion will only copy the files that have actually been changed, so it's a lot faster.
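Such a script can be as simple as a loop over the hosts (host names and the web root below are hypothetical, and the ssh command is echoed rather than executed so the sketch is safe to run):

```shell
#!/bin/sh
# Hypothetical host list and web root -- replace with your own servers.
SERVERS="web1.example.com web2.example.com"
WEBROOT=/var/www/site
for host in $SERVERS; do
    # The real script would run: ssh "$host" "svn update $WEBROOT"
    # Echoed here as a dry run.
    echo "ssh $host \"svn update $WEBROOT\""
done
```

With SSH keys in place, the real version runs unattended across all servers in one go.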

This has worked really well for us, and the only thing to watch out for is making changes on the production servers directly (which of course is a no-no from the beginning) since it might cause conflicts when updating.

Emil H
Checkout of the code on production? Don't the .svn folders bug you? Export seems to be the better option here: http://stackoverflow.com/questions/175056/svn-checkout-or-export-for-production-environment
barfoon
With export we would lose the advantage of only copying the changed files, which would make deployment much slower. Also, I believe that the update operation is atomic, while export isn't, but I could be wrong.
Emil H
And, our directory structure is such that only default.php and some static content are in the web root, and as an extra safety measure we block access to the .svn folders using Apache.
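A sketch of that Apache rule (2.2-style syntax; where the directive goes depends on your configuration layout):

```apache
# Deny web access to Subversion metadata anywhere under the web root.
<DirectoryMatch "\.svn">
    Order allow,deny
    Deny from all
</DirectoryMatch>
```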
Emil H
A: 

There are some nice tutorials on deploying with Capistrano: one from GitHub and one from a blog.

mpeterson
Will check them out, thank you... I remember giving a quick read to the one hosted on GitHub :)
A: 

Capistrano is great. The default recipes handle the common deployment tasks. The documentation is spotty, but the mailing list is active, and getting it set up is pretty easy. Are you running Rails? It has some neat built-in stuff for Rails apps, but it is also used fairly frequently with other types of webapps.

There's also Webistrano, which is based on Capistrano but has a web front-end. Haven't used it myself. Another deployment system that seems to be gaining some traction, at least among Rails users, is Vlad the Deployer.

Sarah Mei
I'm not into Rails, sadly; I'm more of a Ruby 101 guy, and more into Python, but I couldn't find something like Capistrano for Python, so I think it's time to learn more Ruby, or fork, LOL. Anyway, Capistrano seems like a good option; I'll read more about it and give it a try soon. Thank you.