views: 1200

answers: 4

I currently FTP all my files to my website when I do an update (over a slowish ADSL connection).

I want to make things easier, so I recently started using a hosted SVN service, and I thought I could speed things up a bit by doing an svn export of my website directly onto my webserver.

I have tried that a few times and it seems to work OK; however, it fetches the entire site every time, which is a bit slow for a one-file update.

So my questions are:

Is it possible to do an export and only get the changes since the last export (and how would this handle deleted files)?

OR would it be easier to do an svn checkout and svn update it all the time instead of svn export, and just hide the .svn folders using Apache htaccess?

Is this a good idea, or is there a better way to publish my website? I am trying to achieve the one-click-deploy type of ideal.

Maybe there are some gotchas I haven't thought of that someone else has run into.

debian/apache/php

+5  A: 

I would do an svn checkout, and have done so successfully on a live site for a number of years. You should add mod_rewrite rules to 404 the .svn directories (and files) though.

Draemon
RewriteEngine on
RewriteRule .*\.svn/.* - [F]
Darren Newton
<FilesMatch "\.svn/.*">
    order deny,allow
    deny from all
</FilesMatch>
Darren Newton
Could not make the FilesMatch directive work, as it does not match on directories; however, DirectoryMatch worked fine:

<DirectoryMatch "\.svn">
    order allow,deny
    deny from all
</DirectoryMatch>
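Another option worth noting (a sketch; it assumes mod_alias is enabled, which it is in most default Apache installs) is to sidestep the FilesMatch/DirectoryMatch quirks entirely and return a 404 for any path containing a .svn component:

```apache
# Return 404 for any request whose path contains a .svn directory
# (mod_alias; works without mod_rewrite)
RedirectMatch 404 /\.svn(/|$)
```

A 404 also has the side benefit of not advertising that the directory exists, unlike a 403.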
Miquel
+2  A: 

This is what I'm doing on my host:

For every project I have a structure that looks more or less like this:

~/projects/myproj
~/public_html/myproj

First dir is a checkout from SVN, while second one is just svn export.

I have a small bash script

#!/bin/bash
SOURCE="$HOME/projects/"
TARGET="$HOME/public_html/"
cd "$SOURCE" || exit 1
for x in *
do
    if [ -d "$SOURCE$x" ]; then
        svn update "$SOURCE$x"
        svn export --force "$SOURCE$x" "$TARGET$x"
    fi
done

Export is done from working copy so it's very fast.

RaYell
What happens with files you have deleted from SVN? Do you remove them manually?
bumperbox
Add a `rm -Rf $TARGET$x` after the svn update command in the script above and you won't have to worry about those anymore.
RaYell
A: 

It might not be exactly the answer you are looking for, but if you have SSH access to your webserver (it depends on your hosting service; some "low cost" hosts don't give that kind of access), you can use rsync to synchronise the remote site with what you have on your disk.

In the past, I used something like the idea you are describing (fetching the svn log between the last revision pushed to production and HEAD, analysing every line, and in the end calculating what to send to the server), but it was not really a great process; I now use rsync, and like it way better.

(Here too, you will have to exclude the .svn directories, by the way.)

Pascal MARTIN
A: 

You can just accept having .svn directories in your website (generally not a problem, especially if you configure the server not to permit access to them); this is the easy option. Alternatively, do what RaYell does, and keep two copies of your website on the webserver: one normal checkout outside the web directory, and one inside it. When you update, simply export the svn (just a copy with the .svn dirs deleted) into the web directory. You should make sure to first delete the old files if you want to avoid files that have been removed from SVN lingering on your website.

I do something like this, using robocopy set to mirror the svn checkout while excluding .svn directories; that gets both the export and the old-file deletion in one step, thus minimizing downtime if the copy takes long. I'm sure this is easy on unix too, if that's your hosting environment. For example, you can use a local rsync: http://blog.gilluminate.com/2006/12/12/yes-you-can-rsync-between-two-local-directories/

Eamon Nerbonne