views: 255

answers: 3
I have a website that I work on locally and that is hosted with a web host on another server. How would I use SSH to automatically push my local files to the production server? Would it be possible to set up a list of files on my local machine that I do not want uploaded, and can I have it upload only the files that have changed?

EDIT: Thanks for the replies, I actually just started using git today so I'll have to look into it.

A: 

I would use scp.

You could write a simple script to stage the files you want to upload in a certain directory (that way you exclude the other files), then push them to your server that way.
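A minimal sketch of that staging idea; the directory names, exclude pattern, and host are all hypothetical, and the demo files exist only so the script is self-contained:

```shell
#!/bin/sh
# Stage-then-scp sketch (hypothetical paths and host).
SRC="$PWD/site"
STAGE="$PWD/stage"
mkdir -p "$SRC" "$STAGE"

# Demo content so the sketch runs on its own.
echo '<html></html>' > "$SRC/index.html"
echo 'db password'   > "$SRC/config.local"

# Stage everything except patterns you never want uploaded.
# (cp --parents is GNU coreutils; it preserves the directory layout.)
( cd "$SRC" && find . -type f ! -name '*.local' \
    -exec cp --parents {} "$STAGE" \; )

# Then push the staged tree (substitute your real user, host, and path):
# scp -r "$STAGE"/. user@example.com:/var/www/html/
```

Anything matching the exclude patterns simply never reaches the staging directory, so it can never be uploaded by accident.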

I use ssh and cvs. (The basic idea is there; just ignore the university-specific stuff.)

I check in my code on a regular basis, then I label it. On the webserver, I check out the latest label. It makes both backing out and updating super easy.

Then I never have to worry about unwanted files getting there, and because it is tunneled via SSH it is totally secure.

Both the webserver and the cvs server are Linux boxes, which really simplifies things.
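The label-and-checkout workflow described above, sketched as commands; the repository path, module name, and tag names are hypothetical, and since it needs a real CVS server, treat it as a fragment rather than something to run as-is:

```shell
# Tunnel CVS over SSH instead of pserver.
export CVS_RSH=ssh

# On the dev box: check in on a regular basis, then label the tree.
cvs commit -m 'ready for release'
cvs tag REL_2009_03_15

# On the webserver: check out the latest label into the docroot.
cvs -d :ext:user@cvshost:/var/cvsroot checkout -r REL_2009_03_15 mysite

# Backing out is just updating to the previous label:
# cvs update -r REL_2009_03_01
```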

Take a look at WinSCP as well.

cbrulak
+7  A: 

Sounds like rsync would be a great tool for you to use. It can run over SSH and can figure out what has changed. You can tell it to ignore files as well.
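A small sketch of the rsync approach; the file names and exclude list are hypothetical, and the runnable part mirrors one local directory to another so the exclude behaviour is visible without a server:

```shell
# Local demo of rsync's exclude handling (all names hypothetical).
mkdir -p src dst
echo '<html></html>' > src/index.html
echo 'scratch'       > src/notes.psd
printf '*.psd\n' > rsync-exclude.txt

# -a preserves permissions and timestamps; on a repeat run rsync
# transfers only the files that have changed.
rsync -a --exclude-from=rsync-exclude.txt src/ dst/

# The same command over SSH to the real server would look like:
# rsync -avz -e ssh --exclude-from=rsync-exclude.txt src/ user@example.com:/var/www/html/
```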

Rob Di Marco
+1 I used to use rsync for disaster recovery back in the day. It's pretty slick.
Chris Kloberdanz
+3  A: 

As Rob Di Marco mentions, rsync is a great way to copy your files. Another option I've recently switched to is using git for this purpose, which also works over ssh and gives you version control as well.
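A minimal sketch of the git route, using a local bare repository to stand in for the server side; all names here are hypothetical, and over SSH the remote URL would be something like user@example.com:site.git instead of the local path:

```shell
# The bare repo plays the role of the one you'd create on the server.
git init -q work
git init -q --bare deploy.git

# Identity config so the demo commit works in any environment.
git -C work config user.email dev@example.com
git -C work config user.name  dev

echo '<html></html>' > work/index.html
git -C work add index.html
git -C work commit -qm 'first deploy'

# Over SSH this URL would be user@example.com:site.git instead.
git -C work remote add production ../deploy.git
git -C work push -q production HEAD
```

On the server, a post-receive hook can check the pushed tree out into the web root, which turns every push into a deploy.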

Jeff Bauer