tags:
views: 153
answers: 3

I have an S3 Bucket that holds static content for all my clients in production. I also have a staging environment which I use for testing before I deploy to production. I also want the staging environment to point to S3 to test uploads and other functions. Problem is, I don't want the staging server to reference the same production s3 bucket/folder, because there is a risk of overriding production files.

My solution is to use a different folder within the same bucket, or create a different bucket all together that I can refresh periodically with the contents of the production bucket. Is there a way to easily sync two folders or buckets on Amazon S3?

Any other suggestions for managing this type of scenario would also be greatly appreciated.

+1  A: 

Check out CloudBerry Explorer and its ability to sync data between a local computer and Amazon S3. It might not be exactly what you want, but it will help you get started. More info here

cloudberryman
Funny that you mention this. It's exactly what I ended up using :) It would still be nice to have a command-line tool that I could run from a cron job, since I have more than 20,000 files!
Zakir Hemraj
You can use the CloudBerry Explorer PowerShell interface to automate the whole process.
cloudberryman
+1  A: 

CloudBerry Explorer comes with a PowerShell command-line interface, and you can learn here how to use it to do the sync.

cloudberryman
A: 

s3cmd is a nice CLI utility you can use in a cron job. It even has a sync feature similar to *nix rsync.
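A minimal sketch of what such a cron-driven sync might look like, assuming s3cmd has already been configured with your credentials; the bucket names `my-prod-bucket` and `my-staging-bucket` are placeholders:

```shell
# Copy everything from the production bucket into the staging bucket.
# Like rsync, s3cmd sync only transfers objects that are new or changed,
# so repeated runs over 20,000+ files are relatively cheap.
# Assumes you have already run: s3cmd --configure
s3cmd sync s3://my-prod-bucket/ s3://my-staging-bucket/

# Example crontab entry: refresh staging from production nightly at 2am.
# 0 2 * * * s3cmd sync s3://my-prod-bucket/ s3://my-staging-bucket/ >> /var/log/s3-sync.log 2>&1
```

Since the sync goes one way (production into staging), test uploads on the staging bucket can never clobber production files, and the next run simply overwrites staging with fresh production content.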

Jahufar