views: 216
answers: 2

Dear All, this must be a pretty common requirement. I have an SVN repository with around 1 GB of data. To back it up, I first do a hotcopy into a directory at 12:00 AM, and then at 1:00 AM a remote machine pulls this backup using rsync. The source machine on which SVN is installed runs rsyncd and is a Windows machine.
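Roughly, the two nightly steps look like this (the repository path, backup directory, host name, and rsync module name below are just placeholders):

```sh
# 12:00 AM, on the Windows machine: take a hot copy of the live repository
# (the destination directory is assumed to be emptied before each run)
svnadmin hotcopy C:/svn/repo C:/backups/repo-hotcopy

# 1:00 AM, on the remote machine: pull the hot copy from the rsync daemon
rsync -av --delete rsync://svn-server/svnbackup/ /backups/svn/repo-hotcopy/
```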

Everything is working fine, except that there is far too much data transfer every day. Even if there is only a single commit of a few bytes, more than 100 MB of files get transferred. My guess is that svn is renaming its files too frequently.

So, what should I do in such a case?

Is there any option in rsync that can detect changes based on content rather than file names? Or is there any option in svn that would avoid so many renames? As far as I remember, there are two kinds of database back-ends; perhaps I am using fsfs.
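In case the back-end matters, one way to check it is to look at the `db/fs-type` file inside the repository (the path below is just a placeholder):

```sh
# Prints "fsfs" or "bdb", depending on which back-end the repository uses
# (on Windows, use `type` instead of `cat`)
cat C:/svn/repo/db/fs-type
```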

Any ideas? Regards, Sandeep Giri

+1  A: 

You should use `svnadmin dump` to do the backup. Or have a look at the backup section of Subversion Tools.
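A minimal sketch of a dump-based backup, assuming a made-up repository path and that gzip is available on the machine:

```sh
# Full dump of the whole repository, compressed
svnadmin dump C:/svn/repo | gzip > C:/backups/repo-full.dump.gz

# Or an incremental dump of only the newer revisions (the revision range is an example)
svnadmin dump C:/svn/repo -r 1000:HEAD --incremental | gzip > C:/backups/repo-1000-HEAD.dump.gz
```

With incremental dumps, each run only produces the new revisions, so the nightly transfer stays close to the size of the actual changes.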

tangens
+1  A: 

Is rsync accessing a FAT filesystem?

One of the considerations when using rsync with a Windows FAT filesystem is that file modification times are only stored with about two-second resolution.

Files may be transferred when they don't need to be, because rsync believes the file has been modified.

For this reason, rsync has a `--modify-window=NUM` option, and a value of 1 would allow modification times to differ by up to one second.
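As a sketch, the pull command could simply gain that flag (the host and module name are placeholders, as above):

```sh
# Treat timestamps that differ by up to 1 second as equal, to cope with FAT's coarse timestamps
rsync -av --delete --modify-window=1 rsync://svn-server/svnbackup/ /backups/svn/repo-hotcopy/
```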

pavium
Nope. This doesn't resolve it.
I don't believe `rsync` has an option that looks for changes in file *content*, but it is certainly influenced by changes in modification date. Just looking at the `man` page for rsync, I notice there's a `--cvs-exclude` (or `-C`) option which makes rsync ignore files that CVS would ignore. The exclude list 'includes' `.svn` files. Might be worth trying (see the sketch below).
pavium
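For completeness, a sketch of how that flag would slot into the pull command used above (host and module name are still placeholders):

```sh
# -C adds rsync's built-in CVS-style excludes, which in current rsync versions skip .svn/ entries among other patterns
rsync -avC --delete rsync://svn-server/svnbackup/ /backups/svn/repo-hotcopy/
```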