+10  Q: 

SVN Obliterate!

I was just thinking of writing a shell script to implement the obliterate functionality in an easy, automated way (externally, using the suggested dump-and-filter approach).

Here's what I had in mind:

On the client

  1. svn list -R > file-list
  2. Filter file-list in several ways, e.g. with a set of grep XXX file-list >> files-to-delete commands, to create a file "files-to-delete".
  3. Transfer files-to-delete to the server using scp (see the sketch after this list).
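
A minimal sketch of the client side, assuming the repository URL, server name, and grep patterns here are hypothetical placeholders:

    #!/bin/sh
    # Client side: build the list of paths to obliterate.
    # REPO_URL and the patterns below are illustrative examples.
    REPO_URL=http://svn.example.com/repos

    svn list -R "$REPO_URL" > file-list

    # Each grep appends its matches to the delete list.
    grep '\.exe$' file-list >> files-to-delete
    grep '\.avi$' file-list >> files-to-delete
    grep '\.wmv$' file-list >> files-to-delete

    # Ship the list to the server for filtering.
    scp files-to-delete admin@svn.example.com:/tmp/files-to-delete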

On the server

  1. Dump the repository: svnadmin dump /path/to/repos > repos-dumpfile. This dump can be kept as a backup too.
  2. Filter the dump file: for each path in "files-to-delete", do cat repos-dumpfile | svndumpfilter exclude $file > new-dumpfile.
  3. Create a new repository and load the new dump into it: svnadmin create new-name; svnadmin load new-name < new-dumpfile (see the sketch after this list).
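
One caveat with step 2: run once per file, each pass either starts again from repos-dumpfile (so only the last exclusion survives) or feeds new-dumpfile back into itself (the clobbering problem raised in an answer below). Since svndumpfilter exclude accepts several paths in one invocation, a single pass is simpler. A sketch, assuming the listed paths contain no whitespace and all file names are placeholders:

    #!/bin/sh
    # Server side: dump, filter in one pass, load into a fresh repository.
    svnadmin dump /path/to/repos > repos-dumpfile    # doubles as a backup

    # One svndumpfilter pass over all paths avoids re-reading or
    # clobbering intermediate dump files.
    svndumpfilter exclude $(cat /tmp/files-to-delete) \
        < repos-dumpfile > new-dumpfile

    svnadmin create /path/to/new-repos
    svnadmin load /path/to/new-repos < new-dumpfile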

Would this work? How can it fail? Any other ideas?

Thanks.

A: 

So you're making a new repository any time you want to filter something out? Why not just delete the filtered files from the repository?

tehvan
Obliterate removes the files entirely. Just deleting them means you won't normally see them, but they're still there in the history.
drby
Obliterate removes the actual old versions; I could restore the filtered dump into the same repository after deleting and re-creating it.
Osama ALASSIRY
We could also use this to remove all exe, avi, wmv, ... files from a repository by filtering them.
Osama ALASSIRY
Obliterate is necessary if you accidentally commit something you shouldn't have, for example logins/passwords/keys/...
Stefan
+3  A: 

Yes, that script would work. But you don't usually obliterate that many files; obliterate is typically only needed if you accidentally commit confidential information.

Are you sure you want to use obliterate for so many files?

Stefan
+3  A: 

I think cat new-dumpfile | svndumpfilter exclude $file > new-dumpfile is a dangerous example. new-dumpfile will not be completely processed and its contents will probably be lost, no?

From the comments below: the new-dumpfile will surely be lost, because the shell will clobber (truncate to zero length) it even before starting up the command.
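
A quick demonstration of the truncation, followed by a safe per-file variant that alternates through a temporary file (file names are illustrative):

    # The shell truncates f when it sets up tr's redirection,
    # typically before cat has read anything:
    echo hello > f
    cat f | tr a-z A-Z > f
    cat f                      # f is now empty

    # Safe per-file filtering: write to a temp file, then rename.
    cp repos-dumpfile work-dumpfile
    while read path; do
        svndumpfilter exclude "$path" < work-dumpfile > work-dumpfile.tmp
        mv work-dumpfile.tmp work-dumpfile
    done < files-to-delete
    mv work-dumpfile new-dumpfile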

mark
why do you say "will not be completely processed and its contents will probably be lost"?
Osama ALASSIRY
I think that, with a large file, reading from and simultaneously writing to it will not work. I guess even for small files, no? "cat file | do_something > file" looks like it will overwrite itself, which can happen before it is completely processed. Methinks.
mark
Yes, definitely, this will clobber new-dumpfile. It doesn't matter how big it is: The shell will set up the output redirection to new-dumpfile before even starting svndumpfilter, and will truncate new-dumpfile.
sleske
I think, perhaps, it was a typo and he just meant 'cat dumpfile ... $file > new-dumpfile'; otherwise, why is it called *new-dumpfile* and not just *dumpfile*?
bias
Downvoted for guaranteed data corruption left unfixed for so long...
EFraim
@EFraim: wait, you downvote my answer because I'm pointing out the obvious data corruption?
mark
Sorry, wanted to downvote the original...
EFraim
@EFraim: :-) thx
mark
A: 

What about files with the same path in different revisions? For example, suppose you commit /trunk/foo, then rename it to /trunk/bar, then commit something else at /trunk/foo that you want to obliterate. You don't want to lose the history of what's now /trunk/bar. Maybe svndumpfilter supports peg revisions?
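
A sketch of that scenario in a throwaway repository (paths are illustrative). As far as I know, svndumpfilter has no peg-revision support and filters by path across all revisions; since trunk/foo is the copy source of the retained trunk/bar, excluding it typically makes svndumpfilter abort with an invalid-copy-source error rather than quietly keeping the rename's history:

    # Reproduce the rename scenario in a scratch repository.
    svnadmin create /tmp/repo
    svn checkout file:///tmp/repo /tmp/wc && cd /tmp/wc
    mkdir trunk && echo one > trunk/foo
    svn add trunk && svn commit -m 'r1: add trunk/foo'
    svn mv trunk/foo trunk/bar && svn commit -m 'r2: rename foo to bar'
    echo oops > trunk/foo
    svn add trunk/foo && svn commit -m 'r3: new trunk/foo (to obliterate)'

    # trunk/foo is r2's copy source, so excluding it disturbs the
    # rename's history; svndumpfilter filters by path, not by revision.
    svnadmin dump /tmp/repo | svndumpfilter exclude trunk/foo > filtered.dump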

Matt McHenry