Simple question: I'm trying to run git gc on a machine with a disk quota. Pre-gc, I'm at about 18GB of usage, almost all of which is my cloned git repository. My disk limit is 25GB. During the git gc operation, enough temporary files are written to disk to hit my limit, which causes git gc to fail.

I can delete the .git/objects/pack/tmp_pack_* files after the failed operation and get back down to 18GB of usage, but I'd really like to actually complete git gc and reclaim a little performance.

Is there some fancy option (or series of other git commands) I can use that doesn't involve first writing >7GB of temporary files to disk?

A: 

Looks like the best solution is to ask IT for a bigger quota, though I'm still interested to hear workarounds. Today they were feeling generous, but tomorrow... :)

Cory Petosky
+2  A: 

Which part of the gc is important for you? You could try running the git-prune and git-repack parts separately. With git-prune, be sure to specify a cutoff date with the --expire option. If there are a ton of loose objects in your repo, getting them out of the way first would be helpful.*
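
A minimal sketch of that first step, assuming a two-week cutoff (pick whatever expiry you're comfortable with):

    # Drop unreachable loose objects older than the cutoff. Pruning only
    # deletes files, so it frees space without needing temporary room first.
    git prune --expire=2.weeks.ago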

With git-repack, you can perhaps mess with the depth and window settings to get something small enough to run within the space you have.
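
For the repack step, something along these lines might work (the window/depth values here are illustrative, not tuned; larger values make git search harder for deltas, so the new pack — the big temporary file — comes out smaller at the cost of CPU time and memory):

    # Repack everything into a single pack, then delete the old packs.
    # -f recomputes existing deltas; raise window/depth for tighter packing.
    git repack -a -d -f --window=50 --depth=50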

* I don't pretend to fully understand all the issues involved, but I do notice that the prune comes after the repack in the git-gc code.

Jefromi
A: 

git prune is an excellent suggestion; good call, Jefromi.

Another thing you can do is compress your other files, run git gc, then decompress them.

7z gets amazing compression, but won't preserve hardlinks or Unix owner/group, IIRC. If you have a quota, the files are probably all owned by your account anyway, so ownership isn't a concern. The safest bet is tar c --lzma if you have it, or just tar czf or tar cjf. If you have a lot of small files, the space lost to per-file block-size rounding (internal fragmentation) might be significant, so archiving them can free more than their apparent size.
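
A sketch of that compress-gc-decompress dance, assuming your other files sit in directories next to the repo (projects, docs, and my-repo are hypothetical names; substitute your own):

    # Archive and remove everything except the repo, gc, then restore.
    cd ~
    tar czf other-stuff.tar.gz projects docs
    tar tzf other-stuff.tar.gz > /dev/null   # sanity-check the archive first
    rm -rf projects docs
    (cd my-repo && git gc)
    tar xzf other-stuff.tar.gz && rm other-stuff.tar.gz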

Peter Cordes