I'm hosting a git repo on a shared host. My repo necessarily has a couple of very large files in it, and every time I try to run "git gc" on the repo now, my process gets killed by the shared hosting provider for using too much memory. Is there a way to limit the amount of memory that git gc can consume? My hope would be that it can trade memory usage for speed and just take a little longer to do its work.

+1  A: 

Yes, have a look at the help page for git config, specifically the pack.* options: pack.depth, pack.window, pack.windowMemory and pack.deltaCacheSize.
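For example, something along these lines tightens the limits for a single repository (the values are only illustrative starting points; tune them to what your host allows):

    git config pack.window 5            # consider fewer objects per delta search
    git config pack.depth 10            # allow shorter delta chains
    git config pack.windowMemory 32m    # cap memory used by the delta window
    git config pack.deltaCacheSize 16m  # cap memory used by the delta cache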

It's not an exact limit, as git needs to map each object into memory, so a single very large object can cause a lot of memory usage regardless of the window and delta cache settings.

You may have better luck packing locally and transferring the pack files to the remote side "manually", adding .keep files so that the remote git never tries to completely repack everything.
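A rough sketch of that workflow, assuming SSH access and a bare repository at ~/repo.git on the host (the host name and paths here are hypothetical):

    # on your own machine: repack everything into a single pack
    git repack -a -d

    # copy the resulting pack and its index to the shared host
    scp .git/objects/pack/pack-*.pack .git/objects/pack/pack-*.idx \
        host:~/repo.git/objects/pack/

    # on the host: add a .keep next to each pack so gc/repack leaves it alone
    ssh host 'for p in ~/repo.git/objects/pack/pack-*.pack; do touch "${p%.pack}.keep"; done'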

Charles Bailey
A: 

You could unset the delta attribute to disable delta compression for just the blobs of those pathnames:

In foo/.git/info/attributes (or foo.git/info/attributes if it is a bare repository); see the delta entry in gitattributes and see gitignore for the pattern syntax:

/large_file_dir/* -delta
*.psd -delta
/data/*.iso -delta
/some/big/file -delta
another/file/that/is/large -delta

Attributes set in info/attributes will not affect clones of the repository. To affect other repositories (i.e. clones), put the attributes in a tracked .gitattributes file instead of (or in addition to) the info/attributes file.
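For example, a tracked .gitattributes at the root of the working tree can carry the same patterns (the patterns below are just placeholders for your own large files):

    echo '*.psd -delta' >> .gitattributes
    echo '/data/*.iso -delta' >> .gitattributes
    git add .gitattributes
    git commit -m 'Disable delta compression for large binaries'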

Chris Johnsen