As mentioned in this SO answer, git gc can actually increase the size of the repo!
See also this thread:
Now git has a safety mechanism to not delete unreferenced objects right away when running 'git gc'.
By default unreferenced objects are kept around for a period of 2 weeks. This is to make it easy for you to recover accidentally deleted branches or commits, or to avoid a race where a just-created object that is not yet referenced could be deleted by a 'git gc' process running in parallel.
So to give that grace period to packed but unreferenced objects, the repack process pushes those unreferenced objects out of the pack into their loose form so they can be aged and eventually pruned.
Objects becoming unreferenced are usually not that many though. Having 404855 unreferenced objects is quite a lot, and being sent those objects in the first place via a clone is stupid and a complete waste of network bandwidth.
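If you want to see how many of these objects your own clone is carrying, the following standard git commands give a quick picture (shown here only as an illustration; the counts will obviously differ per repository):

# count loose and packed objects, and the disk space they use
git count-objects -v
# list objects that are no longer reachable from any ref
git fsck --unreachable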
Anyway... To solve your problem, you simply need to run 'git gc' with the --prune=now argument to disable that grace period and get rid of those unreferenced objects right away (safe only if no other git activities are taking place at the same time, which should be easy to ensure on a workstation).
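For example, assuming nothing else is using the repository at that moment:

# delete unreferenced objects immediately, skipping the 2-week grace period
git gc --prune=now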
And BTW, using 'git gc --aggressive' with a later git version (or 'git repack -a -f -d --window=250 --depth=250') recomputes all deltas from scratch and usually produces a noticeably smaller pack, at the cost of more CPU time and memory.
The same thread mentions:
git config pack.deltaCacheSize 1
That limits the delta cache size to one byte (effectively disabling it) instead of the default of 0 which means unlimited. With that I'm able to repack that repository using the above git repack
command on an x86-64 system with 4GB of RAM and using 4 threads (this is a quad core). Resident memory usage grows to nearly 3.3GB though.
If your machine is SMP and you don't have sufficient RAM then you can reduce the number of threads to only one:
git config pack.threads 1
Additionally, you can further limit memory usage with the --window-memory argument to 'git repack'. For example, using --window-memory=128M should keep a reasonable upper bound on the delta search memory usage, although this can result in less optimal delta matches if the repo contains lots of large files.
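Putting those settings together, a memory-constrained repack could look like the following sketch (the --window/--depth values are the aggressive ones mentioned above; the 128M limit is just an example to adjust to your machine):

# effectively disable the delta cache (1 byte)
git config pack.deltaCacheSize 1
# restrict delta search to a single thread
git config pack.threads 1
# repack everything with a bounded delta window memory
git repack -a -d -f --window=250 --depth=250 --window-memory=128M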
On the filter-branch front, you can consider (with caution) this script:
#!/bin/bash
set -o errexit
# Author: David Underhill
# Script to permanently delete files/folders from your git repository. To use
# it, cd to your repository's root and then run the script with a list of paths
# you want to delete, e.g., git-delete-history path1 path2
if [ $# -eq 0 ]; then
exit 0
fi
# make sure we're at the root of git repo
if [ ! -d .git ]; then
echo "Error: must run this script from the root of a git repository"
exit 1
fi
# remove all paths passed as arguments from the history of the repo
files=$@
git filter-branch --index-filter "git rm -rf --cached --ignore-unmatch $files" HEAD
# remove the temporary history git-filter-branch otherwise leaves behind for a long time
rm -rf .git/refs/original/ && git reflog expire --expire=now --all && git gc --aggressive --prune=now
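Usage follows the header comment; for instance (the paths below are placeholders for whatever you want to purge from history):

# save the script as git-delete-history at the repo root, then:
chmod +x git-delete-history
./git-delete-history path/to/huge-file.bin old/generated/folder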