We have recently started using git and hit a nasty problem when someone committed a large (~1.5GB) file, which then caused git to crash on various 32-bit OSes. This seems to be a known issue (git mmaps files into memory, which fails if it can't get enough contiguous address space), and it isn't going to be fixed any time soon.
The easy (for us) solution would be to get git to reject any commit containing a file larger than 100MB or so, but I can't figure out a way to do that.
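For illustration, something along these lines is the kind of check I'm imagining, as a client-side pre-commit hook; this is a hypothetical, untested sketch, and the 100MB limit and the choice of Python are just assumptions:

    #!/usr/bin/env python3
    # Hypothetical pre-commit hook: refuse to commit any staged file over 100 MB.
    # Would live at .git/hooks/pre-commit and be marked executable.
    import subprocess
    import sys

    MAX_BYTES = 100 * 1024 * 1024  # 100 MB limit (adjust as needed)

    # Paths that are added or modified in the index for this commit.
    staged = subprocess.run(
        ["git", "diff", "--cached", "--name-only", "--diff-filter=AM"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()

    too_big = []
    for path in staged:
        # ":<path>" refers to the staged blob, not the working-tree file.
        size = int(subprocess.run(
            ["git", "cat-file", "-s", f":{path}"],
            capture_output=True, text=True, check=True,
        ).stdout.strip())
        if size > MAX_BYTES:
            too_big.append((path, size))

    if too_big:
        for path, size in too_big:
            print(f"refusing to commit {path}: {size} bytes exceeds limit",
                  file=sys.stderr)
        sys.exit(1)  # non-zero exit aborts the commit

The obvious drawback is that a client-side hook has to be installed by every developer and can be skipped with `--no-verify`, so a server-side check (if one exists) would suit us better.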
EDIT: The problem comes from accidental submission of a large file, in this case a large dump of program output. The aim is to avoid accidental submission, because if a developer does accidentally commit a large file, getting it back out of the repository costs an afternoon during which no-one can do any work, and everyone then has to fix up whatever local branches they have.