Most programmers agree that garbage collection is a great thing, and in most applications it is well worth the overhead. My personal observation, though, is that memory management is trivial for the vast majority of objects; perhaps 10-20% of them account for the need for kludges such as reference counting and really complicated memory management schemes in general. It seems to me that one could get all the benefits of garbage collection at only a small fraction of the overhead by conservatively deleting large objects manually where the object's lifetime is obvious, and letting the GC collect the rest (assuming the GC implementation supports such a thing). The GC could then run much less frequently and consume less excess memory, while the cases that are genuinely hard to manage manually would still be handled automatically. Even more interesting would be if the compiler inserted deterministic delete statements automatically where lifetimes are obvious:
int myFunc() {
    Foo[] foo = new Foo[arbitraryNumber]; // May be too big to stack-allocate.
    // Do stuff such that the compiler can prove foo doesn't escape.
    // foo is obviously no longer needed; it can be automatically deleted here.
    return someInteger;
}
Of course, this might not work well with a copying GC, but for the sake of this post let's assume our GC isn't copying. Why are such hybrid memory management schemes apparently so rare in mainstream programming languages?
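For what it's worth, CPython is one mainstream runtime that does something in this spirit, just with the roles reversed: reference counting reclaims the easy majority of objects deterministically the moment their last reference dies, and a backup tracing collector runs only to catch the reference cycles that refcounting can't handle. A small sketch of that split (CPython-specific behavior; `Big` is just a made-up placeholder class):

```python
import gc
import weakref

class Big:
    """Stand-in for a large object whose lifetime we want to observe."""
    pass

def make_and_drop():
    big = Big()
    ref = weakref.ref(big)  # lets us observe when the object actually dies
    del big                 # refcount hits zero -> freed immediately, no GC pass needed
    return ref

# The "easy" case: deterministic reclamation via refcounting alone.
r = make_and_drop()
assert r() is None

# The "hard" case: a reference cycle, which refcounting alone can never free.
a, b = Big(), Big()
a.other, b.other = b, a
wa = weakref.ref(a)
del a, b          # refcounts never reach zero (the cycle keeps them alive)
gc.collect()      # the backup tracing collector reclaims the cycle
assert wa() is None
```

Because the deterministic mechanism handles most garbage up front, the tracing collector can afford to run rarely; that is roughly the overhead profile I'm describing, though CPython pays for it with per-object refcount traffic rather than programmer-placed deletes.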