A memory-intensive program that I wrote ran out of memory and threw an OutOfMemoryException. While trying to reduce memory usage, I started calling GC.GetTotalMemory(true) (to write the total memory usage to a debug file), which forces a garbage collection.

For some reason, when I call this function I no longer get an out-of-memory exception. If I remove the calls again (keeping everything else the same), the exception is thrown again. As I understand it, garbage collection runs automatically when memory pressure increases, so I don't understand this behavior.

Can anyone explain why the OutOfMemoryException is thrown only when there are no explicit calls that force a garbage collection?
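For reference, the logging call is essentially the following (the debug-file path and method name are just illustrative):

    using System;
    using System.IO;

    static class MemoryLogger
    {
        // GC.GetTotalMemory(true) forces a full collection before it reports,
        // which is the side effect discussed in this question.
        public static void LogTotalMemory(string debugFile)
        {
            long bytes = GC.GetTotalMemory(true);
            File.AppendAllText(debugFile,
                DateTime.Now + " total managed bytes: " + bytes + Environment.NewLine);
        }
    }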

Update:

I'm using VS 2010, but I'm targeting the application at .NET Framework 3.5. I believe that fragmentation of the heap is indeed causing my problems.

I did some tests: when the exception is thrown, a call to GC.GetTotalMemory tells me I am using ~800 * 10^6 bytes, while Task Manager tells me the application is using 1700 MB. A rather large discrepancy. I'm now planning to allocate memory only once, and to reuse the large arrays instead of ever deallocating them. Luckily, my program lets me do this without too much fuss.
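Roughly, the reuse plan looks like this (the element type and buffer size below are only illustrative):

    using System;

    // Rough sketch of the reuse plan: allocate the big array once and clear it
    // between uses, so nothing large is repeatedly allocated and freed on the
    // large object heap.
    class ReusableBuffer
    {
        private readonly double[] _data;

        public ReusableBuffer(int length)
        {
            _data = new double[length];              // allocated exactly once
        }

        public double[] GetCleared()
        {
            Array.Clear(_data, 0, _data.Length);     // reset instead of reallocating
            return _data;
        }
    }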

+2  A: 

Is your app running at full CPU? I'm pretty sure automatic garbage collection only occurs when the application is idle. Otherwise, you have to run a manual cycle.

Tim
It is indeed running at full CPU. But I found at other sources that "the garbage collector is optimized to perform the memory free-up at the best time based upon the allocations being made", and "you can force garbage collection [..] this is not recommended": http://www.c-sharpcorner.com/UploadFile/DipalChoksi/UnderstandingGarbageCollectioninNETFramework11292005051110AM/UnderstandingGarbageCollectioninNETFramework.aspx. I read something similar in a C# book.
willem
Calling GetTotalMemory with true probably allows a garbage collection to happen because you are requesting it explicitly, rather than the runtime waiting for idle CPU time that never comes, so the process runs out of memory. It might also be that the frequent explicit collections change the fragmentation pattern enough that it stops being a problem, since fragmentation is somewhat random in terms of whether the sequence of allocations and deallocations leaves objects in the middle of needed contiguous space.
AaronLS
A: 

GC.Collect is merely a "suggestion" to free unused memory - it does not guarantee its release.

[Edit] It appears that, while this was true of the JVM when I was learning it years ago, it may not be the case in .NET. The MSDN Library says that GC.Collect "Forces an immediate garbage collection of all generations." Good stuff (for me, anyway) about this here.
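For completeness, the commonly cited pattern for an explicit, complete collection (including objects that are waiting on finalizers) looks something like this; the wrapper class is just for illustration:

    using System;

    static class FullCollection
    {
        public static void Run()
        {
            GC.Collect();                      // collect all generations
            GC.WaitForPendingFinalizers();     // let pending finalizers release their resources
            GC.Collect();                      // collect the objects those finalizers just freed
        }
    }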

SethO
A: 

If you have unmanaged resources that occupy a lot of memory, the garbage collector won't really recognize that memory pressure. If you clean up those resources in finalizers, then forcing a collection will free those unmanaged resources, whereas if you don't force a collection, the garbage collector might not realize that it needs to collect.

If you are performing large unmanaged allocations, you can use GC.AddMemoryPressure to tell the GC about them, so that it can take that memory into account when deciding whether to run a collection.
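A minimal sketch of how AddMemoryPressure might be paired with RemoveMemoryPressure around a large unmanaged allocation; the AllocHGlobal buffer and wrapper class here are just illustrative:

    using System;
    using System.Runtime.InteropServices;

    // Report a large unmanaged allocation to the GC so it factors that memory
    // into its collection decisions, and remove the pressure when it is freed.
    sealed class UnmanagedBuffer : IDisposable
    {
        private IntPtr _ptr;
        private readonly long _size;

        public UnmanagedBuffer(long size)
        {
            _size = size;
            _ptr = Marshal.AllocHGlobal(new IntPtr(size));
            GC.AddMemoryPressure(size);
        }

        public void Dispose()
        {
            if (_ptr != IntPtr.Zero)
            {
                Marshal.FreeHGlobal(_ptr);
                GC.RemoveMemoryPressure(_size);
                _ptr = IntPtr.Zero;
            }
        }
    }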

Anon.
All unmanaged resources (a StreamReader and an external C++ DLL) are wrapped in objects that implement IDisposable and are released when Dispose is called; the objects are created in using statements, and they are out of scope when the OutOfMemoryException is thrown. That rules out this possibility, or am I missing something?
willem
Are you calling Dispose on everything that implements IDisposable? If you rely on the GC to dispose of large structures like Bitmaps, you're in for trouble. Always wrap them in a using statement.
Tim
+1  A: 

I'm fairly sure that running out of memory does not force a garbage collection. That probably sounds incredibly unintuitive, but I think it was done for a good reason: it prevents the program from entering a death spiral where it constantly tries to find more space and ends up with all objects firmly lodged in gen #2, from which it is very hard to recover.

The true argument you pass to GetTotalMemory() forces a full garbage collection. I would guess that this happens to free up enough space in the large object heap to satisfy the allocation. That will of course work only once: if your program keeps running and gobbling up memory beyond the 1.5 gigabytes or so it has already consumed, then OOM is just around the corner again, this time without any way to recover. Surviving an OOM requires drastic measures.

You'll need a good memory profiler to find out what's really going on. Unmanaged C++ in your project is always a fertile source of memory leaks, and the unmanaged kind is always hard to troubleshoot.

Hans Passant
To say that running out of memory does not force a garbage collection is partly true and partly not. Once you have run out of memory to the point that an allocation request cannot be honored, it is indeed too late. However, as you near that point, the "pressure" for more memory should trigger a garbage collection. I don't mean to split hairs, but I wanted to add that for anyone reading this.
AaronLS
The "should" clause is what my post is all about. It shouldn't.
Hans Passant
A: 

I was browsing around the Microsoft Connect site and saw bug reports where people are making the same claim you are: that an OutOfMemoryException occurs which can be resolved by periodically calling GC.Collect. In one report the lead engineer from the garbage collector team responded and said a bug was fixed in .NET 4.0 that should resolve a fragmentation issue with the large object heap. That is why I asked what version you were using.

It is certainly possible that you have stumbled upon a bug in the garbage collector. As with all GC-related issues, this could be very version dependent.

My advice would be to:

  • make sure you have the latest patches and service pack
  • refactor the code so that it is not as memory intensive
  • reuse LOH objects as much as possible instead of creating new ones
  • continue using GC.Collect at strategic points if necessary as a workaround
Brian Gideon
+1  A: 

I solved the problem with some smarter memory management, in particular by using a CustomList along the lines suggested at http://www.simple-talk.com/dotnet/.net-framework/the-dangers-of-the-large-object-heap/
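As I understand that article, the core trick is to back one large logical collection with many small arrays, so that no single allocation reaches the ~85,000-byte large-object-heap threshold. A rough sketch (the chunk size is illustrative and should be chosen with the element size in mind):

    using System.Collections.Generic;

    // Store elements in many small arrays instead of one huge one, so each
    // backing array stays below the large object heap threshold.
    class ChunkedList<T>
    {
        private const int ChunkSize = 4096;
        private readonly List<T[]> _chunks = new List<T[]>();
        private int _count;

        public int Count { get { return _count; } }

        public void Add(T item)
        {
            if (_count % ChunkSize == 0)
                _chunks.Add(new T[ChunkSize]);   // small allocation, stays off the LOH for small element types
            _chunks[_count / ChunkSize][_count % ChunkSize] = item;
            _count++;
        }

        public T this[int index]
        {
            get { return _chunks[index / ChunkSize][index % ChunkSize]; }
            set { _chunks[index / ChunkSize][index % ChunkSize] = value; }
        }
    }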

willem
Sometimes streaming data to temp files is appropriate. I experienced this same problem with the large object heap when encrypting large amounts of data, and solved it by streaming the input and output from/to temp files. The file IO is slower, but slow is better than something that occasionally crashes.
AaronLS
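A minimal sketch of that streaming approach: read the input in small fixed-size chunks and write the results to a temp file, so no large array is ever allocated. The buffer size and the pass-through "processing" step are illustrative:

    using System.IO;

    static class StreamingExample
    {
        public static string ProcessToTempFile(string inputPath)
        {
            string tempPath = Path.GetTempFileName();
            byte[] buffer = new byte[64 * 1024];   // small, reused buffer

            using (FileStream input = File.OpenRead(inputPath))
            using (FileStream output = File.Create(tempPath))
            {
                int read;
                while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Real code would transform the chunk here (e.g. encrypt it).
                    output.Write(buffer, 0, read);
                }
            }
            return tempPath;
        }
    }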