Here's my situation. I want an explanation of why this is happening. I am reading about GC here but I still don't get it.

Workstation case: When I run with workstation garbage collection, my app ramps up to use about 180MB of private bytes and about 70MB in ".NET CLR Memory # Bytes in all Heaps". Memory continues to be stable for several hours. Life is good.

Server case: When I run with server garbage collection, my app ramps up to use about 500MB of private bytes but still only about 70MB in ".NET CLR Memory # Bytes in all Heaps". An analysis of the !DumpHeap -stat output and !GCRoot shows a lot of objects without roots. Also, my private bytes increase significantly over several hours while the .NET bytes remain constant. My app DOES use a lot of unmanaged code, so I'm thinking this is related, given the difference between private and .NET bytes. But why is my life so bad in the server case?
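In case it helps, here's roughly how I'm watching those two numbers from inside the process (just a sketch - the class name and one-minute interval are arbitrary, and GC.GetTotalMemory(false) is only an approximation of the "# Bytes in all Heaps" counter):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class GcMemoryMonitor
    {
        static void Main()
        {
            // Log, once a minute, rough equivalents of the two PerfMon counters
            // being compared: "Private Bytes" and "# Bytes in all Heaps".
            while (true)
            {
                using (Process self = Process.GetCurrentProcess())
                {
                    long privateBytes = self.PrivateMemorySize64;   // ~ Private Bytes
                    long managedBytes = GC.GetTotalMemory(false);   // rough proxy for # Bytes in all Heaps

                    Console.WriteLine("{0:u} private={1:N0} managed={2:N0} serverGC={3}",
                        DateTime.UtcNow, privateBytes, managedBytes,
                        System.Runtime.GCSettings.IsServerGC);
                }

                Thread.Sleep(TimeSpan.FromMinutes(1));
            }
        }
    }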

Any GC wisdom or guidance on further investigation?

Thanks!

+4  A: 

"Server garbage collection" is designed for high-throughput applications, primarily on clustered servers.

A server GC collection is expensive and suspends the managed threads while it runs. Because of that, it takes a lot more memory pressure before a collection actually triggers - if the machine still has spare memory, don't be surprised if the garbage collector doesn't feel the need to go through and clean up yet. Server GC also creates a separate heap (and a dedicated GC thread) per logical processor and uses larger segment sizes, so private bytes can sit much higher even while "# Bytes in all Heaps" stays flat.
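If you want to see that directly, something along these lines (just a sketch - the class name, 30-second interval, and console output are arbitrary) will show how rarely collections actually fire in server mode compared to workstation mode:

    using System;
    using System.Threading;

    class GcCollectionLogger
    {
        static void Main()
        {
            while (true)
            {
                // GC.CollectionCount reports how many collections each generation
                // has had so far; under server GC these numbers tick up far less
                // often than under workstation GC for the same workload.
                for (int gen = 0; gen <= GC.MaxGeneration; gen++)
                {
                    Console.Write("gen{0}={1}  ", gen, GC.CollectionCount(gen));
                }
                Console.WriteLine("managed={0:N0}", GC.GetTotalMemory(false));

                Thread.Sleep(TimeSpan.FromSeconds(30));
            }
        }
    }

You can flip between the two modes for comparison with the <gcServer enabled="true"/> (or "false") element under <runtime> in your app.config.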

Anon.