views: 170
answers: 5
I have a .NET app that seems to have a memory leak issue (or several). The .NET service starts out around 100 MB of memory, but under load it hits around 400-500 MB. Most of my classes don't have unmanaged resources, and the ones that do already implement IDisposable. So my question is: would slapping IDisposable on my classes help?

The 400-500 MB isn't itself concerning. The concern is that there are 8 different services. Each is built using SharpArch, NServiceBus, Windsor, and NHibernate. My feeling is that something in one of these is causing a problem. My concern is that the total memory of all the services is around 3.2 to 3.6 GB out of 4 GB. It is not throwing OutOfMemory exceptions yet, but I'd like to head this off at the pass. Also, I've used dotTrace, which gives me some information; I'm just not sure how to act on that information.

+4  A: 

Short answer: No.

Longer answer: Noooooo.

I think you knew this already - if you didn't, you wouldn't have "slapped" IDisposable on the classes that did need it. IDisposable has nothing to do with the GC. The only thing that really matters here is whether you have put finalizers (~classname) on your objects needlessly - this will cause them to be backed up in the single-threaded finalizer queue before they get GC'd, regardless of whether they contain unmanaged resources or not.
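A minimal sketch of the difference, using hypothetical classes (not from your code): the first adds a finalizer "just in case", which forces every instance through the finalizer queue; the second wraps another managed disposable and needs IDisposable but no finalizer at all.

    using System;

    // Needless finalizer: the class holds no unmanaged resources, yet every
    // instance must now be queued for the single finalizer thread and survive
    // at least one extra GC cycle before its memory is reclaimed.
    public class NoisyWidget
    {
        ~NoisyWidget() { }
    }

    // Wrapping another managed disposable: implement IDisposable, no finalizer.
    public class LogWriter : IDisposable
    {
        private readonly System.IO.StreamWriter _writer;

        public LogWriter(string path)
        {
            _writer = new System.IO.StreamWriter(path);
        }

        public void Dispose()
        {
            _writer.Dispose();   // deterministic cleanup; the finalizer queue is never involved
        }
    }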

x0n
+15  A: 

If all of the classes which have unmanaged resources implement IDisposable and are properly disposed of (via using or try/finally) then adding further IDisposable implementations won't help anything.
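As a quick hedged illustration of what "properly disposed" means here ('path' is a hypothetical string variable), the using statement is just compiler shorthand for the try/finally form:

    // Both forms guarantee Dispose runs even if an exception is thrown.
    using (var stream = System.IO.File.OpenRead(path))
    {
        // ... work with the stream ...
    }

    // Equivalent expansion:
    var stream2 = System.IO.File.OpenRead(path);
    try
    {
        // ... work with the stream ...
    }
    finally
    {
        stream2.Dispose();
    }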

The first problem is that you don't know why you're leaking. Managed applications typically leak for one of the following reasons:

  1. Not properly disposing of unmanaged resources
  2. Holding onto large object graphs of managed objects

Given the information in your question it's almost certainly #2 that is causing the problem. You'll need to get a profiler or WinDbg to tell you what the actual leak is and which rooted objects are causing it.
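To make reason #2 concrete, here is a hedged sketch (hypothetical types, not from your code) of a purely managed "leak": a static event keeps every subscriber, and everything the subscribers reference, reachable until someone unsubscribes.

    using System;

    public static class MessageBus
    {
        public static event EventHandler MessageReceived;

        public static void Publish()
        {
            EventHandler handler = MessageReceived;
            if (handler != null)
                handler(null, EventArgs.Empty);
        }
    }

    public class RequestHandler
    {
        private readonly byte[] _buffer = new byte[1024 * 1024];   // 1 MB rooted per instance

        public RequestHandler()
        {
            MessageBus.MessageReceived += OnMessage;   // the static event now roots 'this'
        }

        private void OnMessage(object sender, EventArgs e) { /* handle message */ }

        // Without this call, every handler ever constructed (and its 1 MB buffer)
        // stays reachable for the lifetime of the process.
        public void Detach()
        {
            MessageBus.MessageReceived -= OnMessage;
        }
    }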

Here is a great article by Rico to get you started

JaredPar
+5  A: 

The answer is, almost certainly not. Have you verified that you're actually holding onto memory you shouldn't be, by profiling your service?

Bear in mind that the garbage collector may not necessarily release memory until it needs to, so it may not be unusual for it to hit 400-500 MB allocated. The point at which I'd become concerned would be when, after a [insert reasonable period of time here] of usage, it has crept up further and hit 1 GB even though it's not been under any higher level of load.
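One hedged way to check whether the number you're looking at is live data or merely garbage the GC hasn't bothered to collect yet is GC.GetTotalMemory. Note this measures managed heap allocations only, not the overall process size shown in Task Manager:

    long allocated = System.GC.GetTotalMemory(false);   // includes garbage not yet collected
    long live      = System.GC.GetTotalMemory(true);    // waits for a full collection first
    System.Console.WriteLine("Allocated: {0:N0} bytes, live after full GC: {1:N0} bytes",
                             allocated, live);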

Rob
+13  A: 

My first concern would be to ensure that you are measuring something relevant. "Memory" can mean a lot of different things. There is an enormous difference between running out of virtual memory space and running out of RAM. There is an enormous difference between a performance problem caused by thrashing the page file and a performance problem caused by creating too much GC pressure.

If you don't understand what the relationships are between RAM, virtual memory, working set and the page file then start by doing some reading until you understand all that stuff. The way you phrased the question leads me to suspect that you believe that virtual memory and RAM are the same thing. They certainly are not.
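As a concrete, hedged illustration (the MemoryNumbers class is hypothetical; the properties are standard System.Diagnostics.Process members), the same process reports several different "memory" numbers, and they answer different questions:

    using System;
    using System.Diagnostics;

    class MemoryNumbers
    {
        static void Main()
        {
            Process p = Process.GetCurrentProcess();

            // Address space the process has reserved/committed -- what an OOM is about.
            Console.WriteLine("Virtual size:  {0:N0} bytes", p.VirtualMemorySize64);

            // Pages currently resident in physical RAM -- what thrashing is about.
            Console.WriteLine("Working set:   {0:N0} bytes", p.WorkingSet64);

            // Committed memory private to this process (perfmon's "Private Bytes").
            Console.WriteLine("Private bytes: {0:N0} bytes", p.PrivateMemorySize64);
        }
    }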

I suspect that the arithmetic you are doing is:

  • I have eight processes that each consume 500 million bytes of virtual address space
  • I have four billion bytes of RAM
  • Therefore I am about to get an OutOfMemory exception

That syllogism is completely invalid. It amounts to this syllogism:

  • I have eight quarts of ice cream
  • I have room for nine quarts of ice cream in the freezer
  • Therefore if I get two more quarts of ice cream, something is going to melt

when in fact you have an entire warehouse-sized cold storage facility next door. Remember, RAM is just a convenient fast way to store stuff near where you need it, like your fridge. If you have more stuff that needs to be stored, who cares if you run out of room locally? You can always pop next door and put the stuff you use less frequently in long term deep freeze -- the page file. That's less convenient, but nothing melts.

You get an "out of memory" exception when a process runs out of virtual address space, not when all the RAM in the system is consumed. When all the RAM in the system is consumed, you don't get an error, you get crap performance because the operating system is spending all of its time running stuff back and forth from disk.
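A hedged sketch of that distinction: in a 32-bit process, a loop like the following dies with OutOfMemoryException once the user-mode address space (or a large enough contiguous block of it) is gone, no matter how much physical RAM is still free.

    var blocks = new System.Collections.Generic.List<byte[]>();
    try
    {
        while (true)
            blocks.Add(new byte[100 * 1024 * 1024]);   // keep 100 MB blocks reachable
    }
    catch (System.OutOfMemoryException)
    {
        System.Console.WriteLine("OOM after roughly {0} MB of reachable allocations",
                                 blocks.Count * 100);
    }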

So, anyway, start by understanding what you are measuring and how memory in Windows works. What you should actually be looking for is:

  • Is any process in danger of using more than two billion bytes of virtual memory on a 32-bit system? A process only gets 2 GB of virtual memory (not RAM, remember; virtual memory has nothing to do with RAM: that's why it's called "virtual" -- it isn't hardware) on Win32 that is addressable by user code; you'll get an OOM if you try to use more.

  • Is any process in danger of attempting to allocate a huge block of virtual memory such that there is no contiguous block of that size free? Are you likely to be allocating ten million bytes of data in a single array, for example? Again, OOM.

  • Is the working set -- that is, the virtual memory pages of a process that are required to be in RAM for performance reasons -- of all processes smaller than the amount of RAM available? If not, then soon you'll get thrashing, but not an OOM.

  • Is your page file big enough to handle the virtual memory pages that could be paged out to disk if RAM starts to get short?

So far none of this has anything to do with .NET. Once you've actually determined that there is a real problem - there might not be - then start investigating based on what the real problem is. Use a memory profiler to examine what the memory allocator and garbage collector are doing. See if there are huge blocks in the large object heap, or unexpectedly big graphs of live objects that cannot be collected, or what. But use good engineering principles: understand the system, use tools to investigate the actual empirical performance, experiment with changes and carefully measure their results. Don't just start randomly slapping magic IDisposable interfaces on a few classes and hope that doing so makes the problem -- if there is one -- go away.

Eric Lippert
+2  A: 

Measure, measure, measure

If you want to cut down memory usage for your app you need to first determine where it is being used.

You can get a rough idea by adding a few perfmon counters such as "Private Bytes", "# Bytes in all Heaps", "Large Object Heap size", "Gen 1 heap size", "Gen 2 heap size" and so on.
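A hedged sketch of reading a few of those counters from code (category and counter names as they appear in perfmon for .NET Framework processes; the instance name "MyService" is hypothetical):

    using System;
    using System.Diagnostics;

    class CounterSnapshot
    {
        static void Main()
        {
            string instance = "MyService";   // hypothetical perfmon instance name

            var privateBytes = new PerformanceCounter("Process", "Private Bytes", instance);
            var allHeaps     = new PerformanceCounter(".NET CLR Memory", "# Bytes in all Heaps", instance);
            var loh          = new PerformanceCounter(".NET CLR Memory", "Large Object Heap size", instance);
            var gen2         = new PerformanceCounter(".NET CLR Memory", "Gen 2 heap size", instance);

            Console.WriteLine("Private Bytes:        {0:N0}", privateBytes.NextValue());
            Console.WriteLine("# Bytes in all Heaps: {0:N0}", allHeaps.NextValue());
            Console.WriteLine("Large Object Heap:    {0:N0}", loh.NextValue());
            Console.WriteLine("Gen 2 heap size:      {0:N0}", gen2.NextValue());
        }
    }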

If you determine you are using too much managed memory, you can break down the usage even further using a tool like .NET Memory Profiler, or the very flexible (but harder to use) WinDbg + SOS.

Once you isolate where your memory is going, you can look at strategies to reduce usage; it could be something as simple as replacing a Dictionary with a Cache or adding a StringBuilder.
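For instance, a hedged sketch of the Dictionary-to-Cache change (LookupService is hypothetical; MemoryCache lives in System.Runtime.Caching, available from .NET 4): an unbounded static Dictionary roots every value for the life of the process, whereas a cache with an expiration policy lets entries go.

    using System;
    using System.Runtime.Caching;

    class LookupService
    {
        // Problem pattern: a static Dictionary<string, byte[]> used as a cache keeps
        // every value reachable forever.

        // Replacement: MemoryCache evicts entries on expiration and under memory pressure.
        static readonly MemoryCache Cache = MemoryCache.Default;

        public static byte[] GetReport(string key, Func<byte[]> load)
        {
            var cached = Cache.Get(key) as byte[];
            if (cached != null)
                return cached;

            byte[] value = load();
            Cache.Add(key, value, DateTimeOffset.Now.AddMinutes(10));   // bounded lifetime
            return value;
        }
    }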

The solution is very unlikely to be sprinkling IDisposable on everything.

Sam Saffron
@Sam: I'd argue that before your first step, it's useful to decide whether or not you really have a problem.
John Saunders
@John totally agree with you on that
Sam Saffron