views: 209
answers: 5

We have a 64-bit C#/.NET 3.0 application that runs on a 64-bit Windows server. From time to time the app uses a large amount of memory, and that memory is available on the machine. In some instances the application stops allocating additional memory and slows down significantly (500+ times slower). When I check memory in the Task Manager, the amount used barely changes. The application keeps running very slowly and never throws an out-of-memory exception. Any ideas? Let me know if more data is needed.

+5  A: 

The moment you hit the physical memory limit, the OS will start paging (that is, writing memory out to disk). This will indeed cause the kind of slowdown you are seeing.

Solutions?

  • Add more memory - this will only help until you hit the new memory limit
  • Rewrite your app to use less memory
  • Figure out if you have a memory leak and fix it

If memory is not the issue, perhaps your application is hitting the CPU very hard? Do you see the CPU getting close to 100%? If so, check for large collections that are being iterated over and over.

Oded
It appears from the question that there is sufficient physical memory available but the app is for some reason not using it...
kzen
Nope, the CPU doesn't get fully utilized. On one machine it was using 2.8-3 GB of 8 GB of memory, and in another case it was using 7-8 GB of 16 GB.
derdo
+2  A: 

As with 32-bit Windows operating systems, there is a 2 GB limit on the size of any single object you can create, even while running a 64-bit managed application on a 64-bit Windows operating system.

Investigating Memory Issues (MSDN article)
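To see that limit in practice, here is a minimal sketch: a double takes 8 bytes, so an array of 268,435,456 elements is already 2 GB of data before the array's own overhead, which pushes a single allocation over the per-object cap even in a 64-bit process (later .NET versions added a gcAllowVeryLargeObjects setting to relax this; the element count below is just an illustrative figure):

```csharp
using System;

class SingleObjectLimit
{
    static void Main()
    {
        try
        {
            // 268,435,456 doubles * 8 bytes = 2 GB of element data,
            // plus array overhead -> exceeds the 2 GB per-object cap.
            double[] huge = new double[268435456];
            Console.WriteLine("Allocated " + huge.Length + " elements.");
        }
        catch (OutOfMemoryException)
        {
            // Thrown because of the per-object size limit,
            // not necessarily because physical RAM is exhausted.
            Console.WriteLine("Hit the single-object 2 GB limit.");
        }
    }
}
```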

kzen
The OP stated 64bit OS and 64bit app.
Oded
The limit still applies... "...while running a 64-bit managed application on a 64-bit Windows operating system."
kzen
In some ways the limit is even more constraining, since a reference in a 64-bit app takes 8 bytes versus only 4 bytes in 32-bit... In other words, you can hold only half the number of references in a 64-bit app that you could in a 32-bit one...
kzen
We don't have any objects that are that big. We just have many of them. We create and dispose of many small objects (arrays of double).
derdo
+3  A: 

You might try enabling server mode for the Garbage Collector. By default, all .NET apps run in Workstation Mode, where the GC tries to do its sweeps while keeping the application running. If you turn on server mode, it temporarily stops the application so that it can free up memory (much) faster, and it also uses different heaps for each processor/core.

Most server apps will see a performance improvement using the GC server mode, especially if they allocate a lot of memory. The downside is that your app will basically stall when it starts to run out of memory (until the GC is finished).

To enable this mode, insert the following into your app.config or web.config:

<configuration>
   <runtime>
      <gcServer enabled="true"/>
   </runtime>
</configuration>
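If you want to confirm at runtime which mode you actually got (a typo in the config fails silently), the GCSettings class in the System.Runtime namespace reports it; this is a small sketch of such a check:

```csharp
using System;
using System.Runtime;

class GcModeCheck
{
    static void Main()
    {
        // Prints True when <gcServer enabled="true"/> took effect,
        // False when the app is still running the workstation GC.
        Console.WriteLine("Server GC: " + GCSettings.IsServerGC);
    }
}
```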
Aaronaught
I had difficulty finding how to set server mode for the garbage collector on the web. If you know how, could you add it to the answer?
derdo
@derdo: It's explained in the link, but I've reprinted it in this answer.
Aaronaught
GC server mode helped us a great deal. We also used a profiler to identify places where we were allocating many GB of memory per second. We made some improvements on that front too and eventually overcame this problem. Thanks for everybody's responses.
derdo
+1  A: 

There is an awful lot of good stuff mentioned in the other answers. However, I'm going to chip in my two pence (or cents - depending on where you're from!) anyway.

Assuming that this is indeed a 64-bit process as you have stated, here's a few avenues of investigation...

Which memory usage are you checking? Mem Usage or VMem Size? VMem size is the one that actually matters, since that applies to both paged and non-paged memory. If the two numbers are far out of whack, then the memory usage is indeed the cause of the slow-down.

What's the actual memory usage across the whole server when things start to slow down? Does the slow down also apply to other apps? If so, then you may have a kernel memory issue - which can be due to huge amounts of disk accessing and low-level resource usage (for example, create 20000 mutexes, or load a few thousand bitmaps via code that uses Win32 HBitmaps). You can get some indication of this on the Task Manager (although Windows 2003's version is more informative directly on this than 2008's).

When you say that the app gets significantly slower, how do you know? Are you using vast dictionaries or lists? Could it simply be that the internal data structures are getting so big that they complicate the work any internal algorithms are performing? When you get to huge numbers, some algorithms can become slower by orders of magnitude.

What's the CPU load of the application when it's running at full pelt? Is it actually the same as when the slow-down occurs? If the CPU usage decreases as the memory usage goes up, then whatever the app is doing is taking the OS longer to fulfill, meaning that it's probably putting too much load on the OS. If there's no difference in CPU load, then my guess is it's internal data structures getting so big as to slow down your algos.

I would certainly be looking at running a Perfmon on the application - starting off with some .Net and native memory counters, Cache hits and misses, and Disk Queue length. Run it over the course of the application from startup to when it starts to run like an asthmatic tortoise, and you might just get a clue from that as well.

Andras Zoltan
+1  A: 

Having skimmed through the other answers, I'd say there's a lot of good ideas. Here's one I didn't see:

Get a memory profiler, such as SciTech's MemProfiler. It will tell you what's being allocated, by what, and it will show you the whole slice n dice.

It also has video tutorials in case you don't know how to use it. In my case, I discovered I had IDisposable instances that I wasn't disposing with using(...).
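For reference, wrapping an IDisposable in a using block guarantees Dispose() runs even when an exception escapes, so unmanaged resources are released promptly instead of waiting on the GC. A small sketch (the method and path parameter are illustrative):

```csharp
using System.IO;

class DisposalExample
{
    static string ReadFirstLine(string path)
    {
        // Dispose() is called automatically when the block exits,
        // whether normally or via an exception, releasing the
        // underlying file handle immediately.
        using (StreamReader reader = new StreamReader(path))
        {
            return reader.ReadLine();
        }
    }
}
```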

Carlos