tags:
views: 92
answers: 4

I need to load large numbers of bitmaps into memory for display in a WPF app (using .NET 4.0). Where I run into trouble is when I approach around 1,400 MB of memory (I am getting this figure from the process list in Task Manager).

The same thing happens whether the app is run on a machine with 4 GB of memory or 6 GB (and some other configurations that I do not have the details on). It is easy to test by reducing the number of images loaded: when it works on one machine it works on them all, and when it crashes on one it crashes on all.

When I reduce the image count enough for the app to load without the memory exception, I can run multiple instances of the app (together exceeding the 1.4 GB of the single instance) without the problem, so it appears to be some per-instance limit, or a per-instance error on my part.

I load the images as BitmapImage objects and store them either in a List&lt;BitmapImage&gt; or in a List&lt;byte[]&gt;, where they are later used in a bunch of layered sequences (using a WriteableBitmap).

The error occurs when I load the images, not while they are in use. In the repeatable case I load 600 640x640 images plus another 200-300 smaller images ranging from 100x100 to 200x200, although it appears to be the overall bit count that is the problem.
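For reference, a minimal sketch of the kind of up-front loading described above (the file paths and helper name are hypothetical; the question does not show its actual code):

```csharp
using System;
using System.Collections.Generic;
using System.Windows.Media.Imaging;

static class ImageLoader
{
    // Decode every file immediately and keep the frozen bitmaps in a list.
    static List<BitmapImage> LoadAll(IEnumerable<string> paths)
    {
        var images = new List<BitmapImage>();
        foreach (string path in paths)
        {
            var bmp = new BitmapImage();
            bmp.BeginInit();
            bmp.CacheOption = BitmapCacheOption.OnLoad; // decode fully at load time
            bmp.UriSource = new Uri(path);
            bmp.EndInit();
            bmp.Freeze();    // immutable, shareable across threads
            images.Add(bmp); // every decoded frame stays resident in memory
        }
        return images;
    }
}
```

With this pattern every image is decoded to its full uncompressed size at load time, which is why the failure appears during loading rather than during use.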

So my questions are:

  • Is there some built-in per-process memory limit in a situation like this?

  • Is there a better technique for loading large amounts of image data into memory?

Thanks, Brian

+1  A: 

The following may be the cause, but I am not sure.

The problem is not loading a large amount of data as such: the CLR keeps objects greater than 85 KB on a Large Object Heap, and you don't have any direct control over freeing that heap.

These objects become long-lived and are normally only deallocated when the AppDomain unloads.

I would suggest trying to load the larger images in another AppDomain and using that AppDomain to manipulate them.

See this MSDN entry on profiling the GC.

See if Memory Mapped Files help, since you are using .NET 4.0.

And a further example.
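The AppDomain suggestion above could be sketched roughly like this (the `ImageWorker` type and folder path are hypothetical; note that WPF bitmap objects themselves cannot be marshalled across domains, so the worker must do its processing in-place):

```csharp
using System;

// Runs inside the second AppDomain; must derive from MarshalByRefObject
// so the main domain talks to it through a proxy.
public class ImageWorker : MarshalByRefObject
{
    public void ProcessImages(string folder)
    {
        // Load and manipulate the large bitmaps here.
        // All Large Object Heap allocations belong to this domain.
    }
}

public static class Program
{
    public static void Main()
    {
        AppDomain worker = AppDomain.CreateDomain("ImageWork");
        var proxy = (ImageWorker)worker.CreateInstanceAndUnwrap(
            typeof(ImageWorker).Assembly.FullName,
            typeof(ImageWorker).FullName);
        proxy.ProcessImages(@"C:\images");   // hypothetical path
        AppDomain.Unload(worker);            // frees the domain's heap segments
    }
}
```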

saurabh
I agree. Seems like memory fragmentation issue.
Nayan
+1  A: 

An x86 build can access 4 GB on 64-bit Windows, so that's the theoretical upper limit for the process. This requires the application to be large address aware. Additionally, .NET imposes a 2 GB limit on any single object.

You may be suffering from LOH fragmentation. Objects larger than 85,000 bytes are stored on the Large Object Heap, a special part of the managed heap that doesn't get compacted.

You say that the images are 640x640, but what is the pixel format, and is there a mask as well? With one byte per color channel plus one byte for the alpha channel, each picture is 640x640x4 bytes, roughly 1.6 MB uncompressed, so trying to load 600 of them at once will be a problem in a 32-bit process.
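The back-of-the-envelope arithmetic (using the figures from the question) looks like this:

```csharp
using System;

class MemoryEstimate
{
    static void Main()
    {
        // 640x640 pixels, 4 bytes per pixel (B, G, R, A)
        long bytesPerImage = 640L * 640 * 4;      // 1,638,400 bytes each
        long total = bytesPerImage * 600;         // 983,040,000 bytes
        Console.WriteLine(total / (1024 * 1024)); // ~937 MiB for the large images alone
    }
}
```

That is most of a 32-bit process's default budget before counting the smaller images, the byte[] copies, the WriteableBitmap layers, and the framework's own allocations.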

Brian Rasmussen
Brian, can you please provide an MSDN link which states that ".NET imposes a 2 GB limit on a single object"? Thanks!
Nayan
Got it. http://blogs.msdn.com/b/joshwil/archive/2005/08/10/450202.aspx
Nayan
Related question http://stackoverflow.com/q/1087982/38206
Brian Rasmussen
I was able to get around the issue by using the solution provided by Arun below (see LARGEADDRESSAWARE). It sounds like I should switch to a 64-bit application when using as many images as I am. Yes, they are 32-bits-per-pixel images.
Brian
@Brian: For some odd reason I assumed you had that part already. My mistake. I have updated my answer to include large address aware. Thanks.
Brian Rasmussen
A: 

You're running into the limitation of 32-bit processes, which by default can only access about 2 GB of data. If you were to run 64-bit you wouldn't have these issues. There are a number of ways to work around it, some of which are:

  • Simply don't load that much data; load only when needed, and use caching.
  • Use memory-mapped files to map whole chunks of data into memory. Not recommended, as you'll have to do all the memory management yourself.
  • Use multiple processes to hold the data and use an IPC mechanism to bring over only the data you need, similar to item 1.
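The memory-mapped-file option (available in .NET 4.0 via `System.IO.MemoryMappedFiles`) might be sketched like this — the map name, frame layout, and counts are assumptions for illustration:

```csharp
using System;
using System.IO.MemoryMappedFiles;

class MappedFrames
{
    static void Main()
    {
        const long frameBytes = 640L * 640 * 4;  // one 640x640 32-bpp frame
        const int frameCount = 600;

        // Back all frames with a memory-mapped file instead of the managed heap,
        // then map only the frame currently needed into a view.
        using (var mmf = MemoryMappedFile.CreateNew("frames", frameBytes * frameCount))
        using (var view = mmf.CreateViewAccessor(0, frameBytes)) // view over frame 0
        {
            byte[] pixels = new byte[frameBytes];
            view.ReadArray(0, pixels, 0, pixels.Length); // copy one frame out
            // 'pixels' could now be handed to WriteableBitmap.WritePixels(...)
        }
    }
}
```

The trade-off, as noted above, is that you take over the bookkeeping: deciding which frames are mapped, copying pixels in and out, and invalidating stale views yourself.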
Joel Lucsy
+1  A: 

Yes, there is a per-process limit on memory allocations. One solution is to mark your binary LARGEADDRESSAWARE (for example with `editbin /LARGEADDRESSAWARE` as a post-build step) so it can use more memory.

Refer to "Out of memory? Easy ways to increase the memory available to your program"; it has a great discussion of solutions to this.

Arun Mahapatra
Thanks, this solution solved the problem. I had the same 1.4 GB limit the article mentions. Now I am close to 2 GB without a problem using the LARGEADDRESSAWARE setting.
Brian