views: 95
answers: 6

I have an application written using .NET 3.5 SP1 that downloads images from an external site and displays them to end users. On rare occasions, my users are experiencing OutOfMemory errors because they are downloading enormous images. Sometimes the raw data associated with these images is large, but more often, the dimensions of the images are huge. I realize that I may never be able to get around the fact that these OOM errors are thrown for particular images. It would be VERY helpful, however, if I could somehow determine whether loading a particular image would lead to an OOM issue before I try to load the image.

The data for the images is loaded into a Stream, and then the image itself is turned into a System.Drawing.Image by making a call to System.Drawing.Image.FromStream(stream). I do not have the option of storing these images on disk first. They must be loaded via memory.

If anyone has any tips or suggestions that would allow me to detect that loading an image would lead to an OOM exception, I would very much appreciate it.

+1  A: 

OutOfMemory is one of those exceptions where you don't have a lot of good options. Anything that would predict conclusively that you're going to get the exception would probably have to just generate the exception.

I would say that your best bet is to profile the behaviour and create your own predictive ruleset or simply hard code maximum sizes into your application. It ain't pretty, but it'll get you there.

Mike Burton
+1  A: 

You might look at this question and see if it helps: http://stackoverflow.com/questions/552467

The idea there would be to download only part of the total image (specifically the header) so you can read the metadata. Then you can use that information to determine how big the image is, and abort the download if you see it will be too big.

On the downside, it seems like you'd have to write a method to decompose the binary of each file type you want to be able to handle.
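As a rough illustration of the header-parsing approach, here is a sketch that reads PNG dimensions from the first 24 bytes of a download. This only covers PNG; JPEG, GIF, and BMP would each need their own parser, and the method name and buffer handling here are my own assumptions, not code from the linked question.

```csharp
// Sketch: extract PNG dimensions from the first 24 bytes so oversized
// images can be rejected before the full download completes.
// PNG layout: 8-byte signature, then the IHDR chunk, whose width and
// height are big-endian 32-bit ints at byte offsets 16 and 20.
static bool TryGetPngSize(byte[] header, out int width, out int height)
{
    width = height = 0;
    if (header.Length < 24 ||
        header[1] != (byte)'P' || header[2] != (byte)'N' || header[3] != (byte)'G')
        return false;

    width  = (header[16] << 24) | (header[17] << 16) | (header[18] << 8) | header[19];
    height = (header[20] << 24) | (header[21] << 16) | (header[22] << 8) | header[23];
    return true;
}
```

With the dimensions in hand, width * height * 4 gives a rough estimate of the decoded 32bpp bitmap size, which is what actually matters for memory, not the compressed byte count.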

CodexArcanum
Yeah, we were thinking about using this approach. Our biggest concern is still that we don't have a good way to accurately predict how much memory System.Drawing.Image.FromStream will use. We need to support any image type that .NET natively supports. On top of needing to decompose the binary of each file type .NET supports, we'd need to somehow come up with a method that accurately predicts the amount of memory used when those images are loaded. This seems like something an entire team of developers could be working on for months! :(
Keith
+1  A: 

You've got a chicken-and-egg problem. To make any kind of guess, you need to know the size of the image. You don't know the size until you've loaded it.

It isn't really helpful anyway. Whether you get OOM really depends on how fragmented the virtual memory address space has gotten. And that is not easy to find out in Windows. The HeapWalk() API function is required and that's an unhealthy function to use. Check out the small print in the MSDN library article for it. Especially bad in a managed program, don't use it.

Note that this OOM exception is not the same kind of OOM you'd get when you used up too much managed memory. It is actually a GDI+ exception and you can easily recover from it. Just catch the exception and display a "Sorry, couldn't do it" message.

If you do somehow know the size up front, then you can pretty safely assume that width * height * 4 > 550 MB is not going to work in a 32-bit program. This limit goes down quickly after the program has been running for a while.
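Putting the two points above together, a minimal sketch might combine an up-front dimension check with a recoverable catch of the GDI+ exception. The budget constant and method name here are placeholders, not values from the answer:

```csharp
// Sketch: reject images whose decoded bitmap (~width * height * 4 bytes
// at 32bpp) exceeds a budget, and treat the GDI+ OutOfMemoryException as
// recoverable when decoding still fails.
const long MaxDecodedBytes = 256L * 1024 * 1024; // placeholder budget

static System.Drawing.Image TryDecode(System.IO.Stream stream, int width, int height)
{
    if ((long)width * height * 4 > MaxDecodedBytes)
        return null; // too big to even attempt

    try
    {
        return System.Drawing.Image.FromStream(stream);
    }
    catch (OutOfMemoryException)
    {
        // GDI+ also raises this for corrupt or unsupported data;
        // the process remains healthy, so just report failure.
        return null;
    }
}
```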

Hans Passant
+1  A: 

If you're downloading the images from an external site, and the external site sets the Content-Length HTTP header, you might be able to estimate if the image is going to fit in memory before you even start downloading the stream...
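A sketch of that first-line filter, assuming the server answers HEAD requests (HttpWebRequest is the relevant API on .NET 3.5):

```csharp
// Sketch: issue a HEAD request and read Content-Length before starting
// the real download. Returns -1 if the server didn't send the header.
static long GetContentLength(string url)
{
    var request = (System.Net.HttpWebRequest)System.Net.WebRequest.Create(url);
    request.Method = "HEAD";
    using (var response = (System.Net.HttpWebResponse)request.GetResponse())
    {
        return response.ContentLength;
    }
}
```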

Joel Mueller
We do get the content-length header. However, the sheer physical size of the data is not a great predictor of how much memory loading a file will use. I have seen files that are less than 5MB that chew up over 800MB when loaded because their dimensions are so large.
Keith
+4  A: 

You can use the MemoryFailPoint class to check for memory availability.
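For example, something along these lines (the 64 MB estimate is a placeholder you would tune to your own images, and the method name is my own):

```csharp
// Sketch: gate the decode behind a MemoryFailPoint. If the estimated
// memory can't be reserved, InsufficientMemoryException is thrown
// *before* any work starts, leaving the process in a recoverable state.
using System;
using System.IO;
using System.Runtime;

static System.Drawing.Image LoadImageSafely(Stream stream)
{
    try
    {
        using (new MemoryFailPoint(64)) // estimated MB for the operation
        {
            return System.Drawing.Image.FromStream(stream);
        }
    }
    catch (InsufficientMemoryException)
    {
        return null; // show a "couldn't load this image" placeholder
    }
}
```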

Best

Vagaus
Please everyone read how this works, it is by FAR the best way to deal with large memory allocations in a "try/catch" manner.
Spence
I'll look into this. A quick glance makes me think it might not be what I need as the documentation says I would need to specify "the number of megabytes of memory that the operation is expected to use". The problem is that I don't know how much memory GDI+ is going to use when it loads the image using System.Drawing.Image.FromStream.
Keith
+1  A: 

I have already agreed with @Vagaus's answer, but I wanted to add that you should only allocate the buffer once and try to reuse it. If you are constantly allocating and releasing a large buffer, you will definitely hit an OOM issue due to fragmentation on the heap.
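A minimal sketch of that pattern, with the class shape and 1 MB buffer size as my own assumptions:

```csharp
// Sketch: allocate one buffer up front and reuse it for every download,
// instead of a fresh large array per image (which fragments the heap,
// especially the large object heap for arrays over ~85 KB).
using System.IO;

class ImageDownloader
{
    private readonly byte[] _buffer = new byte[1024 * 1024]; // reused

    public MemoryStream Download(Stream source)
    {
        var result = new MemoryStream();
        int read;
        while ((read = source.Read(_buffer, 0, _buffer.Length)) > 0)
            result.Write(_buffer, 0, read);
        result.Position = 0;
        return result;
    }
}
```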

Spence
We do make use of this technique when we're downloading these images. However, I'm pretty sure that System.Drawing.Image.FromStream makes use of its own internal buffer, which leads to heap fragmentation anyway.
Keith
Are you calling ALL the disposers on the GDI stuff? I've been burned before, almost every little class in the GDI+ libraries requires a dispose call to release the unmanaged memory. Perhaps you are leaking memory because of this, causing the OOM?
Spence
I'm confident that we're disposing all of our objects properly. We have users who can spend hours looking at normal-sized images and the memory footprint remains steady. There's just the occasional image with huge dimensions that gobbles up all the remaining memory.
Keith