Hello everybody,

I stumbled upon an OutOfMemoryException when working with lots of pictures (sequentially, not in parallel). I reproduced the behavior with a small piece of code like this:

class ImageHolder
{
    public Image Image;

    ~ImageHolder()
    {
        Image.Dispose();
    }
}

public partial class Form1 : Form
{
    public Form1()
    {
        InitializeComponent();
    }

    private void Form1_Load(object sender, EventArgs e)
    {
        for (int i = 0; i < 1000; i++)
        {
            ImageHolder h = new ImageHolder() { Image = new Bitmap(1000, 1000) };
        }
    }
}

The memory usage rises and rises until I get an exception (sometimes an ArgumentException, sometimes an OutOfMemoryException).

My question is NOT what I can do about this (I could implement IDisposable in ImageHolder and use a using block, for example).

My question is rather: Why doesn't garbage collection destroy my objects of type ImageHolder (the destructor is never called), even though there are no references to them and I'm running out of memory?

Thanks for an explanation,

Philipp

+7  A: 

The Bitmap class is a managed wrapper around a big chunk of unmanaged code called GDI+. The wrapper itself uses very little memory; the actual bitmap pixels are stored by the unmanaged code in unmanaged memory. The garbage collector cannot touch that memory, it can only see the wrapper. This is also why Bitmap has a Dispose() method: it frees that unmanaged memory.

The OOM you get is GDI+ telling the wrapper that it can't allocate unmanaged memory anymore. Or an ArgumentException, when GDI+ decides that the Width or Height you passed is too large instead of throwing OOM. GDI+ is notorious for throwing uninformative exceptions.

The finalizer isn't called because your program bombs on the GDI+ exception first. The allocation that failed was not on the garbage-collected heap; it was the unmanaged code that could no longer allocate. And since a thousand tiny wrapper objects put almost no pressure on the managed heap, the garbage collector sees no reason to run, and finalizers only run after a collection.
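You can see this for yourself with a quick diagnostic (this is a demonstration of the mechanism, not a recommended fix): forcing a collection inside the loop makes the finalizers run, so the unmanaged pixel memory is reclaimed and the loop completes:

    for (int i = 0; i < 1000; i++)
    {
        new ImageHolder() { Image = new Bitmap(1000, 1000) };
        // Force a collection and wait for the finalizer queue to drain.
        // Without this, GDI+ runs out of unmanaged memory long before
        // the managed heap feels any pressure.
        GC.Collect();
        GC.WaitForPendingFinalizers();
    }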

The finalizer code is also wrong: by the time your finalizer runs, the bitmap has probably already been finalized itself, since finalization order is not guaranteed. You must instead have ImageHolder implement IDisposable, like this:

    class ImageHolder : IDisposable {
        public Image Image;

        public void Dispose() {
            if (Image != null) {
                Image.Dispose();
                Image = null;
            }
        }
    }

Now you've got a shot at preventing OOM:

        for (int i = 0; i < 1000; i++) {
            using (var h = new ImageHolder() { Image = new Bitmap(1000, 1000) }) { 
                // do something with h
                //...
            }
        }

If you really do need to store a thousand of those large images then you'll need a machine that can provide 1000 x 1000 x 1000 x 4 = 4 gigabytes of virtual memory. That's possible, a 64-bit operating system can give you that.

The general rule of thumb that keeps you out of trouble like this is that it is extremely rare to need to implement your own destructor. It is the job of the .NET classes to provide wrappers around unmanaged resources, like Bitmap does. Those wrapper classes have a finalizer; you don't need to (and should not) provide your own. The 99.99% case is that you need to implement IDisposable so you can call Dispose() on the .NET class instances. And even if you are tempted to manage an operating system resource yourself, you still shouldn't write a finalizer: use one of the SafeHandle wrappers instead.
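To sketch that last point: a SafeHandle-derived type carries its own critical finalizer, so your class only needs Dispose(). The P/Invoke declaration and constants below are standard Win32; the wrapper class itself is a hypothetical example.

    using System;
    using System.Runtime.InteropServices;
    using Microsoft.Win32.SafeHandles;

    // Hypothetical wrapper around a raw Win32 file handle.
    class NativeFile : IDisposable
    {
        private readonly SafeFileHandle handle;

        [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
        private static extern SafeFileHandle CreateFile(
            string fileName, uint access, uint share, IntPtr security,
            uint creation, uint flags, IntPtr template);

        public NativeFile(string path)
        {
            // 0x80000000 = GENERIC_READ, 1 = FILE_SHARE_READ, 3 = OPEN_EXISTING
            handle = CreateFile(path, 0x80000000, 1, IntPtr.Zero, 3, 0, IntPtr.Zero);
            if (handle.IsInvalid)
                throw new System.ComponentModel.Win32Exception();
        }

        public void Dispose()
        {
            // No finalizer here: SafeFileHandle's own critical finalizer
            // closes the handle even if Dispose() is never called.
            handle.Dispose();
        }
    }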

Hans Passant
+1 for explaining why finalizers are almost always wrong. Personally I only use them in debug code to notify me if I forgot to Dispose something.
CodeInChaos