I have a class A providing Bitmaps to other classes B, C, etc.

Now class A holds its bitmaps in a ring queue, so after a while it loses its reference to a given bitmap.

While a Bitmap is still in the queue, it can be checked out by several classes, so B and C may both hold a reference to the same Bitmap. But it can also happen that only one of them checked it out, or neither did.

I would like to dispose of the bitmap when it's not being needed any more by either A, B or C.

I suppose I have to make B and C responsible for somehow signaling when they're finished using it but I'm not sure about the overall logic.

Should it be a call to something like DisposeIfNowOrphan() that would be made, in this example, three times:

1 - when the Bitmap gets kicked out of the queue in class A

2 - when B is finished with it

3 - when C is finished with it

If that's the best strategy, how can I evaluate the orphan state?

Any advice would be most welcome.

+2  A: 

Have class A provide a wrapper class instead of the Bitmap directly. The wrapper class should implement IDisposable itself and can be used to maintain a counter. Each consumer gets its own wrapper that references the same Bitmap. Class A keeps track of all bitmaps and all wrappers. Use a WeakReference in class A to keep track of the wrappers, so that if a consumer doesn't call Dispose, the wrapper gets GC'd anyway and the provider can know it's no longer referenced.
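A minimal sketch of that wrapper idea, assuming a reference-counting design: the provider hands out leases instead of the raw object, and the underlying resource is disposed only once the provider has evicted it *and* every lease has been disposed. The names (`SharedResource`, `Lease`, `FlagResource`) are hypothetical, and a plain `IDisposable` stands in for `Bitmap` so the sketch is self-contained:

```csharp
using System;
using System.Threading;

// Provider-side wrapper: the count starts at 1 for the provider's own
// reference (the slot in the ring queue). Each consumer Acquire() adds one.
sealed class SharedResource<T> where T : IDisposable
{
    private readonly T _resource;
    private int _refCount = 1;   // the provider's own reference

    public SharedResource(T resource) => _resource = resource;

    // Called by a consumer (B or C) to check the resource out.
    // (A real implementation would reject Acquire after disposal.)
    public Lease Acquire()
    {
        Interlocked.Increment(ref _refCount);
        return new Lease(this);
    }

    // Called by the provider when the item falls out of the ring queue.
    public void Release()
    {
        if (Interlocked.Decrement(ref _refCount) == 0)
            _resource.Dispose();   // last holder disposes
    }

    public sealed class Lease : IDisposable
    {
        private SharedResource<T> _owner;
        internal Lease(SharedResource<T> owner) => _owner = owner;
        public T Value => _owner._resource;

        public void Dispose()
        {
            // Swap the owner out atomically so double-dispose is harmless.
            var owner = Interlocked.Exchange(ref _owner, null);
            owner?.Release();
        }
    }
}

// Tiny stand-in resource used to demonstrate the lifetime behaviour.
sealed class FlagResource : IDisposable
{
    public bool Disposed;
    public void Dispose() => Disposed = true;
}
```

With this, DisposeIfNowOrphan() from the question is just the decrement-to-zero check inside Release()/Lease.Dispose(): the resource dies exactly when the last of A, B, C lets go, regardless of order.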

Sam
Reference-counting is the right idea. WeakReference is not, though: that's a backup for code that forgets to call Dispose, which keeps the Bitmap undisposed too long and creates the exact problem the OP is trying to solve. Ref-counting *requires* a Dispose call; detecting a missing one should generate an exception.
Hans Passant
@nobugz, I agree, the WeakReference is a stopgap. There is no way to **require** the caller to call Dispose though. Logging in the finalizer would be good to have, but I wouldn't ignore the WeakReference just 'cause it shouldn't be needed. Relying on Dispose alone could end up with a leak, which the WeakReference would avoid (which is good in case the leak is not caught in dev/testing but only surfaces in production).
Sam
Well, there is: "I'll bomb the program if you don't do it". Ultimately this is the weakness of reference counting schemes, it is really better to just not do this.
Hans Passant
@nobugz, always good to have different opinions.
Sam
A: 

Bitmap inherits from Image, which implements IDisposable, so when you're done using an instance, you should call Dispose() on it. This will clean up the unmanaged resource in Image.

However, Image also implements a finalizer, so if for some reason you cannot call Dispose(), the resource will be reclaimed during finalization of the instance, which will happen at some point after the instance is no longer referenced.
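The deterministic path plus finalizer fallback described above is the standard .NET dispose pattern. A minimal sketch of it, with a hypothetical `NativeImageHandle` class and a fake handle in place of the real GDI+ resource (`Image`/`Bitmap` follow this same shape internally):

```csharp
using System;

// Dispose() releases the unmanaged resource deterministically; the
// finalizer is the fallback if a caller forgets to call it.
class NativeImageHandle : IDisposable
{
    private IntPtr _handle = new IntPtr(1);   // pretend unmanaged handle
    public bool Released { get; private set; }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);   // finalizer no longer needed
    }

    ~NativeImageHandle() => Dispose(false);   // fallback path

    private void Dispose(bool disposing)
    {
        if (_handle != IntPtr.Zero)
        {
            // Release the unmanaged resource on either path.
            _handle = IntPtr.Zero;
            Released = true;
        }
    }
}
```

The catch, as the comments below note, is that the finalizer path is non-deterministic: the memory is reclaimed "at some point", which is exactly what the OP wants to avoid.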

Brian Rasmussen
I think the question was how to know when to call Dispose() when you have multiple consumers sharing the same bitmap.
Hightechrider
Yes, I'm trying to dispose of the bitmap as soon as it stops being in use by its consumers.
Jelly Amma
@Jelly: So I figured, but I also got the impression that you couldn't determine when the object was no longer needed and thus would not be able to call Dispose(). If that is the case, the finalizer will act as a fall back when the instance is no longer referenced.
Brian Rasmussen
+1  A: 

If memory usage isn't such a big issue and correctness and clarity are more important ...

Give each recipient their own copy of the bitmap and have a using() statement around the code that uses it.

Your management code is now very easy, and your consumption code is also very easy. It's also easy to see (even prove) that the whole thing works when your consumers throw exceptions or take code paths that would otherwise make it hard, or impossible, to be sure reference counters were decremented.

With the time you've saved by not developing your own GC scheme for shared bitmaps, buy another stick of RAM for your server.
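A sketch of the copy-per-consumer approach: the provider clones the image for each recipient, so each consumer owns (and disposes) its own copy and the provider's lifetime bookkeeping disappears. `Provider`, `Consumer`, and `FakeBitmap` are hypothetical names, and `FakeBitmap` stands in for `System.Drawing.Bitmap` (whose `Clone()` would do the real deep copy):

```csharp
using System;

// Stand-in for Bitmap: cloneable and disposable.
class FakeBitmap : IDisposable
{
    public bool Disposed { get; private set; }
    public FakeBitmap Clone() => new FakeBitmap();   // deep-copy stand-in
    public void Dispose() => Disposed = true;
}

class Provider
{
    private readonly FakeBitmap _master = new FakeBitmap();

    // Each checkout returns an independent copy; the provider's own
    // ring-queue eviction no longer affects any consumer.
    public FakeBitmap GetCopy() => _master.Clone();
}

class Consumer
{
    public static void Work(Provider provider)
    {
        // using guarantees Dispose even if the body throws.
        using (var copy = provider.GetCopy())
        {
            // ... draw on / read the copy ...
        }
    }
}
```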

Hightechrider
Unfortunately memory is a main issue, as there will be many instances of this image provider class A, and I need to trim the memory consumption to avoid relying on the GC and getting performance spikes. Sorry, I should have explained that this is for real time, not a server app.
Jelly Amma
It's not necessarily the case that this approach is 'worse' from a 'time spent in GC' perspective. If these cloned objects have short lifetimes, such that they are collected during a gen-0 GC, then the cost of collecting them will be small. In any case, since they will likely be on the LOH, they aren't going to be moved around in memory after allocation.
Hightechrider
A: 

If, on the other hand, peak memory consumption is the key issue, but you still want a 'safe' approach where you can be sure bitmap lifetimes are managed properly regardless of the consumer code, you could invert the problem: make the producer solely responsible for all operations on the images, in its own thread (or threads). Instead of handing out images to other classes to work on, you ask the other classes to hand in Actions to carry out on images. You maintain a queue of pending actions and can look ahead in the queue to decide which images to toss from the buffer, based on their having no future work to do.

Since these images will likely be on the large object heap it's important to manage their lifetimes appropriately to minimize fragmentation of the large object heap.
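The inverted design might be sketched like this (all names hypothetical, a plain `IDisposable` standing in for `Bitmap`, and shown single-threaded for clarity — the real producer would drain the queue on its own thread):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Stand-in image whose disposal we can observe.
sealed class FlagImage : IDisposable
{
    public bool Disposed;
    public void Dispose() => Disposed = true;
}

// Consumers submit actions; the producer runs them and evicts an image
// as soon as no queued work refers to it.
sealed class ImageProducer<T> where T : IDisposable
{
    private readonly Dictionary<int, T> _images = new Dictionary<int, T>();
    private readonly Queue<(int Id, Action<T> Work)> _pending
        = new Queue<(int Id, Action<T> Work)>();

    public void AddImage(int id, T image) => _images[id] = image;

    // Consumers hand in work instead of checking the image out.
    public void Submit(int id, Action<T> work) => _pending.Enqueue((id, work));

    public void ProcessAll()
    {
        while (_pending.Count > 0)
        {
            var (id, work) = _pending.Dequeue();
            if (_images.TryGetValue(id, out var image))
                work(image);

            // Look ahead: no remaining action mentions this image,
            // so it's safe to toss it from the buffer.
            if (_images.ContainsKey(id) && !_pending.Any(p => p.Id == id))
            {
                _images[id].Dispose();
                _images.Remove(id);
            }
        }
    }
}
```

The lifetime question then never leaves the producer: consumers can't hold a reference past the end of their action, so eviction is always safe.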

Hightechrider