I recently came across a piece of Java code using WeakReferences - I had never seen them deployed, although I had come across them when they were introduced. Is this something that should be used routinely, or only when one runs into memory problems? If the latter, can they be easily retrofitted, or does the code need serious refactoring? Can the average Java (or C#) programmer generally ignore them?

EDIT: Can any damage be done by over-enthusiastic use of WRs?

+10  A: 

You use them whenever you want to hold a reference to an object without keeping that object alive yourself. This applies to many caching-like features, but weak references also play an important role in event handling, where a subscriber should not be kept alive merely by being subscribed to an event.

A small example: a timer event which refreshes some data. Any number of objects can subscribe to the timer in order to get notified, but the mere fact that they subscribed to the timer should not keep them alive. So the timer should hold weak references to the objects, as in the sketch below.
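A minimal sketch of that idea (the `RefreshTimer`/`RefreshListener` names are made up for illustration): the timer stores `WeakReference`s to its subscribers, notifies those that are still alive, and prunes any whose referent has already been collected, so it never keeps a subscriber alive on its own.

```java
import java.lang.ref.WeakReference;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Hypothetical listener interface for illustration.
interface RefreshListener {
    void onRefresh();
}

// Timer-like publisher that holds its subscribers only weakly.
class RefreshTimer {
    private final List<WeakReference<RefreshListener>> listeners = new ArrayList<>();

    void subscribe(RefreshListener listener) {
        listeners.add(new WeakReference<>(listener));
    }

    // Called periodically; notifies live subscribers and prunes dead references.
    void fire() {
        Iterator<WeakReference<RefreshListener>> it = listeners.iterator();
        while (it.hasNext()) {
            RefreshListener l = it.next().get();
            if (l == null) {
                it.remove();       // subscriber was garbage collected
            } else {
                l.onRefresh();     // subscriber is still alive
            }
        }
    }
}
```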

Lucero
Using weak references (notably through `WeakHashMap`) can go horribly wrong. For instance, the VM may notice that it can clear the references almost immediately after they are created. This does happen, although usually after the application has been running a while.
Tom Hawtin - tackline
tackline - I'm not sure about 'horribly wrong' - certainly if they are used when they shouldn't be, then the app will fail - but that's a design issue, not anything to do with weak references per se.
Kevin Day
I'm not saying that you should reference an object only through a weak reference, but rather that you can use them to keep references to objects where it isn't your responsibility to keep those objects alive. Usually, such objects will be part of a strongly referenced object graph, but when that graph goes out of scope, the weak reference shouldn't prevent the GC from collecting them. It's always the same: use the appropriate tools where they fit.
Lucero
+6  A: 

Weak references are all about garbage collection. A standard object will not "disappear" until all references to it are severed; this means all the references your various objects hold to it must be removed before garbage collection will consider it garbage.

With a weak reference, the fact that your object is referenced by other objects doesn't necessarily mean it's not garbage. It can still get picked up by the GC and be removed from memory.

An example: if I have a bunch of Foo objects in my application, I might want to use a Set to keep a central record of all the Foos I have around. But when other parts of my application remove a Foo object by deleting all references to it, I don't want the remaining reference my Set holds to keep that object from being garbage collected! Really, I just want it to disappear from my set. This is where you'd use something like a weak set (Java has a WeakHashMap), which holds weak references to its members instead of "strong" references - see the sketch below.
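A minimal sketch of that pattern (the `Foo` class and registry are made up here): Java has no dedicated weak set class, but you can back a `Set` with a `WeakHashMap` via `Collections.newSetFromMap`, so entries vanish once nothing else strongly references them.

```java
import java.util.Collections;
import java.util.Set;
import java.util.WeakHashMap;

class Foo {
    // some application object
}

class FooRegistry {
    // A "weak set": elements are keys of a WeakHashMap, so an entry can
    // disappear once no strong references to that Foo remain elsewhere.
    private final Set<Foo> allFoos =
            Collections.newSetFromMap(new WeakHashMap<Foo, Boolean>());

    void register(Foo foo) {
        allFoos.add(foo);
    }

    int liveCount() {
        return allFoos.size();   // roughly: Foos not yet garbage collected
    }
}
```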

If your objects aren't being garbage collected when you want them to be, then you've made an error in your bookkeeping: something is still holding a reference that you forgot to remove. Using weak references can ease the pain of such bookkeeping, since you don't have to worry about them keeping an object "alive" and un-garbage-collected, but you don't have to use them.

Matt Baker
+4  A: 

Can any damage be done by over-enthusiastic use of WRs?

Yes, it can.

One concern is that weak references make your code more complicated and potentially error prone. Any code that uses a weak reference needs to deal with the possibility that the reference has been cleared each time it is used. If you over-use weak references, you end up writing lots of extra code. (You can mitigate this by hiding each weak reference behind a method that takes care of the checking and re-creates the discarded object on demand, as in the sketch below. But it may not be as simple as that; e.g. if the re-creation process involves network access, you need to cope with the possibility of re-creation failure.)
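A minimal sketch of the "hide the weak reference behind a method" idea (the `WeakCachedValue` name and the loader are made up for illustration): callers go through `get()`, which checks whether the weakly held object has been collected and re-creates it on demand via a supplied loader.

```java
import java.lang.ref.WeakReference;
import java.util.function.Supplier;

// Wraps a weak reference and re-creates the value when the GC has cleared it.
class WeakCachedValue<T> {
    private final Supplier<T> loader;          // how to rebuild the value
    private WeakReference<T> ref = new WeakReference<>(null);

    WeakCachedValue(Supplier<T> loader) {
        this.loader = loader;
    }

    synchronized T get() {
        T value = ref.get();
        if (value == null) {                   // collected (or never created)
            value = loader.get();              // may be expensive; may fail
            ref = new WeakReference<>(value);
        }
        return value;
    }
}

// Usage: callers never touch the WeakReference directly.
// WeakCachedValue<byte[]> cache = new WeakCachedValue<>(() -> loadBigBuffer());
// byte[] data = cache.get();
```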

A second concern is that there are runtime overheads to using weak references. The obvious costs are those of creating weak references and calling `get` on them. A less obvious cost is that the GC has to do significant extra work each time it runs.

A final concern is that if you use a weak reference for something that your application is highly likely to need in the future, you may incur the cost of repeatedly recreating it. If this cost is high (in terms of CPU time, IO bandwidth, network traffic, whatever), your application may perform badly as a result. You may be better off giving the JVM more memory and not using weak references at all.

Of course, this does not mean you should avoid using weak references entirely; just that you need to think carefully. And you should probably first run a memory profiler on your application to figure out where your memory usage problems stem from.

Stephen C