I know there is a WeakHashMap in java.util, but since it holds its keys via WeakReferences, any entry whose key is only referenced by the map gets lost on the next GC cycle. That makes it nearly useless for caching data that is likely to be requested again but is not hard-referenced the rest of the time. The better solution would be a map that uses SoftReferences instead, but I didn't find one in the Java runtime library.
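
To make the complaint concrete, here is a minimal sketch (System.gc() is only a hint to the JVM, but it usually clears weak references in a toy example like this):

import java.util.Map;
import java.util.WeakHashMap;

public class WeakHashMapDemo {
    public static void main(String[] args) throws InterruptedException {
        Map<Object, String> cache = new WeakHashMap<>();
        // The key is referenced only by the map itself.
        cache.put(new Object(), "expensively computed value");

        System.gc();       // advisory, but typically honored in a small test
        Thread.sleep(100); // give the collector a moment to clear the reference

        // Usually prints 0: the entry vanished even though memory was plentiful.
        System.out.println("entries left: " + cache.size());
    }
}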

+3  A: 

Not that I know of (Nov. 2008), but you can find some implementations of SoftHashMap on the net.

Like this one: SoftHashMap or this one


Edit (Nov. 2009)
As Matthias mentions in the comments, the Google Collections MapMaker does use SoftReferences:

A ConcurrentMap builder, providing any combination of these features:

  • soft or weak keys,
  • soft or weak values,
  • timed expiration, and
  • on-demand computation of values.
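
As a rough sketch of how that builder is used (API as of Google Collections / early Guava; softValues() was later superseded by CacheBuilder):

import com.google.common.collect.MapMaker;
import java.util.concurrent.ConcurrentMap;

public class MapMakerSoftCacheExample {
    public static void main(String[] args) {
        // Values are wrapped in SoftReferences, so entries survive ordinary GC
        // cycles and are only reclaimed when the JVM is short on memory.
        ConcurrentMap<String, byte[]> cache = new MapMaker()
                .softValues()
                .makeMap();

        cache.put("page:42", new byte[64 * 1024]);

        // May be null if the soft value was cleared under memory pressure.
        byte[] hit = cache.get("page:42");
        System.out.println(hit != null ? "cache hit" : "cache miss");
    }
}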

As mentioned in this thread, another JSR166y candidate:

jsr166y.ConcurrentReferenceHashMap

It provides an alternative concurrent reference map to the Google implementation (which relies on a background thread to evict entries).

VonC
+1  A: 

There is an example implementation in issue 98 of The Java Specialists' Newsletter.

jb
+1  A: 

If you want to implement a cache, soft references are definitely a better idea than weak references, but they put your entire cache-removal policy in the hands of the garbage collector, which is probably not what you want.

If the cache-removal policy is important to you, you are going to need to handle it yourself, most likely using regular references. You will have to decide when to evict items and which ones to evict. If you only want to drop entries when you are running out of heap space, you can query the available heap via:

Runtime.getRuntime().freeMemory();

Then, once free memory drops below a certain threshold, you can start dropping items. Alternatively, you could just enforce a maximum size for the cache and use that to decide when to drop entries.
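
A rough illustration of that trigger (the 10 MB floor and the class name are arbitrary choices for the sketch):

public class MemoryPressureCheck {
    // Arbitrary floor for the sketch; tune it to your application.
    private static final long MIN_FREE_BYTES = 10L * 1024 * 1024;

    static boolean shouldEvict() {
        // freeMemory() reports free space within the heap currently committed
        // to the JVM, which is what the answer above queries.
        return Runtime.getRuntime().freeMemory() < MIN_FREE_BYTES;
    }
}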

Here's an LRU cache I designed with O(1) insertion, deletion, and lookup time and a configurable maximum number of elements. If you want a cache, this is going to be a better solution, IMHO, than a SoftHashMap.
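
The linked implementation isn't reproduced here, but a comparable bounded LRU cache can be sketched on top of java.util.LinkedHashMap's access-order mode (the class name LruCache is just for the sketch):

import java.util.LinkedHashMap;
import java.util.Map;

public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCache(int maxEntries) {
        // accessOrder = true: iteration order follows recency of access,
        // so the eldest entry is always the least recently used one.
        super(16, 0.75f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least recently used entry once the cap is exceeded.
        return size() > maxEntries;
    }
}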

Soft references are a great way to create a growable cache, so the ideal solution is to use a SoftHashMap along with a regular fixed-size cache: have every insert go into both the fixed cache and the soft map, and to look something up, check the soft map (and update the entry's recency in the fixed cache). That way all your most important items (according to your chosen policy: LRU, MFU, ...) are never removed, because they are hard-referenced in the fixed cache, but you also hold on to additional entries (with no policy control) for as long as there is sufficient memory.
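
A sketch of that two-tier arrangement, reusing the LruCache class sketched above for the hard-referenced tier (all names here are illustrative, not a library API):

import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

public class TwoTierCache<K, V> {
    private final LruCache<K, V> hardTier;                              // policy-controlled tier
    private final Map<K, SoftReference<V>> softTier = new HashMap<>();  // memory-controlled tier

    public TwoTierCache(int hardCapacity) {
        this.hardTier = new LruCache<>(hardCapacity);
    }

    public synchronized void put(K key, V value) {
        hardTier.put(key, value);
        softTier.put(key, new SoftReference<>(value));
    }

    public synchronized V get(K key) {
        SoftReference<V> ref = softTier.get(key);
        V value = (ref != null) ? ref.get() : null;
        if (value != null) {
            hardTier.put(key, value); // refresh recency in the hard tier
        } else {
            softTier.remove(key);     // the soft value was collected; drop the stale entry
        }
        return value;
    }
}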

luke
+8  A: 

I am familiar with two libraries that offer a SoftHashMap implementation:

  1. Apache Commons: org.apache.commons.collections.map.ReferenceMap

  2. Google Collections: com.google.common.collect.ReferenceMap
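
For the Apache Commons option, a minimal usage sketch (assuming Commons Collections 3.x, whose API predates generics; hard keys with soft values gives the SoftHashMap behaviour asked about):

import java.util.Map;
import org.apache.commons.collections.map.ReferenceMap;

public class ReferenceMapExample {
    @SuppressWarnings("unchecked") // the 3.x API is not generified
    public static void main(String[] args) {
        // Hard keys, soft values: entries stay cached until the JVM needs memory.
        Map<String, byte[]> cache = new ReferenceMap(ReferenceMap.HARD, ReferenceMap.SOFT);

        cache.put("report", new byte[256 * 1024]);
        System.out.println(cache.get("report") != null ? "cache hit" : "entry was collected");
    }
}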

Gili
ReferenceMap has been scrapped from Google Collections. Use MapMaker now.
Matthias
A: 

Have you considered using an LRUMap instead of a SoftHashMap? You get more control over what gets stored (or at least, how much).
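
Assuming this refers to Commons Collections' org.apache.commons.collections.map.LRUMap (one plausible reading), a minimal sketch:

import java.util.Map;
import org.apache.commons.collections.map.LRUMap;

public class LruMapExample {
    @SuppressWarnings("unchecked") // Commons Collections 3.x is not generified
    public static void main(String[] args) {
        // A fixed-capacity map that silently evicts the least recently
        // used entry once the limit of 100 entries is reached.
        Map<String, String> cache = new LRUMap(100);

        cache.put("answer", "42");
        System.out.println(cache.get("answer"));
    }
}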

Joel