Hi guys,

I'm going to use a SoftReference-based cache (a pretty simple thing in itself). However, I've come across a problem when writing a test for it.

The objective of the test is to check that the cache requests a previously cached object from the server again after a memory cleanup occurs.

The problem is how to make the system release the softly referenced objects. Calling System.gc() is not enough, because soft references will not be released until memory is low. I'm running this unit test on a PC, so the VM's memory budget could be pretty large.
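For context, here is a minimal sketch of the kind of cache I mean; the names and the loader function are illustrative, not my actual code:

```java
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

/* Illustrative SoftReference-based cache: the loader stands in for the
 * server call that the test must observe happening a second time. */
class SoftCache<K, V> {
    private final Map<K, SoftReference<V>> map = new HashMap<>();
    private final Function<K, V> loader;

    SoftCache(Function<K, V> loader) {
        this.loader = loader;
    }

    synchronized V get(K key) {
        SoftReference<V> ref = map.get(key);
        V value = (ref == null) ? null : ref.get();
        if (value == null) {              // never cached, or cleared by the GC
            value = loader.apply(key);    // "request it from the server" again
            map.put(key, new SoftReference<>(value));
        }
        return value;
    }
}
```

The test then has to force the GC to clear the inner SoftReferences and verify that the loader gets invoked a second time.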

================== Added later ==============================

Thank you all who took care to answer!

Thanks to all who took the time to answer! After considering all the pros and cons, I've decided to go the brute-force way, as advised by nanda and jarnbjo. It turned out, however, that the JVM is not that dumb: it won't even attempt garbage collection if you ask for a single block that alone is bigger than the VM's memory budget. So I've modified the code like this:

    /* Force releasing SoftReferences */
    try {
        final List<long[]> memhog = new LinkedList<long[]>();
        while(true) {
            memhog.add(new long[102400]);
        }
    }
    catch(final OutOfMemoryError e) {
        /* At this point all SoftReferences have been released - GUARANTEED. */
    }

    /* continue the test here */
A: 

You could explicitly clear the soft reference in your test (or set it to null), and thus simulate that it has been released.

This avoids any complicated test setup that is memory- and garbage-collection-dependent.
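For instance, SoftReference already has a clear() method for exactly this kind of deterministic simulation. One caveat: unlike the GC, clear() does not enqueue the reference on a ReferenceQueue, which is fine as long as the cache only checks get() == null:

```java
import java.lang.ref.SoftReference;

public class ClearDemo {
    public static void main(String[] args) {
        SoftReference<Object> ref = new SoftReference<>(new Object());

        // Deterministically simulate the collector clearing the referent,
        // without any memory pressure or GC configuration.
        ref.clear();

        System.out.println(ref.get() == null); // prints "true"
    }
}
```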

Ruben
A: 
  1. Set the -Xmx parameter to a very small value.
  2. Prepare your soft reference.
  3. Create as many objects as possible. Ask for the object every time, until the cache has to request it from the server again.

This is my small test. Modify it as you need.

@Test
public void testSoftReference() throws Exception {
    Set<Object[]> s = new HashSet<Object[]>();

    SoftReference<Object> sr = new SoftReference<Object>(new Object());

    int i = 0;

    while (true) {
        try {
            s.add(new Object[1000]);
        } catch (OutOfMemoryError e) {
            // ignore
        }
        if (sr.get() == null) {
            System.out.println("Soft reference is cleared. Success!");
            break;
        }
        i++;
        System.out.println("Soft reference is not yet cleared. Iteration " + i);
    }
}
nanda
Is it possible to set -Xmx from within the test? I don't have any configuration server for running tests...
JBM
In theory -Xmx is not really needed. I suggested it so that your test doesn't run for a long time.
nanda
A: 

Instead of a long-running loop (as suggested by nanda), it's probably faster and easier to simply create a huge primitive array to allocate more memory than is available to the VM, then catch and ignore the OutOfMemoryError:

    try {
        long[] foo = new long[Integer.MAX_VALUE];
    }
    catch(OutOfMemoryError e) {
        // ignore
    }

This will clear all weak and soft references, unless your VM has more than 16GB heap available.

jarnbjo
That doesn't work: Integer.MAX_VALUE is above the maximum array size the VM supports, so the allocation is rejected outright without triggering a garbage collection.
nanda
+3  A: 

This piece of code forces the JVM to flush all SoftReferences, and it's very fast.

It works better than the Integer.MAX_VALUE approach, since here the JVM actually tries to allocate that much memory.

try {
    Object[] ignored = new Object[(int) Runtime.getRuntime().maxMemory()];
} catch (Throwable e) {
    // Ignore OME
}

I now use this bit of code everywhere I need to unit test code using SoftReferences.

Update: This approach will indeed work only with less than 2G of max memory.

Also, one needs to be very careful with SoftReferences. It's so easy to keep a hard reference by mistake, which negates the effect of the SoftReference.
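A minimal illustration of that pitfall (the names are made up for the example):

```java
import java.lang.ref.SoftReference;

public class HardRefPitfall {
    static SoftReference<byte[]> buildReference() {
        byte[] value = new byte[1024];
        SoftReference<byte[]> ref = new SoftReference<>(value);
        // As long as 'value' (or any other field, local or collection entry)
        // still points at the referent, the GC is not allowed to clear 'ref',
        // no matter how much memory pressure there is.
        value = null; // drop the strong reference before forcing an OOME
        return ref;
    }

    public static void main(String[] args) {
        SoftReference<byte[]> ref = buildReference();
        System.out.println("referent present: " + (ref.get() != null));
    }
}
```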

Here is a simple test that shows it working every time on OS X. I'd be interested to know whether the JVM's behavior is the same on Linux and Windows.


for (int i = 0; i < 1000; i++) {
    SoftReference<Object> softReference = new SoftReference<Object>(new Object());
    if (null == softReference.get()) {
        throw new IllegalStateException("Reference should NOT be null");
    }

    try {
        Object[] ignored = new Object[(int) Runtime.getRuntime().maxMemory()];
    } catch (Throwable e) {
        // Ignore OME
    }

    if (null != softReference.get()) {
        throw new IllegalStateException("Reference should be null");
    }

    System.out.println("It worked!");
}
David Gageot
That's of course not foolproof either, if Runtime.getRuntime().maxMemory() returns a value outside the positive range of an int. E.g. if maxMemory() returns a value between 2GB and 4GB, the cast results in an attempt to create an array with a negative size.
jarnbjo
This doesn't work - at least because maxMemory() returns an amount of **bytes**, but you're trying to allocate that many **Objects**. I've tried it in my test; though the OOME is thrown, the garbage collection does not occur prior to that. But this approach will be more efficient than loop-allocating small arrays if you allocate bytes instead of Objects and calculate the proper size from the maxMemory value.
JBM
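A sketch of that byte-based variant, assuming chunks sized from maxMemory() (the divisor and class name are arbitrary choices for the example):

```java
import java.util.ArrayList;
import java.util.List;

public class ForceSoftRefClear {
    /* Allocate byte arrays (not Object arrays) until the heap is exhausted.
     * Per the java.lang.ref contract, all softly reachable objects are
     * guaranteed to be cleared before the VM throws OutOfMemoryError. */
    public static void forceClear() {
        try {
            List<byte[]> hog = new ArrayList<>();
            long chunk = Math.min(Runtime.getRuntime().maxMemory() / 64,
                                  Integer.MAX_VALUE - 8);
            while (true) {
                hog.add(new byte[(int) chunk]);
            }
        } catch (OutOfMemoryError expected) {
            // heap exhausted; SoftReferences have been cleared
        }
    }
}
```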
Hmm, after trying this a few times I get inconsistent results - sometimes it works and sometimes it doesn't... Maybe http://stackoverflow.com/questions/457229/how-to-cause-soft-references-to-be-cleared-in-java/458224#458224 could explain why. If that's true, then it's better to use the loop-based allocation.
JBM