views: 82
answers: 5

Hey I'm testing out Runtime.freeMemory() with this code:

 Runtime runtime = Runtime.getRuntime();
 long freeMemory = runtime.freeMemory();

 // An arbitrary factor; any value <= 1 should work
 double factor = 0.8;

 int size = (int) (factor * freeMemory);
 byte[] testArray = new byte[size];

I'm creating a byte array with a size close to the freeMemory value. For some reason, especially when I limit the program's memory to around 8MB, the code throws an OutOfMemoryError for any factor > 0.55. This really doesn't make sense; surely freeMemory means free memory. I expect it to be a little off, but not double what is actually free.

Any suggestions on what's going on? Thanks

(Note: in my tests I'm limiting the memory available to the program to 8MB or 16MB, using -Xmx8M etc.)

A: 

Try garbage collecting before creating the test array, with runtime.gc().

If you aren't creating a brand new JVM each time, you could be getting bitten by different start states.
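
For example, a minimal sketch of that idea: request a collection immediately before sampling, so freeMemory() reflects a freshly collected heap rather than whatever garbage happens to be lying around:

    Runtime runtime = Runtime.getRuntime();
    runtime.gc();                            // hint to the JVM to collect; not guaranteed
    long freeMemory = runtime.freeMemory();  // sampled after the requested collection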


Mine works for values > 1; 1.25, for example. Then I get a 'heap space' exception.


Here, perhaps you want 'maxMemory()' instead.

public class Mem2 {
    public static void main(String[] args) {
        Runtime runtime = Runtime.getRuntime();
        runtime.gc();

        long freeMemory = runtime.freeMemory();

        // A factor deliberately greater than 1, to show the allocation can still succeed
        double factor = 1.29;

        int size = (int) (factor * freeMemory);
        System.out.println("      freememory is " + freeMemory);
        System.out.println("            size is " + size);
        System.out.println("the total memory is " + runtime.totalMemory());
        System.out.println("  the max memory is " + runtime.maxMemory());
        byte[] testArray = new byte[size];
    }
}

Output:

      freememory is 84466864
            size is 108962254
the total memory is 85000192
  the max memory is 129957888

Process finished with exit code 0

So there seems to be about 20M I can't account for.

I think totalMemory() is the amount of memory currently allocated by the JVM, freeMemory() is how much of that allocated memory is still unused, and maxMemory() is the hard limit.
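
On that reading, the memory the JVM could still hand out is roughly the unused part of the currently allocated heap plus the room the heap can still grow into. A small sketch (the variable name is just for illustration):

    Runtime runtime = Runtime.getRuntime();
    long roughlyAvailable = runtime.freeMemory()
            + (runtime.maxMemory() - runtime.totalMemory());
    System.out.println("roughly available: " + roughlyAvailable);

With the output above that comes to about 84.5M + (130M - 85M) ≈ 129M, so the 20M or so that the 109M allocation needed beyond freeMemory() came from growing the heap towards maxMemory().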


You can see the interplay between totalMemory() and freeMemory() with this variant on your code.

public class Mem3 {

    public static void main(String[] args) {
        Runtime runtime = Runtime.getRuntime();

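        // Allocate progressively larger arrays until the heap is exhausted;
        // the loop is expected to end with an OutOfMemoryError.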
        for (int i = 0; true; i++) {
            runtime.gc();
            int size = i * 10000000;
            System.out.println("                 i is " + i);
            System.out.println("              size is " + size);
            System.out.println("b       freememory is " + runtime.freeMemory());
            System.out.println("b the total memory is " + runtime.totalMemory());
            System.out.println("b   the max memory is " + runtime.maxMemory());
            byte[] testArray = new byte[size];
            System.out.println("                array " + testArray.length);
            System.out.println("a       freememory is " + runtime.freeMemory());
            System.out.println("a the total memory is " + runtime.totalMemory());
            System.out.println("a   the max memory is " + runtime.maxMemory());
            System.out.println(" ");
        }

    }
}

If you run that and look at the before and after values, you can see what's going on. Note what happens between iterations 6 and 7:

                 i is 6
              size is 60000000
b       freememory is 84300496
b the total memory is 85000192
b   the max memory is 129957888
                array 60000000
a       freememory is 24300472
a the total memory is 85000192
a   the max memory is 129957888

                 i is 7
              size is 70000000
b       freememory is 84300496
b the total memory is 85000192
b   the max memory is 129957888
                array 70000000
a       freememory is 59258168
a the total memory is 129957888
a   the max memory is 129957888

We see in iteration 6 that, after the allocation of 60M, there are about 24M remaining. In iteration 7, however, we've exceeded a threshold: the heap has been grown (note that totalMemory now equals maxMemory), and freeMemory afterwards is just under 60M, roughly the 130M maximum minus the 70M array.

Tony Ennis
I also get a factor of around 1.3 when I test it normally, with the default 128M of memory allocated. I'm getting problems when I test it at around 8MB or 16MB, using -Xmx8M etc. It seems quite random: at low memory levels it won't allow me to make an array as large as free memory, but at higher memory levels you can go over it.
Ricky Jones
The reason you can go over the free memory is that your allocation triggers a garbage collection, which can substantially increase the amount of memory available. This would be particularly noticeable with more memory, as the GC would leave more junk lying around for longer.
CurtainDog
+1  A: 

Actually, your free memory is divided into two generations (see http://java.sun.com/docs/hotspot/gc1.4.2/): the "young" generation and the "tenured" generation. If you run with the -verbose:gc -XX:+PrintGCDetails VM settings, you can see how much each generation takes. I found out that I can fill my tenured generation completely, but not more than that.
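
If you prefer to inspect this from code rather than from the GC log, a sketch along these lines (using the standard java.lang.management API; the class name is just for illustration) prints the usage of each heap pool/generation:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryType;
import java.lang.management.MemoryUsage;

public class HeapPools {
    public static void main(String[] args) {
        // Print each heap pool (eden, survivor, tenured/old, ...) with its usage.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            if (pool.getType() == MemoryType.HEAP) {
                MemoryUsage usage = pool.getUsage();
                System.out.println(pool.getName()
                        + ": used=" + usage.getUsed()
                        + ", committed=" + usage.getCommitted()
                        + ", max=" + usage.getMax());
            }
        }
    }
}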

Hila
That's right: the memory model is complex and the exact behaviour will depend on how the JVM is configured.
CurtainDog
A: 

The best way of getting a grip on this would be to run your code through a profiler that can show you how the memory in your system is being allocated over time. If you're using Eclipse, make sure you've got TPTP installed; I'm sure the other big IDEs have the same feature somewhere.

CurtainDog
A: 

I think this is to do with the sizes of the heap's partitions. When the JVM starts, it splits the available heap memory into a number of "spaces"; e.g. there is an "eden" space for newly created objects, "survivor" spaces for objects that live through their first GC cycles, a space for tenured (long-lived) objects, and so on. The actual sizes of the spaces are tunable (via JVM options), but the chances are that the "new objects" space is considerably less than 8MB.

When you then try to allocate your array containing 55% of the reported free memory, the memory manager needs to find that amount of contiguous memory in the "new objects" space. If you are getting an OOME, that is because the actual partitioning is such that the required amount of contiguous memory is not available ... even after the GC has run.

Basically, you are trying to run the JVM with too small a heap for what you are trying to do. As a general principle, it is a bad idea to be miserly with Java's heap size. Your application will run faster (and with fewer problems) if you give it plenty of memory.
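
If you want to see how much of the reported free memory is actually usable for a single allocation, a rough probe like the following (just a sketch; it deliberately catches OutOfMemoryError, which you wouldn't normally do in real code) halves the request until one contiguous array fits:

public class LargestArrayProbe {
    public static void main(String[] args) {
        Runtime runtime = Runtime.getRuntime();
        runtime.gc();
        long reportedFree = runtime.freeMemory();
        long request = Math.min(reportedFree, Integer.MAX_VALUE);
        while (request > 0) {
            try {
                // Try to allocate one contiguous array of the requested size.
                byte[] block = new byte[(int) request];
                System.out.println("Largest single array: " + block.length
                        + " bytes, of " + reportedFree + " reported free");
                break;
            } catch (OutOfMemoryError e) {
                request /= 2;  // too big for one contiguous chunk; try half
            }
        }
    }
}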

Stephen C
A: 

This answer is based on a comment by spong on the main question.

If, rather than trying to create the array in one chunk, you split its creation into many smaller arrays, you can fill the free memory up to a factor of around 0.98.

     Runtime runtime = Runtime.getRuntime();
     long freeMemory = runtime.freeMemory();

     // An arbitrary factor; any value <= 1 should work
     double factor = 0.8;

     int size = (int) (factor * freeMemory);


     int slice = 100;
     byte[][] testArrays = new byte[slice][];
     for (int i = 1; i <= slice; i++) {
            testArrays[i-1] = new byte[size / slice];
            System.out.println("Allocated: " + i * size / slice);
     }

     System.out.println("Created! "+testArrays.length);
Ricky Jones