
What is the optimal free-heap to total-heap ratio? At what values of this ratio should I consider increasing or decreasing the heap size?

+2  A: 

In terms of the question as asked, the ideal momentary free-to-total ratio is 0: your JVM would consume exactly the memory it requires, no more and no less. That's a very hard target to reach ;)

The problem (as TNilsson points out) is that your application's memory requirements change over time as it does work, so you want enough headroom that collection/compaction doesn't happen more often than you can tolerate, and you want the heap small enough that you don't have to buy more RAM.
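
If you want to see where the ratio actually sits at runtime, here's a minimal sketch using the standard Runtime API (the class name and the 5-second sampling interval are arbitrary assumptions on my part):

    import java.util.concurrent.TimeUnit;

    public class HeapRatioSampler {
        public static void main(String[] args) throws InterruptedException {
            Runtime rt = Runtime.getRuntime();
            while (true) {
                long total = rt.totalMemory();  // heap the JVM has currently reserved
                long free  = rt.freeMemory();   // unused portion of that reservation
                System.out.printf("free/total = %.2f (%d MB free of %d MB)%n",
                        (double) free / total, free >> 20, total >> 20);
                TimeUnit.SECONDS.sleep(5);      // arbitrary sampling interval
            }
        }
    }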

DDaviesBrackett
Having heap size = live-set size, or close to it, would lead to constant garbage collections and compactions. Trying to achieve that ideal would lead to terrible performance.
Tnilsson
that's a really good point. I'll edit to account for it.
DDaviesBrackett
+1  A: 

This probably depends on the rate at which you allocate new objects. Garbage collection involves a lot of work tracing references from live objects. I have just been dealing with a situation where there was plenty of free memory (say 500 MB used, 500 MB free) but so much array allocation was happening that the JVM would spend 95% of its time doing GC. So don't forget about runtime memory behaviour.
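
If you want to quantify that, the GarbageCollectorMXBean API reports cumulative collection time; this is a minimal sketch (the allocation loop is just an illustrative way to generate churn, not the code from my situation):

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;

    public class GcOverhead {
        public static void main(String[] args) {
            // Churn through short-lived arrays to generate allocation pressure.
            long sink = 0;
            for (int i = 0; i < 1_000_000; i++) {
                byte[] tmp = new byte[10_000];  // becomes garbage immediately
                sink += tmp.length;
            }
            long gcMillis = 0;
            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                gcMillis += gc.getCollectionTime();  // cumulative GC time in ms
            }
            System.out.println("Allocated ~" + (sink >> 20) + " MB; reported GC time: "
                    + gcMillis + " ms");
        }
    }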

All those performance tuning articles that say something like "object allocation is really fast in Java" without mentioning that some allocations cause 1 second of GC time make me laugh.

David Plumpton
Why was garbage collection being done when there was so much free memory?
Because new garbage is constantly being allocated, and after a few seconds/minutes you run out. It's okay if it's minutes. It's bad if you run out in seconds. It's terrible if it's milliseconds.
David Plumpton
Heh. Yeah, object allocation is really fast. Making sure there is room for the object allocation is what can take time.
Tnilsson
+1  A: 

There is no single easy answer; let me give you two examples:

Example 1 - Your program allocates 100M worth of memory at startup, and then does not allocate any memory whatsoever for the rest of its run.

In this case, you clearly want to have a heap size of 100M (Well, perhaps 101 or something, but you get the point...) to avoid wasting space.
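
A toy version of that pattern might look like this (the class name, figures and flags are assumptions, not a recipe):

    public class StaticWorkingSet {
        // Run with e.g. java -Xms110m -Xmx110m StaticWorkingSet so the heap
        // matches the live set plus a little headroom (figures are assumptions).
        private static final byte[] WORKING_SET = new byte[100 * 1024 * 1024];

        public static void main(String[] args) {
            // The rest of the run reads WORKING_SET but allocates nothing new.
            System.out.println("Holding " + (WORKING_SET.length >> 20) + " MB for the whole run");
        }
    }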

Example 2 - Your program allocates 10M of memory per second, and none of the data lives longer than 1 second (e.g. you are doing a calculation that requires a lot of temporary data and will return a single integer when you are done...).

Knowing the exact numbers is perhaps not so realistic, but it's an example, so bear with me... Since you have 10M of "live" data, you will need at least a 10M heap. Beyond that, you need to check how your garbage collector works. Simplified, the time a GC takes to complete is O(live set); the amount of "dead" data does not really enter into it. With a constant live-set size, your GC time is constant no matter the heap size, but a larger heap means collections happen less often. This leads to: larger heap -> better throughput.
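
A toy version of that workload (buffer sizes and iteration counts are assumptions): each iteration's scratch array dies immediately, so the live set stays tiny while allocation pressure stays high.

    public class HighChurnCalc {
        public static void main(String[] args) {
            long checksum = 0;
            for (int i = 0; i < 10_000; i++) {
                int[] scratch = new int[1 << 18];  // ~1 MB of temporary data
                for (int j = 0; j < scratch.length; j++) {
                    scratch[j] = i ^ j;
                }
                checksum += scratch[scratch.length - 1];
                // scratch is garbage from here on; only 'checksum' stays live
            }
            System.out.println(checksum);  // a single value survives the whole run
        }
    }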

(Now, to really mess things up, add stuff like compaction of the heap and the picture becomes even less clear...)

Conclusion: This is a simplified version of the matter, but the short answer is - it depends.

Tnilsson