views: 1150
answers: 5

Hi,

What I usually do regarding the JVM heap size is set the max value really high to avoid the infamous OutOfMemoryError.

However, this strategy (or lack thereof) doesn't seem very smart. :-)

My question is how to choose the min and max values, and how large the difference between the two should be (should max - min be small or big?). For instance, from here:

if the initial heap is too small, the Java application startup becomes slow as the JVM is forced to perform garbage collection frequently until the heap has grown to a more reasonable size. For optimal startup performance you should set the initial heap size to the same as the maximum heap size.
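(For reference, I mean the values passed via the -Xms and -Xmx flags, e.g.:

java -Xms256m -Xmx1024m -jar myapp.jar

where the sizes and the jar name are just made-up examples.)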

Thanks.

+1  A: 

The right answer is: there is no right answer. Each project is different, and you will have to fine-tune your heap size configuration on a per-project basis. I would start small and gradually increase the heap size until your application runs as intended.

You are right, setting a huge max value is not a good idea.

Robert Greiner
And is there any relationship between the min and max sizes? Or is the min heap size completely irrelevant?
LB
The min is *kind of* irrelevant, certainly if you're ensuring that there is enough physical memory such that the JVM's demands for maximum heap size can always be accommodated. I'd say set the minimum size to the kind of heap size you expect in normal operation (to avoid it being resized repeatedly at startup), and the max should reflect a "buffer zone" above this.
Andrzej Doyle
I don't try to second-guess the OS/kernel; I have to assume they have a good handle on memory allocation. I've personally never seen a detrimental problem with letting the JVM resize itself, and I quite literally ignore the -Xms parameter almost altogether.
Xepoch
+3  A: 

You should enable GC logging and check to see where your OOM is occurring:

-verbose:gc
-Xloggc:gc.log  
-XX:+PrintGCTimeStamps
-XX:+PrintGCDetails

You may be hitting permanent generation (PermGen) space limits; adjust via -XX:MaxPermSize=YYYm.
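Putting those together, a launch line would look something like this (the jar name and sizes are placeholders, adjust for your app):

java -Xmx1024m -XX:MaxPermSize=256m -verbose:gc -Xloggc:gc.log -XX:+PrintGCTimeStamps -XX:+PrintGCDetails -jar myapp.jar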

Anyway, to answer your question: I start with no minimum and set the maximum relatively high. I then graph the GC log, find out where my steady state is, and visually choose an above-average size for the various generations. Read it like a financial chart: you'll want to see good spread in the new generation and consistent growth and collection in the tenured generation. As mentioned, also graph your perm space to make sure it isn't constantly increasing.

GC tuning is an art, in no way a science.

Xepoch
"GC tuning is an art, in no way a science." Epic, I'm going to quote you...
LB
+1  A: 

If you are experiencing an OOME, I would actually start by increasing the max memory as much as you can and see if that resolves the issue. Let your machine soak up the performance problem first. If the problem persists, then you can look into performance diagnostics to identify bottlenecks and work on the areas where your app might be leaking or hogging the most memory.

Jeff Atwood has a nice article on CodingHorror that explains this attitude; the most cost-effective solution to a performance problem is to throw hardware (or in this case, increased memory resources) at the problem, before investing developer time in troubleshooting:

http://www.codinghorror.com/blog/archives/001198.html

RMorrisey
+1  A: 

Indeed, blindly setting a huge max value is not a good idea (measure, don't guess): this strategy will lead to very long "stop the world" major GCs, which might not be desirable from a user experience point of view (always keep in mind that the bigger the heap, the longer the major GC takes).

That said, there is no generic answer to your question; every application has different needs. I'd suggest profiling your application and tuning the heap to find a good compromise between (major) GC frequency and (major) GC duration while minimizing the response time for the end user. I warmly suggest reading this great blog post (and all the others) from Kirk Pepperdine for further details.

Just to answer the min and max part: I always set them to the same value, for better startup performance and better reproducibility.
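For example (512m is purely illustrative, not a recommendation):

java -Xms512m -Xmx512m -jar myapp.jar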

Pascal Thivent
The problem is that my application's memory footprint depends on the input, so profiling is kind of difficult.
LB
@LB In any given application (same codebase) the usage profile can be vastly different. I have to advise all my customers on JVM tuning in post-go-live situations, as part of a broader profiling and optimization phase, because it changes customer to customer, user to user.
Xepoch
Why? I don't get it. What does the user input look like for 80% of the users?
Pascal Thivent
+4  A: 

My question is how to choose the min and max values, and the difference between the two (should max-min be small or big?)

Short answer: don't guess, profile your application.

jconsole can give you useful high-level data, such as a feel for the main resident set vs. the transient data that we normally allocate and garbage collect. What you'll see if you look at the memory tab of that display is usually something like a sawtooth. The lower corner of the sawteeth is about where I would normally set the heap minimum, whereas I would use the peak or slope of the sawteeth to experiment with a heap maximum. If your sawteeth are very steep, you might consider a big heap just to delay garbage collection. However, if they aren't, you could try a smaller heap maximum to see if that leaves more resources for other processes on your machine.
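If you want a sawtooth to look at, a toy program along these lines will do (everything here is made up purely for illustration): run it, then attach jconsole to its pid.

import java.util.ArrayList;
import java.util.List;

public class SawtoothDemo {
    // long-lived "resident set" that survives every collection
    private static final List<byte[]> resident = new ArrayList<byte[]>();

    public static void main(String[] args) throws InterruptedException {
        for (int i = 0; i < 100; i++) {
            resident.add(new byte[1024 * 1024]); // ~100 MB that never goes away
        }
        while (true) {
            // short-lived garbage: allocated, dropped, collected, hence the sawtooth
            byte[] transientData = new byte[512 * 1024];
            Thread.sleep(10); // slow the loop down so the chart is readable
        }
    }
}

Launch it with a modest maximum (say -Xmx256m) and the teeth are easy to see in the memory tab.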

You should also consider the server VM (-server), as it exhibits different garbage collection behavior.

All that said, you should also use a more detailed tool such as jvisualvm to profile the memory usage of your process. It's possible that you have a memory leak or greedy allocator that you could tune or eliminate. That would completely change your heap needs.

Bob Cross
Thanks for all the references.
LB
@LB, no problem. I literally use those tools every day. Quoting myself: "They don't suck." ;-)
Bob Cross