I have a 6.5GB hprof file that was dumped by a 64-bit JVM using the -XX:+HeapDumpOnOutOfMemoryError option. I have it sitting on a 16GB 64-bit machine, and am trying to get it into jhat, but it keeps running out of memory. I have tried passing in JVM args for minimum settings, but it rejects any minimum, and seems to run out of memory before hitting the maximum.

It seems kind of silly that a JVM running out of memory dumps a heap so large that it can't be loaded on a box with twice as much RAM. Are there any ways of getting this running, or possibly amortizing the analysis?

A: 

What flags are you passing to jhat? Make sure that you're in 64-bit mode and you're setting the heap size large enough.
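For example, jhat accepts -J flags that are passed straight through to the JVM it runs on, so (assuming a Sun/Oracle JDK, and using a placeholder file name) something along these lines should force a 64-bit VM with a large heap:

    jhat -J-d64 -J-Xmx12g heap.hprof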

Kevin
+4  A: 

I would take a look at the Eclipse Memory Analyzer (MAT). This tool is great, and I have looked at heaps several gigabytes in size with it. The nice thing about the tool is that it builds indexes on the dump, so the whole heap doesn't have to be in memory at once.
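If MAT itself runs short of memory while parsing, you can raise its own heap in MemoryAnalyzer.ini, or pre-build the indexes headlessly with the ParseHeapDump script that ships with it. The exact values and report name below are illustrative, so check your MAT install:

    # MemoryAnalyzer.ini: add/adjust the heap setting after the -vmargs line
    -vmargs
    -Xmx4g

    # or parse the dump from the command line and generate the leak-suspects report
    ./ParseHeapDump.sh heap.hprof org.eclipse.mat.api:suspects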

broschb
This worked. I actually tried it before with some smaller heap dumps and it didn't give me any helpful info, but once I loaded the hprof produced by HeapDumpOnOutOfMemoryError, it finally pointed out the exact problem.
liam