Hi, I have a curious problem: I need to analyze a Java heap dump (from an IBM JRE) that is 1.5 GB in size. The problem is that while analyzing the dump (I've tried HeapAnalyzer and the IBM Memory Analyzer 0.5), the tools run out of memory and I can't really analyze the dump. I have 3 GB of RAM in my machine, but it seems that's not enough to analyze the 1.5 GB dump.

My question is: do you know of a specific tool for heap dump analysis (one that supports IBM JRE dumps) that I could run with the amount of memory I have?

Thanks.

+3  A: 

Try the SAP Memory Analyzer tool (now the Eclipse Memory Analyzer, MAT), which also has an Eclipse plugin. This tool creates index files on disk as it processes the dump file and requires much less memory than your other options. I'm pretty sure it supports the newer IBM JREs. That being said, with a 1.5 GB dump file you might have no other option but to run a 64-bit JVM to analyze it. I usually estimate that a heap dump file of size n takes 5*n memory to open using standard tools and 3*n memory to open using MAT, but your mileage will vary depending on what the dump actually contains.
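
If you use the standalone MAT distribution, you can also raise the heap available to the tool itself. A minimal sketch, assuming the launcher reads a MemoryAnalyzer.ini next to the executable (the -Xmx value below is illustrative, not a recommendation):

    -vmargs
    -Xmx3000m

Everything after -vmargs in that file is passed to the JVM that runs MAT; the same flag can usually be given on the command line instead, e.g. MemoryAnalyzer -vmargs -Xmx3000m. Note that on a 32-bit Windows JVM you typically cannot get much beyond roughly 1.4 to 1.6 GB of heap, which is why the 64-bit suggestion above matters.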

Amir Afghani
Agree with MAT. I have been able to open 1.5 GB heap dumps, but it was with a 64-bit JVM on a machine with 4 GB of memory.
Mark
I was able to run MAT on a Linux box. It's a 32-bit box, but it looks like Linux can actually handle a process of about 3 GB (I used -Xmx3000m). I finally managed to parse the dump, and now I'm able to analyze it from Windows =)
Abel Morelos
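
Abel's approach (parse where the memory is available, analyze somewhere else) can also be scripted. A sketch, assuming your standalone MAT build ships the headless ParseHeapDump.sh script and the IBM dump is a file named heapdump.phd (both names are assumptions about your setup):

    ./ParseHeapDump.sh heapdump.phd org.eclipse.mat.api:suspects

Parsing writes the index files (and, with the optional report argument, a leak-suspects report) next to the dump; the heap for this headless run is still controlled by MemoryAnalyzer.ini, so the -Xmx setting above applies here too. Copy the directory to the Windows machine afterwards and MAT should open it without re-parsing.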