views:

75

answers:

2

Hello

I have a Java program that performs a lot of mathematical operations and handles a lot of object instances. The most interesting thing I noticed is that on different computers, the memory consumption is drastically different.

On an Intel Core 2 Duo (2 GHz) with 2 GB of RAM, running WinXP 32-bit, my program uses around 185 MB of memory. The JVM options are -Xms768m -Xmx1300m (if I set more than 1300m, I get an out-of-memory exception at runtime).

On a Turion X2 (2.1 GHz) with 3 GB of RAM, running WinXP 32-bit, my program uses around 380 MB of memory. The JVM options are -Xms768m -Xmx1600m (1600m is the highest value at which my computer will still run the program).

Do you know why there is such a big difference?
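One thing worth checking first: Task Manager reports memory the JVM has *reserved*, while the heap the program actually uses can be much smaller. A minimal sketch (standard `java.lang.Runtime` API only) that prints both, to run on each machine:

```java
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long total = rt.totalMemory(); // heap currently committed by the JVM
        long free  = rt.freeMemory();  // unused part of the committed heap
        long max   = rt.maxMemory();   // the -Xmx limit as the JVM sees it
        long used  = total - free;     // what the program is really holding
        System.out.printf("used: %d MB, committed: %d MB, max: %d MB%n",
                used >> 20, total >> 20, max >> 20);
    }
}
```

If "used" is similar on both machines but "committed" differs, the gap is just the JVM's allocation policy, not the program.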

+4  A: 

I imagine the garbage collector is more lenient with more memory to play with.

Gary
That's right; garbage collection takes time, and the garbage collector won't act if it doesn't appear to be necessary.
erickson
See [here](http://www.codinghorror.com/blog/2006/09/why-does-vista-use-all-my-memory.html) for why this makes sense.
BlueRaja - Danny Pflughoeft
I don't get it. So the more memory is allocated, the bigger the consumption?
marionmaiden
P.S.: I'm forcing GC during execution, but I'm aware it may not run if it's not necessary.
marionmaiden
Yeah, I think that's not guaranteed. http://www.jchq.net/certkey/0301certkey.htm has a line: "you cannot force garbage collection, only suggest it"
Gary
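The advisory nature of `System.gc()` can be seen with a weak reference. Whether the referent is actually reclaimed after the call is entirely up to the VM; this is just an illustrative sketch, not a guarantee:

```java
import java.lang.ref.WeakReference;

public class GcHint {
    public static void main(String[] args) {
        Object payload = new byte[1 << 20];            // 1 MB, strongly referenced
        WeakReference<Object> ref = new WeakReference<>(payload);
        System.out.println(ref.get() != null);         // true: still strongly reachable

        payload = null;   // drop the strong reference
        System.gc();      // only a *suggestion* to collect

        // ref.get() may now be null, or may not: the spec makes no promise
        System.out.println("after gc: "
                + (ref.get() == null ? "collected" : "still there"));
    }
}
```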
Basically, there's no reason to keep memory around unused; you should always use it for a cache or something. People always flip out when they see high memory usage, but it's usually more efficient to use more memory than less. Even so, Java is considered a bloated hog.
Gary
The problem is that I'm getting an out-of-memory exception when I work with more data. So I need to allocate a good amount of memory, but I also need to keep consumption as low as I can.
marionmaiden
Have you considered using a persistent disk store with map/reduce, or a database, to deal with your massive amounts of data? Otherwise, you should do some optimization: go to 64-bit and add RAM, use C for more control over memory and less overhead, apply the flyweight pattern, etc.
Gary
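The flyweight idea mentioned above, sketched with a hypothetical immutable `Point` value class (the names are illustrative, not from the question): identical instances are shared through an interning cache instead of being allocated over and over.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical immutable value object; sharing instances avoids one allocation per use.
final class Point {
    final int x, y;
    private Point(int x, int y) { this.x = x; this.y = y; }

    private static final Map<Long, Point> CACHE = new HashMap<>();

    // Flyweight factory: reuse the existing instance for the same coordinates.
    static Point of(int x, int y) {
        long key = ((long) x << 32) | (y & 0xffffffffL);
        return CACHE.computeIfAbsent(key, k -> new Point(x, y));
    }
}

public class FlyweightDemo {
    public static void main(String[] args) {
        // Same coordinates yield the same shared instance
        System.out.println(Point.of(1, 2) == Point.of(1, 2));   // prints true
    }
}
```

This only pays off when many logically equal, immutable objects are alive at once, which sounds like it may apply to a math-heavy program.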
+1  A: 

To do a proper comparison you should:

  • set the exact same VM parameters.
  • state whether the VM itself (vendor and version) is the same.
  • run the program with the exact same input parameters.
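The first two points can be verified from inside the running program using standard system properties and `Runtime`; a small sketch to run on both machines and compare:

```java
public class VmInfo {
    public static void main(String[] args) {
        // Which VM and version -- these should match on both machines
        System.out.println(System.getProperty("java.vm.name"));
        System.out.println(System.getProperty("java.version"));
        // Effective -Xmx as seen by the running program
        System.out.println("max heap: "
                + (Runtime.getRuntime().maxMemory() >> 20) + " MB");
    }
}
```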

Most likely it is as Gary suggests: just the VM jumping around, or choosing to GC at some different time than when you are looking. If the consumption is 'real', it could be some difference in how HotSpot (which I guess looks at processor cache sizes) chooses to pack your Java objects' member fields into a contiguous structure; perhaps it is adding a few bytes of padding (to some object you have a lot of) on one platform and not on the other.

Justin