I've always been able to allocate 1400 megabytes for Java SE running on 32-bit Windows XP (Java 1.4, 1.5 and 1.6).

java -Xmx1400m ...

Today I tried the same option on a new Windows XP machine using Java 1.5_16 and 1.6.0_07 and got the error:

Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.

Through trial and error it seems 1200 megabytes is the most I can allocate on this machine.

Any ideas why one machine would allow 1400 and another only 1200?

Edit: The machine has 4GB of RAM with about 3.5GB that Windows can recognize.
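
For reference, the trial and error can be automated: spawn java -Xmx<n>m -version as a child process and binary-search the largest value for which the VM still starts. A minimal sketch, assuming java is on the PATH (the class name and search bounds are arbitrary):

import java.io.IOException;

// Probes the largest -Xmx this machine accepts by launching
// "java -Xmx<n>m -version" and checking the exit code.
public class MaxHeapProbe {

    static boolean heapWorks(int mb) throws IOException, InterruptedException {
        Process p = new ProcessBuilder("java", "-Xmx" + mb + "m", "-version")
                .redirectErrorStream(true)
                .start();
        while (p.getInputStream().read() != -1) {
            // drain output so the child can't block on a full pipe
        }
        return p.waitFor() == 0; // exit code 0 means the VM started
    }

    public static void main(String[] args) throws Exception {
        int lo = 256;  // assumed-good lower bound, in MB
        int hi = 4096; // assumed-bad upper bound (32-bit can't reach 4 GB)
        while (lo + 1 < hi) {
            int mid = (lo + hi) / 2;
            if (heapWorks(mid)) lo = mid; else hi = mid;
        }
        System.out.println("Largest working -Xmx: " + lo + "m");
    }
}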

+4  A: 

The JVM needs contiguous memory, and depending on what else is running, what was running before, and how Windows has managed memory, you may be able to get up to 1.4 GB of contiguous memory. I think 64-bit Windows will allow larger heaps.

James A. N. Stauffer
I think modern operating systems emulate contiguous memory for each process. Since the 80486, the x86 architecture has supported paging to make it easy to rearrange physical memory.
Mnementh
Mnementh: First, there is a specific API (AllocateUserPhysicalPages) in the WINAPI for advanced tools like databases and VMs that are better off managing their memory themselves, with Windows out of the way. Second, paging is an 80386 protected-mode feature, not 80486.
DrJokepu
A: 

Sun's JDK/JRE needs a contiguous block of memory if you allocate a huge heap.

The OS and the applications that start first tend to allocate bits and pieces during loading, which fragments the available address space. If a contiguous block is not available, Sun's JDK cannot use it. JRockit from BEA (acquired by Oracle) can allocate its heap from non-contiguous pieces.

anjanb
+2  A: 

I think it has more to do with how Windows is configured, as hinted at by this response: Java -Xmx Option

Some more testing: I was able to allocate 1300MB on an old Windows XP machine with only 768MB physical RAM (plus virtual memory). On my 2GB RAM machine I can only get 1220MB. On various other corporate machines (with older Windows XP) I was able to get 1400MB. The machine with the 1220MB limit is pretty new (just purchased from Dell), so maybe it has newer (and more bloated) Windows and DLLs (it's running Windows XP Pro Version 2002 SP2).

Steve Kuo
It could be affected by your virtual memory settings as well.
skaffman
All the machines I test with have virtual memory at least twice that of the physical RAM.
Steve Kuo
Note that you never really want Java to use virtual memory, because GC performance will become very bad. The amount of memory you can get depends on which DLLs have already been loaded and have fragmented the address space.
kohlerm
+3  A: 

Sun's JVM needs contiguous memory, so the maximum available heap size is dictated by memory fragmentation. Driver DLLs in particular tend to fragment the address space when they load at their predefined base addresses. So your hardware and its drivers determine how much memory you can get.

Two sources for this with statements from Sun engineers: a forum post and a blog entry.

Maybe try another JVM? Have you tried Harmony? I think they planned to allow a non-contiguous heap.

the.duckman
But I was able to allocate 1300MB on a machine with only 1GB of RAM (plus virtual memory). My 2GB RAM machine (also with virtual memory) can only allocate 1200MB.
Steve Kuo
+22  A: 

Keep in mind that Windows has virtual memory management and the JVM only needs memory that is contiguous in its address space. So, other programs running on the system shouldn't necessarily impact your heap size. What will get in your way are DLLs that get loaded into your address space. Unfortunately, optimizations in Windows that minimize the relocation of DLLs during linking make it more likely you'll have a fragmented address space. Things that are likely to cut into your address space, aside from the usual stuff, include security software, CBT software, spyware and other forms of malware. Likely causes of the variance are different security patches, C runtime versions, etc. Device drivers and other kernel bits have their own address space (the other 2 GB of the 4 GB 32-bit space).

You could try going through the DLL bindings in your JVM process and look at rebasing the DLLs into a more compact address space. Not fun, but if you are desperate...

Alternatively, you can just switch to 64-bit Windows and a 64-bit JVM. Despite what others have suggested, while it will chew up more RAM, you will have much more contiguous virtual address space, and allocating 2GB contiguously would be trivial.
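
As a crude check of how much of a configured heap you can actually use, you can allocate fixed-size blocks until OutOfMemoryError. A minimal sketch (the 16 MB block size and the class name are arbitrary):

import java.util.ArrayList;
import java.util.List;

// Allocates 16 MB blocks until the heap is exhausted, then reports the
// total. Run it with the -Xmx setting under test, e.g.:
//   java -Xmx1400m AllocProbe
public class AllocProbe {
    public static void main(String[] args) {
        final int blockMB = 16;
        List<byte[]> blocks = new ArrayList<byte[]>();
        try {
            while (true) {
                blocks.add(new byte[blockMB * 1024 * 1024]);
            }
        } catch (OutOfMemoryError e) {
            int totalMB = blocks.size() * blockMB;
            blocks = null; // free everything before touching the heap again
            System.out.println("Allocated about " + totalMB + " MB before OOM");
        }
    }
}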

Christopher Smith
Use Process Explorer to see where in memory DLLs are being loaded. Often some updated driver will stick itself in the middle of your address space. Using the REBASE command you can easily push these out of the way. Keep in mind, though, that the DLL is liable to get updated again and break things.
brianegge
I never accepted this as an answer and yet stackoverflow marked it as answered.
Steve Kuo
the bounty system works by auto-accepting the most upvoted answer at the expiration of the bounty
Steve Schnepp
+9  A: 

This has to do with contiguous memory.

Here's some info I found online from a previous time somebody asked this, supposedly written by a "VM god":

The reason we need a contiguous memory region for the heap is that we have a bunch of side data structures that are indexed by (scaled) offsets from the start of the heap. For example, we track object reference updates with a "card mark array" that has one byte for each 512 bytes of heap. When we store a reference in the heap we have to mark the corresponding byte in the card mark array. We right shift the destination address of the store and use that to index the card mark array. Fun addressing arithmetic games you can't do in Java that you get to (have to :-) play in C++.

Usually we don't have trouble getting modest contiguous regions (up to about 1.5GB on Windohs, up to about 3.8GB on Solaris. YMMV.). On Windohs, the problem is mostly that there are some libraries that get loaded before the JVM starts up that break up the address space. Using the /3GB switch won't rebase those libraries, so they are still a problem for us.

We know how to make chunked heaps, but there would be some overhead to using them. We have more requests for faster storage management than we do for larger heaps in the 32-bit JVM. If you really want large heaps, switch to the 64-bit JVM. We still need contiguous memory, but it's much easier to get in a 64-bit address space.
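
To make the addressing arithmetic concrete, here is an illustrative Java sketch of the card-mark computation described in the quote (in HotSpot this is C++ pointer arithmetic; the heap base and table size below are made up):

// One card byte covers 512 heap bytes, per the quote above.
public class CardMarkSketch {
    static final int CARD_SHIFT = 9; // log2(512)

    static final long HEAP_BASE = 0x20000000L;          // hypothetical heap start
    static final byte[] CARD_TABLE = new byte[1 << 20]; // covers a 512 MB heap

    // Conceptually invoked on every reference store into the heap.
    static void markCard(long storeAddress) {
        int index = (int) ((storeAddress - HEAP_BASE) >>> CARD_SHIFT);
        CARD_TABLE[index] = 1; // mark the card dirty
    }

    public static void main(String[] args) {
        markCard(HEAP_BASE + 4096); // a store 4 KB into the heap dirties card 8
        System.out.println("card 8 dirty? " + (CARD_TABLE[8] == 1));
    }
}

Note that the index computation only works because the heap is a single contiguous range starting at a fixed base, which is exactly why a chunked heap would add overhead.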

Uri
+5  A: 

According to this IBM document about the Java heap size (along with some hints about setting the right heap size), the limits for Windows are:

  • maximum possible heap size on 32-bit Java: 1.8 GB
  • recommended heap size limit on 32-bit Java: 1.5 GB (or 1.8 GB with /3GB option)

This doesn't help you get a bigger Java heap, but now you know you can't go beyond these values.
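
To see what a given -Xmx actually translates to inside the VM, you can print Runtime.maxMemory() (a standard API since Java 1.4); it often reports somewhat less than the requested value:

// Run with e.g.: java -Xmx1400m ShowMaxHeap
public class ShowMaxHeap {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}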

MicSim
+1  A: 

Oracle JRockit, which can handle a non-contiguous heap, can have a Java heap size of 2.85 GB on Windows 2003/XP with the /3GB switch. It seems that fragmentation can have quite an impact on how large a Java heap can be.

Kire Haglin
A: 

First, using a page file when you have 4 GB of RAM is useless. Windows can't access more than 4 GB (actually, less because of memory holes), so the page file is not used.

Second, the address space is split in two: half for the kernel, half for user mode. If you need more RAM for your applications, use the /3GB option in boot.ini, and make sure java.exe is marked as "large address aware" (Google for more info).

Third, I think you can't allocate the full 2 GB of address space because Java uses some memory internally (for threads, the JIT compiler, VM initialization, etc.). Use the /3GB switch for more.
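
For the "large address aware" part: if the Visual Studio tools are installed, the flag can be set with editbin (shown here on java.exe; back up the original first, since a JRE update may overwrite the change):

editbin /LARGEADDRESSAWARE java.exe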

The idea of a page file being useless with 4 GB of RAM is wrong. Without a page file, the operating system cannot evict unused process data (stack space for idle services, etc.) from physical RAM, thus lowering the amount of RAM available for real work. Having a page file frees up RAM.
Andrew Medico