What is the maximum memory the garbage collector can allocate for a .NET process? When I compile for x64, Process.GetCurrentProcess.MaxWorkingSet returns about 1.4 GB, but when I compile for AnyCPU (running as x64) the same number is returned. For x64 it should be more like the "Limit" value displayed in the Task Manager. How can I get, in all cases, the correct number that will cause an OutOfMemoryException when exceeded?

Some examples of what the method should return:

1) Machine configuration: x64 Windows, 4 GB physical memory, 4 GB page file
  - As 64-bit process: 8 GB
  - As 32-bit process: 1.4 GB

2) Machine configuration: x64 Windows, 1 GB physical memory, 2 GB page file
  - As 64-bit process: 3 GB
  - As 32-bit process: 1.4 GB

3) Machine configuration: 32-bit Windows, 4 GB physical memory, 4 GB page file
  - As 64-bit process: won't happen
  - As 32-bit process: 1.4 GB

4) Machine configuration: 32-bit Windows, 512 MB physical memory, 512 MB page file
  - As 64-bit process: won't happen
  - As 32-bit process: 1.0 GB
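For what it's worth, one way to approximate such a limit (an illustrative sketch, not something from this thread) is to P/Invoke GlobalMemoryStatusEx from kernel32: ullTotalVirtual reports the virtual address space of the calling process (so it differs between 32-bit and 64-bit automatically), and ullAvailPageFile reports the commit currently available, so the smaller of the two is a rough ceiling:

    using System;
    using System.Runtime.InteropServices;

    class MemoryLimit
    {
        [StructLayout(LayoutKind.Sequential)]
        class MEMORYSTATUSEX
        {
            public uint dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUSEX));
            public uint dwMemoryLoad;
            public ulong ullTotalPhys;
            public ulong ullAvailPhys;
            public ulong ullTotalPageFile;
            public ulong ullAvailPageFile;
            public ulong ullTotalVirtual;
            public ulong ullAvailVirtual;
            public ulong ullAvailExtendedVirtual;
        }

        [DllImport("kernel32.dll", SetLastError = true)]
        static extern bool GlobalMemoryStatusEx([In, Out] MEMORYSTATUSEX lpBuffer);

        static void Main()
        {
            MEMORYSTATUSEX status = new MEMORYSTATUSEX();
            if (!GlobalMemoryStatusEx(status))
                throw new System.ComponentModel.Win32Exception();

            // Address space of this process: ~2 GB for 32-bit, huge for 64-bit.
            Console.WriteLine("Total virtual:   {0:N0} bytes", status.ullTotalVirtual);
            // Commit currently available (physical memory + page file headroom).
            Console.WriteLine("Avail page file: {0:N0} bytes", status.ullAvailPageFile);
            // Rough upper bound on what further allocations could succeed.
            Console.WriteLine("Approx. limit:   {0:N0} bytes",
                Math.Min(status.ullTotalVirtual, status.ullAvailPageFile));
        }
    }

As the second answer below points out, fragmentation and a growable page file make any such number a hint rather than a guarantee.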

A: 

Doesn't it depend on how much RAM you have?

In theory, an x64 process can allocate exabytes of RAM, I think - i.e., a LOT. But if you did, your machine would start paging like crazy and generally die.

It was different in 32-bit mode: by default a process couldn't use more than 2 GB of address space in Windows (yes, there are ways around it, but it's not pretty). In practice, this meant roughly 1.2-1.4 GB usable per .NET process, as .NET itself reserves some of that space.

Either way, in 32-bit the most you can get is 3 GB - and only with the /3GB boot switch; by default the OS keeps 2 GB of the 4 GB virtual space for itself.

In 64-bit it should be 2^64, which is a big number, but http://en.wikipedia.org/wiki/X86-64 says current implementations are limited to 256 TB of virtual address space and 1 TB of REAL RAM. Either way, it's a lot more than you are likely to have in your machine, so you're going to hit the page file first.

With a 64-bit OS and a 64-bit runtime, .NET 2.0-based applications can now utilize 500 times more memory for data such as server-based caches.

This article has some good info, too: http://www.theserverside.net/tt/articles/showarticle.tss?id=NET2BMNov64Bit

BTW, if you are on an x64 machine (i.e., x64 hardware + x64 OS), compiling for AnyCPU and x64 does the same thing - it runs in x64 mode. The only difference is AnyCPU vs. x86 (see the sketch after this list for a quick runtime check):

  • x64 OS/.NET, AnyCPU: x64 app
  • x64 OS/.NET, x64: x64 app
  • x64 OS/.NET, x86: x86 app (the x64 .NET Framework has BOTH x86 and x64 versions of the Fx installed)
  • x86 OS/.NET, AnyCPU: x86 app
  • x86 OS/.NET, x64: CRASH AND BURN BABY! (actually, it just dies gracefully)
  • x86 OS/.NET, x86: x86 app
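A minimal way to confirm which mode you actually got (my own sketch; IntPtr.Size reflects the pointer width of the running process, not of the OS):

    using System;

    class BitnessCheck
    {
        static void Main()
        {
            // 4 in a 32-bit process, 8 in a 64-bit process,
            // whichever OS the binary happens to run on.
            Console.WriteLine("Running as a {0}-bit process", IntPtr.Size * 8);
        }
    }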
Nic Wise
But how can I get the real memory limit for a machine?
Rauhotz
+8  A: 

  • Windows can be configured to allocate more page file space on demand, or on request.
  • Job objects can prevent the consumption of more than a certain amount of memory.
  • The heap fragments, it is generational, and large stuff must go in the Large Object Heap.

All these mean that the hard limit is not much use in reality, and that answering the question "how much memory could I theoretically allocate?" is rather more complex than you think.

Since it is complex, anyone asking that question is probably trying to do something wrong and should redirect it to something more useful.

What are you trying to do that would appear to necessitate such a question?

"I just want to know when the current memory load of the process could get problematic so I can take actions like freeing some items of a custom cache."

Right, that is a much more tractable question.

Two solutions, in order of complexity:

  1. Make your caches use WeakReferences (a sketch follows this list)
    • This means that things will be freed by the system almost magically for you, but you will have little control over things like the replacement policy.
    • This relies on the cached data being much bigger than the key plus the overhead of a weak reference.
  2. Register for notification of garbage collections (a second sketch appears after the aside below)
    • This lets you take control of freeing things up.
    • You rely on the system having the appropriate maximum size for the GC generations, which may take a while to reach a steady state.
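A minimal sketch of the WeakReference approach (the class and its shape are mine, purely illustrative). The GC may reclaim any value at any time; the lookup then simply misses and the caller recomputes:

    using System;
    using System.Collections.Generic;

    class WeakCache<TKey, TValue> where TValue : class
    {
        // Only the small WeakReference wrappers are kept alive here;
        // the cached values themselves stay collectible.
        private readonly Dictionary<TKey, WeakReference> items =
            new Dictionary<TKey, WeakReference>();

        public void Add(TKey key, TValue value)
        {
            items[key] = new WeakReference(value);
        }

        public bool TryGetValue(TKey key, out TValue value)
        {
            WeakReference handle;
            if (items.TryGetValue(key, out handle))
            {
                // Target becomes null once the GC has collected the value.
                value = handle.Target as TValue;
                if (value != null)
                    return true;
                items.Remove(key); // prune the dead entry
            }
            value = null;
            return false;
        }
    }

As noted above, this only pays off when the cached values are much larger than a key plus a WeakReference, and the replacement policy is effectively whatever the GC decides.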

Points to note: is it really less expensive to maintain this massive cache (going to disk, by the sound of it) than to recalculate/re-request the data?
If your cache exhibits poor locality between commonly/consecutively requested items, then much effort will be spent paging data in and out. A smaller cache with an effective, tuned replacement policy stands a good chance of performing considerably better (and with much less impact on other running programs).

As an aside: in .NET, no variable-sized object (strings, arrays) can be more than 2 GB in size, due to limitations of the core CLR structures for memory management (and either solution above will benefit from this).
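And a sketch of the notification approach. GC.RegisterForFullGCNotification, GC.WaitForFullGCApproach and GC.WaitForFullGCComplete are the real API (available from .NET 3.5 SP1, and only with concurrent GC disabled); the MemoryPressure event is my own placeholder for whatever trims the cache:

    using System;
    using System.Threading;

    class GcWatcher
    {
        // Hypothetical hook: subscribe whatever trims your custom cache.
        public static event EventHandler MemoryPressure;

        public static void Start()
        {
            // Ask the runtime to warn when a full (gen 2) collection nears.
            // Thresholds run 1-99; larger values give earlier warnings.
            GC.RegisterForFullGCNotification(10, 10);

            Thread watcher = new Thread(delegate()
            {
                while (true)
                {
                    if (GC.WaitForFullGCApproach() == GCNotificationStatus.Succeeded)
                    {
                        // Full collection imminent: shed cache entries now.
                        EventHandler handler = MemoryPressure;
                        if (handler != null)
                            handler(null, EventArgs.Empty);
                    }
                    if (GC.WaitForFullGCComplete() != GCNotificationStatus.Succeeded)
                        break; // notification cancelled or failed
                }
            });
            watcher.IsBackground = true;
            watcher.Start();
        }
    }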

ShuggyCoUk
I just want to know when the current memory load of the process could get problematic, so I can take actions like freeing some items of a custom cache. The value doesn't need to be 100% accurate, just a hint for my cache. I currently use the amount of free physical memory as a limit, but for 32-bit processes the barrier is lower.
Rauhotz
BTW: good point about the dynamic page file size, I missed that, but for now I would be satisfied with the current limit.
Rauhotz
Aha - now we're getting somewhere - editing my answer now
ShuggyCoUk
Hmm, this question was a kind of follow-up to this one: http://stackoverflow.com/questions/930198/does-weakreference-make-a-good-cache. I am going round in circles.
Rauhotz
OK, but you're starting to get somewhere. It sounds strongly to me like you would benefit from the GC notification API then; if there are no gen 2 collections, you'll be fine.
ShuggyCoUk