views: 364

answers: 4
I thought that the maximum user address space for a 64-bit process was 8 TB, but I did a little test and the maximum I could get is 10-11 GB.

Note: I don't need that much memory in a process, I just want to understand why out of curiosity.

Here is my test program:

static void Main(string[] args)
{
    List<byte[]> list = new List<byte[]>();

    while (true)
    {
        Console.WriteLine("Press any key to allocate 1 more GB");
        Console.ReadKey(true);
        list.Add(new byte[1024 * 1024 * 1024]);

        Console.WriteLine("Memory size:");
        double memoryUsage = Process.GetCurrentProcess().PeakVirtualMemorySize64 / (double)(1024 * 1024 * 1024);
        Console.WriteLine(memoryUsage.ToString("0.00") + " GB");
        Console.WriteLine();
    }
}

EDIT:

Updated the test program to be more deterministic.

To accept an answer I would like to know how the real maximum allocated memory is calculated if 8TB is only theoretical.

+3  A: 

It's *up to* 8 TB, not a guaranteed 8 TB. You can potentially have up to 8 TB of address space, but you need the matching RAM/swap file to actually back it.

joemoe
You're likely filling up your physical RAM and swap file before an `OutOfMemoryException` occurs.
Will Eddins
I wonder what happens if RAM is completely full. How does `throw new OutOfMemoryException` get executed?
Earlz
@earlz - the memory to throw that exception had already been allocated by the runtime.
codekaizen
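
For what it's worth, here is a minimal sketch (mine, not from the comments) of the scenario Earlz asks about: the allocation loop runs inside a try/catch, and the catch still executes when memory runs out because, as codekaizen says, the runtime has already allocated the `OutOfMemoryException` object.

// Allocation loop wrapped in try/catch. The OutOfMemoryException object is
// pre-allocated by the CLR, so throwing and catching it does not itself need
// a new allocation. The 64 MB chunk size is arbitrary.
using System;
using System.Collections.Generic;

class OomDemo
{
    static void Main()
    {
        var chunks = new List<byte[]>();
        try
        {
            while (true)
                chunks.Add(new byte[64 * 1024 * 1024]); // 64 MB per iteration
        }
        catch (OutOfMemoryException)
        {
            // Reached once Windows refuses to commit any more memory.
            Console.WriteLine("Out of memory after ~{0} GB.", chunks.Count * 64 / 1024);
        }
    }
}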
+1  A: 

Try allocating one big chunk (as opposed to a list of 1 GB chunks):

Dim p As IntPtr = System.Runtime.InteropServices.Marshal.AllocHGlobal(New System.IntPtr(24L * 1024 * 1024 * 1024)) ' one 24 GB unmanaged allocation

Edit - Given your comment that you only have 4 GB of physical RAM, you really have no business allocating more than ~8 GB, and even that is pushing it.

Edit -

To accept an answer I would like to know how the real maximum allocated memory is calculated if 8TB is only theoretical.

The maximum amount of memory you can allocate is probably roughly equivalent to (page file size - the size of everything in RAM except that which cannot or will not be paged out) + (physical RAM size - the size of everything that cannot or will not be paged out, i.e. whatever is needed to keep your system going: the kernel, drivers, the .NET runtime, etc.). In other words, it's the commit limit (physical RAM + page file) minus what is already committed.

Of course the page file can grow...

Sooner or later paging to/from disk becomes too much and your system slows to a crawl and becomes unusable.
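
If you want to see the numbers behind that calculation on your own machine, here is a minimal sketch (not part of the original answer) that asks Windows for them via the Win32 `GlobalMemoryStatusEx` call: `ullTotalVirtual` is the theoretical 8 TB of user address space, while `ullTotalPageFile` is the commit limit (roughly physical RAM + page file) that allocations actually run into.

// Queries GlobalMemoryStatusEx and prints physical RAM, the commit limit,
// the commit still available, and the size of the user-mode address space.
using System;
using System.Runtime.InteropServices;

class CommitLimit
{
    [StructLayout(LayoutKind.Sequential)]
    class MEMORYSTATUSEX
    {
        public uint dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUSEX));
        public uint dwMemoryLoad;
        public ulong ullTotalPhys;
        public ulong ullAvailPhys;
        public ulong ullTotalPageFile;   // commit limit
        public ulong ullAvailPageFile;   // commit still available
        public ulong ullTotalVirtual;    // user-mode address space (8 TB on x64)
        public ulong ullAvailVirtual;
        public ulong ullAvailExtendedVirtual;
    }

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool GlobalMemoryStatusEx([In, Out] MEMORYSTATUSEX buffer);

    static void Main()
    {
        var status = new MEMORYSTATUSEX();
        if (GlobalMemoryStatusEx(status))
        {
            const double GB = 1024 * 1024 * 1024;
            Console.WriteLine("Physical RAM:        {0:0.00} GB", status.ullTotalPhys / GB);
            Console.WriteLine("Commit limit:        {0:0.00} GB", status.ullTotalPageFile / GB);
            Console.WriteLine("Commit available:    {0:0.00} GB", status.ullAvailPageFile / GB);
            Console.WriteLine("Virtual addr. space: {0:0.00} GB", status.ullTotalVirtual / GB);
        }
    }
}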

Read Mark Russinovich's blog: http://blogs.technet.com/markrussinovich/

Yeah! 640KB ought to be enough for anybody
Terry Mahaffey
@Terry - Way overused... not even funny... ~ Seriously, if you're trying to allocate over twice the amount of physical RAM on a Windows machine you'd better rethink your allocation strategy. Use memory-mapped files, etc. (see the sketch below).
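
As a rough illustration of that memory-mapped-file suggestion, here is a sketch (mine; the path and sizes are made up) using `MemoryMappedFile` from `System.IO.MemoryMappedFiles` (.NET 4+). The data lives in a file on disk and pages are brought into RAM only as they are touched, so you can address far more than you have physical memory.

// Creates a 20 GB file-backed mapping and touches its first and last bytes.
// Only the touched pages ever need to be resident in RAM.
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class MmfDemo
{
    static void Main()
    {
        long capacity = 20L * 1024 * 1024 * 1024; // 20 GB, backed by a file, not by RAM

        using (var mmf = MemoryMappedFile.CreateFromFile(
                   @"C:\temp\bigdata.bin", FileMode.Create, "bigdata", capacity))
        using (var view = mmf.CreateViewAccessor(0, capacity))
        {
            view.Write(0, (byte)42);            // first byte
            view.Write(capacity - 1, (byte)42); // last byte, ~20 GB away
            Console.WriteLine(view.ReadByte(capacity - 1));
        }
    }
}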
A: 

I'm guessing it's because you're using a `List<T>`, which I believe has an internal limit.

See what you can get if you try something like creating your own old-school linked list:

public class ListItem<T>
{
    public ListItem<T> Parent; // previous node in the chain
    public T Value;

    public ListItem(ListItem<T> parent, T item)
    {
        this.Parent = parent;
        this.Value = item;
    }
}

I have written almost exactly that code before (only my item was an int) and run it on a machine with 32 processors and 128 GB of RAM. It always crapped out at the same size no matter what, and it was always something related to Int32.MaxValue. Hope that helps.

NickLarsen
A `List<T>` can contain up to `Int32.MaxValue / 2` items (since it uses an Int32 to index the items). But the OP reaches 10 GB in chunks of 1 GB, so he only allocates about 10 chunks... which is far from the limit. Also, you don't need to reinvent the linked list; there is already a `LinkedList<T>` class in the BCL...
Thomas Levesque
Thanks for the reference to `LinkedList<T>`; I rarely use them, so I didn't know it was there.
NickLarsen
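
For completeness, a trivial sketch (mine) of the `LinkedList<T>` mentioned above, holding the same 1 GB chunks as the question's `List<byte[]>`:

using System;
using System.Collections.Generic;

class LinkedListDemo
{
    static void Main()
    {
        var chunks = new LinkedList<byte[]>();
        chunks.AddLast(new byte[1024 * 1024 * 1024]); // one 1 GB node
        Console.WriteLine(chunks.Count);
    }
}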
+3  A: 

It's your machine.

I have an x64 machine with 8 GB of RAM and a 12 GB page file; I ran your program and it topped out at 16.23 GB.

EPILOG: Then my Win7 install gradually slid into a coma as critical processes were apparently memory starved.

EDIT: If you want to understand how Windows allocates (i.e. reserves and commits) memory, read this: http://blogs.technet.com/markrussinovich/archive/2008/07/21/3092070.aspx and this: http://blogs.technet.com/markrussinovich/archive/2008/11/17/3155406.aspx

Since .NET relies on Windows to manage the memory it uses to build the GC heap, the mechanics of how Windows reserves and commits memory are reflected in how memory is allocated in .NET at a low level.
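
To make the reserve/commit distinction from those posts concrete, here is a minimal sketch (mine, not from the answer) that calls `VirtualAlloc` directly: reserving address space is nearly free, and only committed pages count against the commit limit, which is why 8 TB of address space does not mean 8 TB of usable memory.

// Reserve a large range of address space, then commit only a small part of it.
using System;
using System.Runtime.InteropServices;

class ReserveVsCommit
{
    const uint MEM_COMMIT = 0x1000, MEM_RESERVE = 0x2000, MEM_RELEASE = 0x8000;
    const uint PAGE_NOACCESS = 0x01, PAGE_READWRITE = 0x04;

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern IntPtr VirtualAlloc(IntPtr lpAddress, UIntPtr dwSize, uint flAllocationType, uint flProtect);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool VirtualFree(IntPtr lpAddress, UIntPtr dwSize, uint dwFreeType);

    static void Main()
    {
        // Reserve 64 GB of address space - cheap, nothing is backed by RAM or page file yet.
        IntPtr reserved = VirtualAlloc(IntPtr.Zero, (UIntPtr)(64UL << 30), MEM_RESERVE, PAGE_NOACCESS);

        // Commit 1 MB of it - only this part counts against the commit limit.
        IntPtr committed = VirtualAlloc(reserved, (UIntPtr)(1UL << 20), MEM_COMMIT, PAGE_READWRITE);
        Marshal.WriteByte(committed, 42); // now really backed by RAM/page file

        Console.WriteLine("Reserved 64 GB at 0x{0:X}, committed 1 MB", reserved.ToInt64());
        VirtualFree(reserved, UIntPtr.Zero, MEM_RELEASE);
    }
}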

codekaizen
`critical processes were apparently memory starved.` Ha! I would think that *critical* processes would be *exactly* the sort of processes that Win7 would prefer not to starve :)
Seth
Well, when memory is gone, it's gone!
codekaizen