views: 2181
answers: 6

Hi all,

I am doing some calculations that require a large array to be initialized. The maximum size of the array determines the maximum size of the problem I can solve.

Is there a way to programmatically determine how much memory is available for, say, the biggest possible array of bytes?

Thanks

+9  A: 

Well, relying on a single huge array has a range of associated issues - memory fragmentation, the need for a contiguous block, the limit on the maximum object size, etc. If you need a lot of data, I would recommend creating a class that simulates a large array using lots of smaller (but still large) arrays, each of fixed size - i.e. the indexer divides by the chunk size to find the appropriate array, then uses % to get the offset inside that array.
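A minimal sketch of that chunked approach (the name ChunkedArray, the chunk size, and the byte element type are illustrative choices, not anything from the answer; the last chunk is over-allocated slightly for simplicity):

using System;

public sealed class ChunkedArray
{
    // 16 MB chunks: big enough to be efficient, small enough to stay
    // far below the CLR's per-object size limit.
    private const int ChunkSize = 16 * 1024 * 1024;
    private readonly byte[][] chunks;

    public long Length { get; private set; }

    public ChunkedArray(long length)
    {
        if (length < 0) throw new ArgumentOutOfRangeException("length");
        Length = length;
        long chunkCount = (length + ChunkSize - 1) / ChunkSize;
        chunks = new byte[chunkCount][];
        for (long i = 0; i < chunkCount; i++)
            chunks[i] = new byte[ChunkSize];
    }

    public byte this[long index]
    {
        // Divide to pick the chunk, take the remainder for the offset.
        get { return chunks[index / ChunkSize][index % ChunkSize]; }
        set { chunks[index / ChunkSize][index % ChunkSize] = value; }
    }
}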

You might also want to ensure you are on a 64-bit OS, with lots of memory. This will give you the maximum available headroom.

Depending on the scenario, more sophisticated structures such as sparse arrays, eta-vectors, etc. might be of use to maximise what you can do. You might be amazed what people could do years ago with limited memory and just a tape spinning backwards and forwards...

Marc Gravell
A single huge array held for the life of the application wouldn't have any of the problems you mentioned. And sparse arrays don't make any sense if you plan on populating the entire array anyway.
Jonathan Allen
It will *absolutely* suffer from the maximum .NET object size and array size issues. And it could still easily suffer from the maximum contiguous block of space.
Marc Gravell
+2  A: 

The short answer is "no". There are two top-level resources that would need to be queried (a rough Windows approximation is sketched after the list):

  1. The largest block of unallocated virtual address space available to the process
  2. The amount of available page file space.
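Neither figure is exposed directly by the BCL, but on Windows the GlobalMemoryStatusEx API reports rough proxies for both. A hedged P/Invoke sketch - note that ullAvailVirtual is the total free address space in the process, not the largest contiguous block, so it can overstate what a single array allocation can actually get:

using System;
using System.Runtime.InteropServices;

class MemoryInfo
{
    [StructLayout(LayoutKind.Sequential)]
    private class MEMORYSTATUSEX
    {
        public uint dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUSEX));
        public uint dwMemoryLoad;
        public ulong ullTotalPhys;
        public ulong ullAvailPhys;
        public ulong ullTotalPageFile;
        public ulong ullAvailPageFile;
        public ulong ullTotalVirtual;
        public ulong ullAvailVirtual;
        public ulong ullAvailExtendedVirtual;
    }

    [DllImport("kernel32.dll", SetLastError = true)]
    [return: MarshalAs(UnmanagedType.Bool)]
    private static extern bool GlobalMemoryStatusEx([In, Out] MEMORYSTATUSEX lpBuffer);

    static void Main()
    {
        var status = new MEMORYSTATUSEX();
        if (GlobalMemoryStatusEx(status))
        {
            Console.WriteLine("Available virtual address space: {0}", status.ullAvailVirtual);
            Console.WriteLine("Available page file space:       {0}", status.ullAvailPageFile);
        }
    }
}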

As Marc Gravell correctly stated, you will have your best success on a 64-bit platform. Here, each process has a huge virtual address space. This will effectively solve your first problem. You should also make sure the page file is large.

But there is a better way that is limited only by the free space on your disk: memory-mapped files. You can create a large mapping (say 512MB) into an arbitrarily large file and move it as you process your data. Note: be sure to open it for exclusive access.
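One way to do this from managed code is the MemoryMappedFile wrapper in System.IO.MemoryMappedFiles (.NET 4 and later). A minimal sketch, where the file name, map name, and sizes are placeholder values:

using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class MappedDemo
{
    static void Main()
    {
        // Back the data with a file on disk; "data.bin", the 4 GB
        // capacity, and the 512 MB window are all placeholders.
        long capacity = 4L * 1024 * 1024 * 1024;
        using (var mmf = MemoryMappedFile.CreateFromFile(
            "data.bin", FileMode.Create, "bigdata", capacity))
        using (var view = mmf.CreateViewAccessor(0, 512L * 1024 * 1024))
        {
            // Read and write through the 512 MB window; create new
            // accessors at other offsets to "move" the window.
            view.Write(0L, (byte)42);
            byte b = view.ReadByte(0L);
            Console.WriteLine(b);
        }
    }
}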

Foredecker
+1  A: 

If you need really, really big arrays, don't use the CLR. Mono supports 64-bit array indexes, allowing you to fully take advantage of your memory resources.
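For illustration, something like the following only works on a Mono build configured with big-array support (historically the --enable-big-arrays configure option; that flag is my recollection, not something from the answer). On the standard CLR, arrays are capped at Int32.MaxValue elements, so this throws:

using System;

class BigArrayDemo
{
    static void Main()
    {
        // 3 billion elements: past the 2^31 - 1 limit of ordinary
        // CLR arrays.
        long size = 3L * 1024 * 1024 * 1024;
        byte[] big = new byte[size];
        big[size - 1] = 1;
        Console.WriteLine(big.LongLength);
    }
}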

Jonathan Allen
+1  A: 

To ensure you have enough free memory before attempting the allocation, you could use a MemoryFailPoint. If the memory cannot be allocated, an InsufficientMemoryException will be thrown, which you can catch and deal with in an appropriate way.
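A short sketch of how that looks (the 512 MB figure is arbitrary):

using System;
using System.Runtime;

class FailPointDemo
{
    static void Main()
    {
        try
        {
            // MemoryFailPoint takes the expected size in megabytes and
            // throws up front if that much memory is unlikely to be
            // available, rather than failing mid-computation.
            using (new MemoryFailPoint(512))
            {
                byte[] buffer = new byte[512 * 1024 * 1024];
                // ... run the calculation against buffer ...
            }
        }
        catch (InsufficientMemoryException ex)
        {
            Console.WriteLine("Cannot allocate: " + ex.Message);
        }
    }
}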

David Kirkland
A: 

I suppose binary search could be a way to go. Start by allocating 1 byte; if that succeeds, release the reference (set it to null so the GC can reclaim it) and double to 2 bytes. Keep doubling until an allocation fails, and you have found a lower bound on the limit.

The correct number of bytes that can be allocated (call it x) then lies in the interval lower <= x < 2 * lower. Continue searching this interval using binary search.
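A rough sketch of that probing idea. Treat the result as an estimate only: catching OutOfMemoryException this way is fragile, and the heap state changes between probes.

using System;

class AllocationProbe
{
    static bool CanAllocate(long size)
    {
        try
        {
            byte[] probe = new byte[size];
            probe[0] = 1; // touch it so the allocation isn't optimized away
            return true;  // probe goes out of scope here; the GC reclaims it
        }
        catch (OutOfMemoryException) { return false; }
        catch (OverflowException) { return false; } // past the max array length
    }

    static long FindMaxAllocation()
    {
        // Phase 1: double until an allocation fails.
        long lower = 1;
        while (CanAllocate(lower * 2))
            lower *= 2;

        // Phase 2: binary search the interval [lower, 2 * lower).
        long low = lower, high = lower * 2;
        while (low + 1 < high)
        {
            long mid = low + (high - low) / 2;
            if (CanAllocate(mid)) low = mid;
            else high = mid;
        }
        return low;
    }

    static void Main()
    {
        Console.WriteLine("Largest single byte[]: {0} bytes", FindMaxAllocation());
    }
}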

Alex
A: 

The biggest array one can allocate, even in a 64-bit .NET program, is 2GB, since that is the CLR's maximum size for a single object.

You could find out how many available bytes there are easily enough:


Dim freeBytes As Single
Using pc As New System.Diagnostics.PerformanceCounter("Memory", "Available Bytes")
    freeBytes = pc.NextValue() ' NextValue returns a Single
End Using

Given that information you should be able to make your decision.