I have an application which predictably generates out-of-memory errors on very, very (very) large datasets. We're trying to fix the problem by optimizing the app's memory management, but the very, very large datasets in question take so long to run (days) that it's difficult to iterate through testing cycles and find the problem empirically.

Leaving aside for a moment the question of application performance - that's next on the task list after correct application behavior:

Is there an easy way to restrict the amount of memory an application has available when running in debug mode in Visual Studio, so as to force the OutOfMemory errors that naturally occur only on very large datasets to occur on a smaller dataset instead?

+2  A: 

Simply allocate a large chunk yourself at program startup, before you do anything else.

To keep around 500 MB free (for a 32-bit process):

// Allocate all but roughly 500 MB (1024 * 1024 * 500 bytes) of the 32-bit address space
byte[] outOfMemory = new byte[int.MaxValue - (1024 * 1024 * 500)];
... do you mean 2 ^ 1024?
Skeolan
@Skeolan - No. Why don't you type that into a calculator and let me know what you get...
A: 

One trick that should work is to allocate a large buffer right when the program starts. As long as you keep a reference to it so that it isn't garbage collected, that will get you what you want.
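A minimal sketch of that idea in C# (the MemoryBallast name and the 1 GB figure are just placeholders for illustration):

// Hypothetical sketch: grab a fixed "ballast" allocation at startup so the
// rest of the application has correspondingly less memory to work with.
static class MemoryBallast
{
    // The static reference keeps the array reachable, so the GC never frees it.
    private static byte[] _ballast;

    public static void Reserve(int megabytes)
    {
        _ballast = new byte[megabytes * 1024 * 1024];
    }
}

// For example, as the first line of Main():
// MemoryBallast.Reserve(1024);   // leave roughly 1 GB less for the rest of the app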

R Samuel Klatchko
Good point about keeping a reference!
+2  A: 

Have you tried simply allocating a large amount of memory to begin with, and keeping it for the duration of the program execution?

This would reduce the available memory for the rest of the application.
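The answer doesn't include code, but a sketch of how that could look is below; reserving the memory in smaller chunks is my own assumption, so the test doesn't depend on finding one huge contiguous block in a fragmented 32-bit address space:

using System.Collections.Generic;

// Hypothetical sketch: hold the reservation as a list of 16 MB chunks; the
// static list keeps every chunk alive for the duration of the run.
static class ChunkedBallast
{
    private static readonly List<byte[]> Chunks = new List<byte[]>();

    public static void Reserve(int totalMegabytes)
    {
        const int chunkBytes = 16 * 1024 * 1024;   // 16 MB per chunk
        for (int reserved = 0; reserved < totalMegabytes; reserved += 16)
        {
            Chunks.Add(new byte[chunkBytes]);
        }
    }
}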

Lasse V. Karlsen
A: 

I'd get a memory profiler and try to re-create the issue artificially before it bites you in production. ANTS Memory Profiler is pretty good for that:

http://www.red-gate.com/products/ants_memory_profiler/index.htm

It's expensive, but the trial version would probably be enough for this issue.

Chris Smith