I have an application that predictably generates out-of-memory errors on very, very (very) large datasets. We're trying to fix the problem by optimizing the app's memory management, but the very large datasets in question take so long to run (days) that it's difficult to iterate through testing cycles and pin down the problem empirically.
Leaving aside for a moment the question of application performance - that's next on the task list after correct application behavior:
Is there an easy way to restrict the amount of memory an application has available when running in debug mode in Visual Studio, so that OutOfMemory errors that naturally occur only on very large datasets can be forced to occur on a smaller dataset instead?
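For context, the kind of thing I could imagine as a fallback (purely a sketch, assuming the app is .NET; the class name, chunk size, and 3 GB figure are all illustrative and not part of my actual code) is a debug-only "ballast" allocation that eats most of the headroom at startup, so the same error reproduces on a much smaller dataset:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical helper: allocate and touch "ballast" buffers at startup so the
// remaining free memory is small and OutOfMemoryException shows up sooner.
static class MemoryBallast
{
    // Keep references alive so the GC cannot reclaim the ballast.
    private static readonly List<byte[]> Chunks = new List<byte[]>();

    public static void Reserve(long bytesToConsume)
    {
        const int chunkSize = 64 * 1024 * 1024; // 64 MB per chunk (arbitrary)
        long reserved = 0;
        while (reserved < bytesToConsume)
        {
            var chunk = new byte[chunkSize];

            // Touch one byte per page so the memory is actually committed,
            // not merely reserved address space.
            for (int i = 0; i < chunk.Length; i += 4096)
                chunk[i] = 1;

            Chunks.Add(chunk);
            reserved += chunkSize;
        }
    }
}

class Program
{
    static void Main()
    {
#if DEBUG
        // Illustrative: tie up ~3 GB so only the remainder is available.
        MemoryBallast.Reserve(3L * 1024 * 1024 * 1024);
#endif
        Console.WriteLine("Running with reduced memory headroom.");
    }
}
```

Something OS-level, like a Windows job object with a process memory limit, would presumably be cleaner, but I'd prefer a switch I can flip from inside Visual Studio if one exists.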