views: 133
answers: 5

I am developing a large program which uses a lot of memory. The program is quite experimental and I add and remove big chunks of code all the time. Sometimes I will add a routine that is rather too memory-hungry, the hard drive will start thrashing, and the program (and the whole system) will slow to a snail's pace. It can easily take 5 minutes to shut it down!

What I would like is a mechanism for avoiding this scenario: either a run-time procedure, or even something to be done before running the program, which can say something like "If you run this program there is a risk of HDD thrashing - aborting now to avoid slowing to a snail's pace".

Any ideas?

EDIT: Forgot to mention, my program uses multiple threads.

A: 

I'd rather determine reasonable minimum requirements for the computer your program is supposed to run on, and during installation either warn the user if there's not enough memory available, or refuse to install.

Warning him each time he starts the program is nonsensical.

fvu
This is not a finished product. I am developing it. I need this warning for my own use on my machine.
Mick
@Mick: in that case, if you're testing a particularly hungry piece of code, test it on another box so you don't mess up your dev box.
Randolpho
Virtual machines and process working set size limits are both very useful on a single box.
Warren P
+3  A: 

Windows XP is terrible when multiple threads or processes access the disk at the same time. This is effectively what you experience when your application begins to swap, as the OS is writing out some pages while reading in others. Windows XP (and Server 2003, for that matter) is utter trash at this. That is a real shame, as it means swapping is almost synonymous with thrashing on these systems.

Your options:

  • Microsoft fixed this problem in Vista and Server 2008, so stop using a 9-year-old OS. :)
  • Use unbuffered I/O to read/write data to a file, and implement your own paging inside your application. Implementing your own "swap" like this lets you avoid thrashing (see the sketch after this list).
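A minimal sketch of the second option, assuming Windows and a hard-coded 4096-byte sector size (a real implementation should query this, e.g. with GetDiskFreeSpace); the file name myswap.bin is just a placeholder. FILE_FLAG_NO_BUFFERING bypasses the OS file cache entirely, but requires buffers, file offsets, and transfer sizes to be sector-aligned, which VirtualAlloc's page-aligned allocations satisfy:

    // Unbuffered "roll your own swap" building block (C++, Win32).
    #include <windows.h>
    #include <cstring>

    const DWORD kBlockSize = 4096;  // assumed sector size; query it in real code

    int main() {
        HANDLE file = CreateFileA("myswap.bin", GENERIC_READ | GENERIC_WRITE,
                                  0, NULL, OPEN_ALWAYS,
                                  FILE_FLAG_NO_BUFFERING | FILE_FLAG_WRITE_THROUGH,
                                  NULL);
        if (file == INVALID_HANDLE_VALUE) return 1;

        // VirtualAlloc returns page-aligned memory, satisfying the alignment rule.
        void* block = VirtualAlloc(NULL, kBlockSize, MEM_COMMIT | MEM_RESERVE,
                                   PAGE_READWRITE);

        // Write one application-managed "page" at an aligned offset...
        memset(block, 0xAB, kBlockSize);
        DWORD written = 0;
        SetFilePointer(file, 0, NULL, FILE_BEGIN);
        WriteFile(file, block, kBlockSize, &written, NULL);

        // ...and page it back in later, without touching the OS file cache.
        DWORD read = 0;
        SetFilePointer(file, 0, NULL, FILE_BEGIN);
        ReadFile(file, block, kBlockSize, &read, NULL);

        VirtualFree(block, 0, MEM_RELEASE);
        CloseHandle(file);
        return 0;
    }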

See this question for many more details of the problem: http://stackoverflow.com/questions/9191/how-to-obtain-good-concurrent-read-performance-from-disk

pauldoo
+2  A: 

I'm not familiar with Windows programming, but under Unix you can limit the amount of memory that a program can use with setrlimit(). Maybe Windows has something similar. The goal is to get the program to abort once it uses too much memory, rather than thrashing. The limit would be a bit less than the total physical memory on the machine. I would guess somewhere between 75% and 90%, but some experimentation would be necessary to find the optimal setting.
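A minimal sketch of that approach, assuming Linux/glibc (the _SC_PHYS_PAGES query is a glibc extension) and using the 75% figure above as a starting point:

    // Cap the address space so allocations fail instead of thrashing (C++, Linux).
    #include <sys/resource.h>
    #include <unistd.h>
    #include <cstdio>

    int main() {
        // Derive a cap from physical memory: pages * page size * 0.75.
        long pages = sysconf(_SC_PHYS_PAGES);
        long page_size = sysconf(_SC_PAGE_SIZE);
        rlim_t cap = (rlim_t)((long long)pages * page_size * 3 / 4);

        struct rlimit lim;
        lim.rlim_cur = cap;  // soft limit: malloc/new fail beyond this
        lim.rlim_max = cap;  // hard limit
        if (setrlimit(RLIMIT_AS, &lim) != 0) {
            perror("setrlimit");
            return 1;
        }
        // ... run the memory-hungry experimental code here ...
        return 0;
    }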

KeithB
@keithb: that would be a great solution if there were a Windows equivalent...
Mick
There is a way (I decided to put it in as an answer).
Warren P
+1  A: 

Chances are your program could use some memory management. While a few programs really do need to hold everything in memory at once, odds are good that with a little foresight you could rework your program to reuse or discard a lot of the memory it needs.

Your program will run much faster too. If you are using that much memory, then your CPU's first- and second-level caches are likely being thrashed as well, meaning the CPU is mostly waiting on memory loads instead of executing your code's instructions.

Edwin Buck
The program does indeed have to hold considerable amounts of data in memory. It manages a large tree structure and jumps around it unpredictably.
Mick
You might want to keep the data on disk then and use b-trees. Swapping is only going to put that memory on disk anyway, and b-trees are specifically optimized to exploit the performance characteristics of rotating disks. Set your node size to fit within a disk block, and odds are you will only need to keep four or so blocks in memory at any given time. Add an LRU-type cache to bring that up to ~100 blocks, and you can skip a lot of the disk accesses altogether (see the sketch below).
Edwin Buck
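A toy sketch of the LRU block cache described in the comment above; the sizes are the comment's suggestions, and load_block is a hypothetical stand-in for the real disk read:

    // LRU cache of fixed-size disk blocks (C++). Evicts the least recently
    // used block once ~100 blocks are resident.
    #include <cstddef>
    #include <cstdint>
    #include <list>
    #include <unordered_map>
    #include <vector>

    const size_t kBlockSize = 4096;  // match the disk block size
    const size_t kMaxCached = 100;   // ~100 blocks, as suggested above

    using Block = std::vector<uint8_t>;

    class BlockCache {
        using List = std::list<std::pair<uint64_t, Block>>;
        List lru_;                                        // front = most recent
        std::unordered_map<uint64_t, List::iterator> index_;

        Block load_block(uint64_t /*id*/) {               // stand-in for real disk I/O
            return Block(kBlockSize, 0);
        }

    public:
        Block& get(uint64_t id) {
            auto it = index_.find(id);
            if (it != index_.end()) {
                // Hit: move the block to the front of the LRU list.
                lru_.splice(lru_.begin(), lru_, it->second);
                return lru_.front().second;
            }
            // Miss: evict the least recently used block if the cache is full.
            if (lru_.size() == kMaxCached) {
                index_.erase(lru_.back().first);          // a real cache would flush dirty blocks
                lru_.pop_back();
            }
            lru_.emplace_front(id, load_block(id));
            index_[id] = lru_.begin();
            return lru_.front().second;
        }
    };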
+2  A: 

You could consider using SetProcessWorkingSetSize. This would be useful in debugging, because your app will crash with a fatal exception when it runs out of memory instead of dragging your machine into a thrashing situation.

http://msdn.microsoft.com/en-us/library/ms686234%28VS.85%29.aspx
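A minimal sketch of the call with placeholder limits (the 16 MB/256 MB figures are arbitrary); note that on some Windows versions the maximum is only advisory unless a hard limit is requested via SetProcessWorkingSetSizeEx with QUOTA_LIMITS_HARDWS_MAX_ENABLE:

    // Cap this process's working set during debugging (C++, Win32).
    #include <windows.h>
    #include <cstdio>

    int main() {
        SIZE_T minWs = 16 * 1024 * 1024;   // 16 MB minimum (placeholder)
        SIZE_T maxWs = 256 * 1024 * 1024;  // 256 MB cap (placeholder)
        if (!SetProcessWorkingSetSize(GetCurrentProcess(), minWs, maxWs)) {
            fprintf(stderr, "SetProcessWorkingSetSize failed: %lu\n", GetLastError());
            return 1;
        }
        // ... run the experimental, memory-hungry code under the cap ...
        return 0;
    }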

A similar SO question:

http://stackoverflow.com/questions/192876/set-windows-process-or-user-memory-limit

Warren P