I have an application that periodically needs to process large blocks of data with a computationally trivial algorithm. It turns out I can also avoid slowing the system down with hard drive accesses by keeping the blocks of data in a memory cache. The application is a low-priority application, so I'm working to minimize its impact on the system across the board, which means using extra memory (when available) to reduce the load on the CPU and hard drives. The cached data is just 64MB blocks of bytes, and the more of them I keep in memory, the less load the program puts on the drives.
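For concreteness, the cache is nothing more complicated than this (a minimal sketch; the names and the block key are illustrative, not my actual code):

    using System;
    using System.Collections.Generic;

    class BlockCache
    {
        const int BlockSize = 64 * 1024 * 1024; // each cached block is 64MB of bytes

        // Blocks keyed by whatever identifies them on disk (here: an arbitrary long id).
        readonly Dictionary<long, byte[]> _blocks = new Dictionary<long, byte[]>();

        public byte[] GetOrLoad(long id, Func<long, byte[]> loadFromDisk)
        {
            byte[] block;
            if (!_blocks.TryGetValue(id, out block))
            {
                block = loadFromDisk(id); // hit the drive only on a cache miss
                _blocks[id] = block;      // keep the block around to avoid re-reading it
            }
            return block;
        }

        // What I need is a way to call this (or have the runtime do the equivalent)
        // the moment another process needs the physical memory.
        public void Dump()
        {
            _blocks.Clear();
        }
    }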
What I need to do is dump the in-memory cache whenever any other application on the system needs more physical memory than is available, and do so fast enough that the user never feels the system slowing down due to high memory demands.
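To illustrate what I mean, the naive version would be to poll free physical memory and dump the cache when it drops below a threshold, along these lines (a rough sketch only: it assumes the Windows-only "Available MBytes" performance counter and a one-second poll, which may well be too slow or too coarse for the "user never feels it" requirement; a notification-based mechanism would be preferable):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class MemoryPressureWatcher
    {
        // Windows-only counter reporting free physical memory in MB.
        readonly PerformanceCounter _available =
            new PerformanceCounter("Memory", "Available MBytes");

        Timer _timer;

        public void Watch(BlockCache cache, int minimumFreeMegabytes)
        {
            _timer = new Timer(_ =>
            {
                // Treat low free physical memory as "another application needs it"
                // and give the whole cache back right away.
                if (_available.NextValue() < minimumFreeMegabytes)
                {
                    cache.Dump();
                    GC.Collect(); // prompt a collection so the freed blocks can be returned to the OS
                }
            }, null, 0, 1000); // poll once per second
        }
    }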
I'm particularly interested in how this would be accomplished in a .NET application.