In the app I'm writing, I use a lot of in-memory containers (C++ std containers, but I don't think that's relevant).
During one "task" of my app, in a heavy-usage edge case, the private bytes memory usage hits 1 GB.
Just as a bit of context: this task is user-initiated and involves hundreds of thousands of files. It's likely that the user will kick it off and then leave the machine running.
(And no, I don't do anything dumb like load files into memory - this footprint is all metadata related to the task in progress.)
For most users the memory usage during this task is negligible - it's just the 1% of users who want to do 500,000 "things" instead of 5,000 "things".
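(For concreteness, this is the counter I mean by "private bytes" - a minimal sketch of how I can read it via GetProcessMemoryInfo from psapi.h; the helper name and the MB formatting are just illustrative:)

```cpp
#include <windows.h>
#include <psapi.h>   // link with psapi.lib
#include <cstdio>

// Private bytes = committed memory not shared with other processes;
// PROCESS_MEMORY_COUNTERS_EX::PrivateUsage is the documented counter.
size_t PrivateBytes()
{
    PROCESS_MEMORY_COUNTERS_EX pmc = {};
    if (GetProcessMemoryInfo(GetCurrentProcess(),
                             reinterpret_cast<PROCESS_MEMORY_COUNTERS*>(&pmc),
                             sizeof(pmc)))
        return pmc.PrivateUsage;
    return 0;
}

int main()
{
    std::printf("Private bytes: %zu MB\n", PrivateBytes() / (1024 * 1024));
}
```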
I was about to embark on a process of moving a lot of this in-memory stuff to disk somehow, e.g. a scratch file or an embedded DB.
But then I thought: "Hang on a minute. All of these solutions are essentially caching memory to disk. Isn't that what virtual memory is for?"
I'm not interested in persisting this data - it's purely scratch/temporary stuff I need access to while the task is running.
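(To make that thought concrete: as I understand it, I could even ask for "scratch memory the OS pages for me" explicitly, via a pagefile-backed file mapping - a sketch under that assumption, with error handling omitted and the helper name made up:)

```cpp
#include <windows.h>

// A pagefile-backed section: behaves like ordinary memory, the OS pages
// it out under pressure, and nothing persists once it's unmapped.
void* AllocScratch(SIZE_T bytes)
{
    ULARGE_INTEGER size;
    size.QuadPart = bytes;

    HANDLE mapping = CreateFileMappingW(
        INVALID_HANDLE_VALUE,          // backed by the pagefile, not a file
        nullptr, PAGE_READWRITE,
        size.HighPart, size.LowPart,
        nullptr);
    if (!mapping) return nullptr;

    void* view = MapViewOfFile(mapping, FILE_MAP_ALL_ACCESS, 0, 0, bytes);
    CloseHandle(mapping);              // open views keep the section alive
    return view;                       // free later with UnmapViewOfFile
}
```

Though as far as I can tell, ordinary heap allocations are already backed by the pagefile in exactly the same way - which is rather my point.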
So my question is, what should I do?
I don't want to do a major refactor for that 1%, but I do want to understand the impact of running an app with such a high memory footprint.
Am I right in saying that I probably wouldn't be able to do much better than the Windows VM manager anyway?
Under what conditions does this become harmful? OK, so yes, if I used up all the physical memory then it'd thrash reloading pages - but wouldn't I have that anyway with, e.g., an embedded database?
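(If it helps frame the question: I could presumably detect that harmful condition at runtime rather than refactor up front - a sketch using GlobalMemoryStatusEx, where the 75% threshold is a number I've made up purely for illustration:)

```cpp
#include <windows.h>

// dwMemoryLoad is the system-wide percentage of physical memory in use.
// The 75% threshold is arbitrary - illustration only.
bool SystemUnderMemoryPressure()
{
    MEMORYSTATUSEX status = {};
    status.dwLength = sizeof(status);
    if (!GlobalMemoryStatusEx(&status))
        return false;
    return status.dwMemoryLoad > 75;
}
```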
Cheers,
John