Hi.
I'm making an application that has to store a lot of data in memory to improve calculation performance.

It is a hierarchy of lists and objects where the top object is a QList&lt;myObject*&gt;. When loading data, many instances of myObject are created with new and added to the list. Memory consumption grows, and when it reaches ~1.9 GB the program crashes. My computer (Vista) has 4 GB of RAM, and I have tested on other computers with less RAM (XP); it crashes at the same point. Can I really not use more than 1.9 GB of RAM?

When a smaller file is loaded and memory usage according to the Windows Task Manager is (say) 1.2 GB, I can work with the data. But if I then want to load another file, the growth starts from 1.2 GB, even after calling delete on all the objects and clearing the list. Why?
I tried switching to QVector and calling squeeze(), but memory usage stays the same. I have read the other threads here about dynamic memory allocation in QLists, but is there really no way to reset the memory before I load a new file? Especially since it crashes after 1.9 GB; loading three small files sequentially gets me there.
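
For reference, my cleanup looks roughly like the sketch below (DataStore, myObject and m_objects are placeholder names for my real classes), so I believe I am deleting everything before each load:

    #include <QList>
    #include <QString>
    #include <QtAlgorithms>

    // Placeholder for the real data class.
    class myObject { /* ... */ };

    class DataStore
    {
    public:
        ~DataStore() { clearData(); }

        void clearData()
        {
            qDeleteAll(m_objects);   // delete every myObject* in the list
            m_objects.clear();       // drop the now-dangling pointers
            // With QVector instead of QList, squeeze() can also be
            // called here to release the spare capacity.
        }

        void loadFile(const QString &fileName)
        {
            clearData();             // free the previous file's data first
            // ... parse fileName and append new objects:
            // m_objects.append(new myObject(/* ... */));
            Q_UNUSED(fileName);
        }

    private:
        QList<myObject*> m_objects;
    };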

Thanks a lot for any suggestions.

+4  A: 

If you have 32-bit Windows, then your process can only use 2 GB of memory. By default the other 2 GB of the 32-bit address space is reserved for the kernel, so you simply cannot address more from a 32-bit user-mode process, no matter how much RAM is installed. If you need more memory, you should probably switch to 64-bit Windows.
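
As a quick sanity check (just a sketch), you can print the pointer size of your build; a 32-bit build reports 32 bits and is subject to the 2 GB limit regardless of how much RAM the machine has:

    #include <QDebug>

    int main()
    {
        // 32-bit build -> 32; 64-bit build -> 64.
        qDebug() << "pointer size:" << int(sizeof(void*) * 8) << "bits";
        return 0;
    }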

Roku
For the record, on 64-bit Windows, 32-bit processes can use at most 4 GB of memory.
Francis Gagné