I'm currently working on a C++ program on Windows XP that processes large sets of data. Our largest input file causes the program to terminate unexpectedly with no error message of any kind. Interestingly, when the program is run from our IDE (Code::Blocks), the same file is processed without any issues.
As the data is processed, it's placed into a tree structure. After we finish our computations, the data is moved into a C++ STL vector before being sent to OpenGL for rendering.
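For context, the copy step is essentially the sketch below; the type and function names are placeholders rather than our actual code:

```cpp
#include <vector>

// Placeholder types -- our real node and element are larger, but the
// copy pattern is the same.
struct DataPoint { float x, y, z; };

struct Node {
    DataPoint value;
    Node*     left;
    Node*     right;
};

// Simplified version of our copy step: walk the tree and push each
// node's data into the vector that later gets handed to the renderer.
void flatten(const Node* n, std::vector<DataPoint>& out)
{
    if (!n) return;
    flatten(n->left, out);
    out.push_back(n->value);   // each push_back may trigger a reallocation
    flatten(n->right, out);
}
```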
I was hoping to gain some insight into what might be causing this crash. I've already looked at another post (which I can't link to since I'm a new user) describing a very similar issue; in that case the cause turned out to be an out-of-bounds array index. However, I'm fairly sure no such out-of-bounds access is happening here.
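For what it's worth, the kind of check I have in mind is swapping operator[] for .at() in a debug build, since .at() throws std::out_of_range on a bad index instead of silently trampling memory. A minimal illustration (not our real code):

```cpp
#include <iostream>
#include <stdexcept>
#include <vector>

int main()
{
    std::vector<int> v(10, 0);

    try {
        // at() is bounds-checked and throws, unlike operator[].
        int x = v.at(42);   // deliberately out of range
        std::cout << x << '\n';
    } catch (const std::out_of_range& e) {
        std::cerr << "out of range: " << e.what() << '\n';
    }
    return 0;
}
```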
I'm wondering whether the size of the data set is causing problems when allocating space for the vector. The machines I've been testing on should, in theory, have enough memory to handle the data (2GB of RAM, with the data set taking up roughly 1GB). Of course, as I understand it, an STL vector grows by roughly doubling its allocated capacity whenever it fills up, which means each reallocation briefly needs room for both the old and the new buffer.
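If that were the problem, I'd expect push_back to throw std::bad_alloc rather than having the process simply vanish, so one thing I plan to try is reserving the final size up front and wrapping the copy in a try/catch. A rough sketch, assuming we can get a node count from the tree (countTreeNodes and copyTreeInto are hypothetical stand-ins for our real code):

```cpp
#include <cstddef>
#include <iostream>
#include <new>      // std::bad_alloc
#include <vector>

struct DataPoint { float x, y, z; };   // same placeholder type as above

// Hypothetical helpers standing in for our real code.
std::size_t countTreeNodes();                    // assumed: we can count nodes cheaply
void copyTreeInto(std::vector<DataPoint>& out);  // assumed: the existing copy step

bool buildRenderBuffer(std::vector<DataPoint>& out)
{
    try {
        // Reserving the full size once avoids the repeated grow-and-copy
        // cycle, which briefly needs the old and the new buffer at the
        // same time -- and, as I understand it, a 32-bit process also
        // needs one contiguous block out of roughly 2GB of user address
        // space, no matter how much physical RAM is installed.
        out.reserve(countTreeNodes());
        copyTreeInto(out);
        return true;
    } catch (const std::bad_alloc&) {
        std::cerr << "vector allocation failed\n";
        return false;
    }
}
```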
Thanks, Eric