I have a Windows XP 32-bit machine and I am using Visual C++ to test this scenario. I create a vector as vector<__int64> v and note the virtual memory usage; say it is 400 KB. Then I push around 5 million integers into it and note the virtual memory again: it has increased to around 900 KB. Now I call erase or clear on the vector and check the virtual memory: it is 600 KB.

I try the same scenario with vector<int> v. This time the virtual memory usage is exactly the same before populating the vector and after clearing it.

Why is there this difference in memory behavior?

From comments: SmartHeap is used.
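
A minimal sketch of the test described above (a reconstruction; the question gives only the steps, so the prompts and loop bound are illustrative):

#include <vector>
#include <iostream>

int main()
{
    std::vector<__int64> v;

    std::cout << "Empty vector; note the virtual memory, then press Enter" << std::endl;
    std::cin.get();

    for (__int64 i = 0; i < 5000000; ++i)   // ~5 million elements
        v.push_back(i);

    std::cout << "Populated; note the virtual memory, then press Enter" << std::endl;
    std::cin.get();

    v.clear();   // destroys the elements; the buffer may or may not shrink

    std::cout << "Cleared; note the virtual memory, then press Enter" << std::endl;
    std::cin.get();
    return 0;
}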

+3  A: 

The runtime will not always release memory back to the OS; it keeps memory around in case it's needed again in the future. Sometimes the memory is also quite fragmented, so there's no easy way to release it back to the OS.

The allocator will usually reserve memory in chunks bigger than your program allocates, so looking at the process's memory usage might not reflect your program's actual allocations and deallocations.

It's all virtual memory anyhow, so it matters less than exhausting the physical memory of your machine would.
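
A tiny experiment to observe this directly (illustrative only; the block size is arbitrary): free one large allocation and watch whether the process's virtual memory actually drops:

#include <iostream>

int main()
{
    char* p = new char[50 * 1024 * 1024];   // one large 50 MB block
    std::cout << "Allocated; note the virtual memory, press Enter" << std::endl;
    std::cin.get();

    delete[] p;   // goes back to the runtime heap, not necessarily to the OS
    std::cout << "Freed; note the virtual memory again, press Enter" << std::endl;
    std::cin.get();
    return 0;
}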

nos
It does matter, since later in my application I run out of virtual memory.
sameer karjatkar
Fair enough, but nos's main point is that memory doesn't necessarily get released to the OS just because you've released it from your control. Memory management is complex and very rarely if ever as straightforward as allocate X, get X back when you deallocate.
quark
These are valid points. The small experiment showing potentially unreleased memory may have nothing to do with running out of virtual memory (which could be a leak in the code somewhere).
nik
@quark Why do I get the exact memory back when I fiddle around with an ordinary int?
sameer karjatkar
@nik What could the memory leak be in five lines of code? int _tmain(int argc, _TCHAR* argv[]) { vector<int> v; int cnt; for (cnt = 0; cnt < 552000; cnt++) v.push_back(cnt); v.clear(); v.empty(); return 0; }
sameer karjatkar
+3  A: 

In addition to the point that memory is not always released to the OS, your tests might also be affected by the fact that erasing elements from the vector, or clearing it, does not reduce the size of the buffer the vector has allocated.

To guarantee that the memory is released, make sure your vector goes out of scope, or swap it with a temporary vector:

{
    std::vector<__int64> temp;  // empty temporary vector
    v.swap(temp);               // v's large buffer now belongs to temp
}                               // temp is destroyed here, freeing the buffer
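Equivalently, the same trick can be written inline with an unnamed temporary:

std::vector<__int64>().swap(v);  // buffer moves into the temporary, which dies at the end of the statement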
Sergii Volchkov
Obviously the memory is reduced only after it goes out of scope. Those are the results I already mentioned.
sameer karjatkar
A: 

The main reason is memory fragmentation, which does not allow the runtime to release all virtual memory. You should consider using a custom allocator for such big allocations. Check out boost::pool_allocator as a very good implementation of a custom allocator.

Using pools gives you more control over how memory is used in your program. For example, you could have a situation where you want to allocate a bunch of small objects at one point, and then reach a point in your program where none of them are needed any more. Using pool interfaces, you can choose to run their destructors or just drop them off into oblivion; the pool interface will guarantee that there are no system memory leaks. A sketch of this approach follows.
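
A minimal sketch (assuming Boost.Pool is available; the element count is illustrative): back the vector with boost::pool_allocator, then purge the singleton pool behind it to hand its memory back explicitly:

#include <vector>
#include <boost/pool/pool_alloc.hpp>

int main()
{
    {
        // vector whose storage comes from the pool allocator
        std::vector<__int64, boost::pool_allocator<__int64> > v;
        for (__int64 i = 0; i < 5000000; ++i)
            v.push_back(i);
    }   // the vector is gone, but the singleton pool still holds its blocks

    // Explicitly release everything the pool acquired back to the system.
    boost::singleton_pool<boost::pool_allocator_tag,
                          sizeof(__int64)>::purge_memory();
    return 0;
}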

Kirill V. Lyadvinsky
@Jla3ep I am using SmartHeap.
sameer karjatkar
Maybe it is a SmartHeap implementation issue then?
Kirill V. Lyadvinsky
A: 

Guys, I found the issue.

For __int64: if I reserve some space for the vector in the first place, then do the insertions, and then clear the vector, I get the original memory back. However, what puzzles me is why this approach is not required for a normal vector of int.
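
A minimal sketch of that sequence (a reconstruction; only the approach was described):

#include <vector>

int main()
{
    std::vector<__int64> v;
    v.reserve(5000000);                  // one big allocation up front

    for (__int64 i = 0; i < 5000000; ++i)
        v.push_back(i);                  // no reallocations, so no fragmentation

    v.clear();                           // the standard does not require this to
                                         // shrink the buffer; the release seen
                                         // here is heap-dependent behavior
    return 0;
}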

sameer karjatkar
I had this thought. If you reserve, a single allocation is made from the heap. This does not fragment, and the clear will be able to release it back to the heap easily.
nik