I am debugging a program that fails during a low memory situation and would like a C++ program that just consumes a LOT of memory. Any pointers would help!

+2  A: 

I know it's a leak, but pointers will help :)

int main()
{
    for(;;)
    {
        char *p = new char[1024*1024];
    }
    // optimistic return :)
    return 0;
}
Dmitry
Allocating huge chunks will eventually fail, but it does not help in making the failure happen in a controlled way. Also, the OS will swap pages out to disk, since allocations are not limited by the amount of physical RAM.
Martin York
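A variant of the loop above that also writes to each block forces the pages to be committed rather than merely reserved, which puts more real pressure on physical memory and swap. This is only a minimal sketch; as noted in the comment above, it still does not fail in a controlled way:

#include <cstring>

int main()
{
    for(;;)
    {
        char *p = new char[1024*1024];
        std::memset(p, 1, 1024*1024);   // touch every page so the OS must actually commit it
    }
}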
+12  A: 

Are you on the Windows platform? (Looking at the username... perhaps not :) ) If you are in Windows land, AppVerifier has a low-memory simulation mode. See the Low Resource Simulation test.

dirtybird
+1 I've not used AppVerifier itself, but something like this is a much better idea. You don't want everything on your system to suffer (like your debugger!) in your low memory situation.
luke
+3  A: 

Just write a C++ app that creates a giant array.

RHicke
You may need to disable the virtual memory manager (limit the OS to using only physical RAM). You could even go so far as to remove some RAM from your system, so you hit the low-memory point sooner.
Mordachai
The trouble here is that it becomes difficult to write tests that fail consistently in any environment.
Martin York
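Taken literally, the suggestion above can look like the following. This is only a minimal sketch: the 2 GiB figure is arbitrary, and as the comments note the behaviour still depends on swap and virtual-memory settings.

#include <vector>
#include <cstddef>

int main()
{
    const std::size_t giant = std::size_t(2) * 1024 * 1024 * 1024;   // 2 GiB, arbitrary
    std::vector<char> data(giant, 1);   // every element is written, so every page is actually touched
    return data[0];
}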
+7  A: 

If you're using Unix or Linux, I'd suggest using ulimit:

bash$ ulimit -a
core file size        (blocks, -c) unlimited
data seg size         (kbytes, -d) unlimited
...
stack size            (kbytes, -s) 10240
...
virtual memory        (kbytes, -v) unlimited
Nick Dixon
A: 

A similar question was asked here and this was my response: http://stackoverflow.com/questions/1229241/how-do-i-force-a-program-to-appear-to-run-out-of-memory/1229277#1229277

On Linux the command ulimit is probably what you want.

You'll probably want to use ulimit -v to limit the amount of virtual memory available to your app.
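For example, ulimit -v 65536 caps the shell (and anything launched from it) at roughly 64 MB of address space. If you would rather set the cap from inside the program under test, the equivalent POSIX call is setrlimit(); a minimal sketch (the 64 MB figure is arbitrary):

#include <sys/resource.h>

int main()
{
    // Cap this process's address space at roughly 64 MB (the figure is arbitrary).
    rlimit rl;
    rl.rlim_cur = 64 * 1024 * 1024;
    rl.rlim_max = 64 * 1024 * 1024;
    setrlimit(RLIMIT_AS, &rl);

    // ... now run the code under test; allocations beyond the cap will fail ...
    return 0;
}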

chollida
+6  A: 

Allocating big blocks is not going to work.

  • Depending on the OS, you are not limited to the actual physical memory, and unused large chunks could potentially just be swapped out to disk.
  • Also, this makes it very hard to get a memory allocation to fail exactly when you want it to fail.

What you need to do is write your own versions of new/delete that fail on command.

Something like this:

#include <memory>
#include <iostream>
#include <cstdlib>    // for ::malloc / ::free

bool memoryAllocFail = false;

void* operator new(std::size_t size)
{
    std::cout << "New Called\n";
    if (memoryAllocFail)
    {   throw std::bad_alloc();
    }

    return ::malloc(size);
}

void operator delete(void* block)
{
    ::free(block);
}

int main()
{
    std::auto_ptr<int>  data1(new int(5));

    memoryAllocFail = true;
    try
    {
        std::auto_ptr<int>  data2(new int(5));
    }
    catch(std::exception const& e)
    {
        std::cout << "Exception: " << e.what() << "\n";
    }
}
> g++ mem.cpp
> ./a.exe
New Called
New Called
Exception: St9bad_alloc
Martin York
It is possible that simply linking a replacement version of malloc() that returns NULL on demand would give better coverage (or perhaps do both, in case the built-in new does not use malloc()), so that any code using malloc() fails at the same time, and so that new(nothrow) fails as well.
Clifford
new/delete are not required to use malloc/free, so you will need to check your implementation's documentation to see how it works under the hood. See http://stackoverflow.com/questions/240212/what-is-the-difference-between-new-delete-and-malloc-free/240308#240308 for more details.
Martin York
Linking in a library with a special version of malloc/free can be problematic, but it is not impossible. Technically, though, the standard does not support such behaviour (although commercial tools do exactly that, and they have dev teams working on the problem).
Martin York
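For what it is worth, on GNU toolchains one way to wire in such a replacement is the linker's --wrap option. This is only a rough sketch: it assumes GNU ld, and it only intercepts malloc() calls made from the objects being linked, not calls made inside libc or the C++ runtime.

// Build with:  g++ mem.cpp -Wl,--wrap,malloc
#include <cstddef>

extern "C" void* __real_malloc(std::size_t size);   // resolved by the linker to the real malloc

bool mallocFail = false;    // flip this to simulate out-of-memory

extern "C" void* __wrap_malloc(std::size_t size)
{
    if (mallocFail)
        return 0;           // malloc() reports failure by returning NULL
    return __real_malloc(size);
}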
Is there a way to tell Visual Studio, Windows, GCC or Linux to restrict the executable's memory environment? For example, we have an emulator for an embedded system which has limited memory, say 640MB. We want to have the emulator use the same memory restrictions (because debugging is easier with the emulator).
Thomas Matthews
Yes, Thomas. See the duplicate links I gave in this question's comments. Or see the answers here about AppVerifier and ulimit.
Rob Kennedy