Hi!

The short question is: what could be causing realloc to fail in this situation?

The overall memory usage of my program (as shown by Task Manager) stays almost the same for the whole time it runs (about 40 minutes), and there are still about 2 GB of free memory.

Running on Windows Server 2003 R2.

Memory allocation/freeing activity is fairly heavy: I need to interact with other software, preparing data for it and deleting that data when it becomes outdated. The number of data blocks is not constant.

Thank you!

+2  A: 

Typically there are only two reasons realloc will fail:

  1. Not enough contiguous memory to satisfy the request
  2. Memory corruption

Even though there is enough overall memory available to your program to satisfy the request, there may not be a contiguous block large enough due to fragmentation. The best way to determine this is to use a tool that reports on free contiguous blocks, so you can see whether one is available to satisfy your request. I believe one of the tools in the Sysinternals suite does this.
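
Whatever the cause turns out to be, it helps to handle a failing realloc without losing the original block; a minimal sketch in C (the function and variable names are illustrative, not taken from the question's code):

    #include <stdlib.h>

    /* Grow a buffer without losing it on failure.  If realloc cannot find
     * a large enough contiguous block it returns NULL and leaves the old
     * allocation untouched, so assign to a temporary first. */
    static int grow_buffer(void **buf, size_t new_size)
    {
        void *tmp = realloc(*buf, new_size);
        if (tmp == NULL)
            return -1;   /* *buf is still valid; the caller decides what to do */
        *buf = tmp;
        return 0;
    }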

JaredPar
Thank you for your answer! I thought about memory fragmentation, but wasn't sure it was possible. Are there any methods to avoid it? And another question: can you show me a scenario where memory corruption comes into play?
zxcat
Easy: faulty hardware. The RAM itself, an overheating CPU, faulty motherboard circuits that handle transfers between RAM and the CPU, or even the power supply can malfunction and shut down parts of your computer for very short periods of time. I ran into the last one in particular.
Blindy
I don't think this is about hardware memory corruption :) but about writing to memory the program doesn't own. Or not?
zxcat
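
As a concrete example of the software-side corruption discussed in the comments above: an off-by-one write past the end of an allocation can trash the heap's bookkeeping, so a later realloc or free fails or crashes somewhere seemingly unrelated. A contrived, deliberately buggy sketch:

    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *p = malloc(16);
        if (p == NULL)
            return 1;

        /* Bug: 16 characters plus the terminating '\0' is 17 bytes, one
         * byte past the end of the block.  That byte can overwrite the
         * allocator's metadata, so the realloc/free below may fail or
         * crash even though they look correct. */
        strcpy(p, "0123456789abcdef");

        p = realloc(p, 32);
        free(p);
        return 0;
    }
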
+1  A: 

With no code to look at, all I can give you is a workaround.

Try reallocating memory only when you need it to grow, and double its size instead of adding just the few extra bytes you need; this helps tremendously with fragmentation. Since you said you have enough memory, don't worry about freeing it when you're done; just keep it around if its size is reasonable.

Make it your goal to reduce fragmentation at any cost; keeping a 200 MB working set seems perfectly fine to me for today's computing power. If you often go past 500 MB and your program runs for long periods of time, you can start looking into optimizing the working set further, but until then don't worry about it.
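
A minimal sketch of that doubling strategy in C (the struct and function names are made up for illustration):

    #include <stdlib.h>
    #include <string.h>

    struct buf {
        char  *data;   /* starts as NULL */
        size_t used;   /* bytes in use */
        size_t cap;    /* bytes allocated */
    };

    /* Append len bytes, doubling the capacity when it runs out instead of
     * calling realloc for every small increase. */
    static int buf_append(struct buf *b, const void *src, size_t len)
    {
        if (b->used + len > b->cap) {
            size_t cap = b->cap ? b->cap : 64;
            while (cap < b->used + len)
                cap *= 2;
            char *tmp = realloc(b->data, cap);   /* realloc(NULL, n) acts like malloc(n) */
            if (tmp == NULL)
                return -1;                       /* old buffer is still intact */
            b->data = tmp;
            b->cap  = cap;
        }
        memcpy(b->data + b->used, src, len);
        b->used += len;
        return 0;
    }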

Blindy
I'll try it. Thank you for the answer!
zxcat