Hello,

I am running into an "out of memory" error from OpenGL on glReadPixels() under low-memory conditions. I am writing a plug-in to a program that has a robust heap mechanism for such situations, but I have no idea whether or how OpenGL could be made to use it for application memory management. The notion that this is even possible came to my attention through this [albeit dated] thread on a similar issue under Mac OS [not X]: http://lists.apple.com/archives/Mac-opengl/2001/Sep/msg00042.html

I am using Windows XP and have seen it on multiple NVIDIA cards. I am also interested in any workarounds I might be able to relay to users (the thread mentions "increasing virtual memory").

Thanks, Sean

A: 

Without more specifics, it's hard to say. Ultimately, OpenGL is going to talk to the native OS when it needs to allocate memory. So if nothing else, you can always replace (or hook) the default CRT/heap allocator for your process and have it fetch blocks from the "more robust" heap manager in the plug-in host.

James D
+1  A: 

I'm quite sure that the out-of-memory error is not raised by glReadPixels (in fact, glReadPixels doesn't allocate memory itself).

The error is probably raised by other routines allocating buffer objects or textures. Once you detect the out-of-memory error, you should release all non-essential objects (textures, texture mipmaps, rarely used buffer objects) in order to allocate a new buffer object holding the data returned by glReadPixels.

Luca
You were completely correct! There was a bug in my code causing the error code to fall through to the error check on glReadPixels(). Thanks for breaking me out of my mental cage on this one.
spurserh
How can you be "quite sure" glReadPixels won't allocate anything? It's totally implementation-dependent.
Bahbar
@Bahbar: GL_OUT_OF_MEMORY doesn't mean that a malloc has failed! It relates to graphics memory, and the spec defines for each OpenGL call exactly which error codes it can return! I'm only "quite" sure because I can't say whether a specific OpenGL implementation is compliant with the specification; but if it is, glReadPixels cannot return GL_OUT_OF_MEMORY.
Luca
Huh? The spec has this to say on the subject: "If memory is exhausted as a side effect of the execution of a command, the error OUT_OF_MEMORY may be generated". It's very vague on purpose. I am, however, fully expecting that if a malloc fails in a properly implemented GL, it _will_ return GL_OUT_OF_MEMORY. What else could it do?
Bahbar
Yes, you may be right. On a memory allocation failure an error code is probably set, but the spec states very clearly which errors are returned from each function, so why doesn't it mention OUT_OF_MEMORY there? I don't remember any passage addressing this question.
Luca
_All_ GL entry points can return GL_OUT_OF_MEMORY. That's described in section 2.5, "GL Errors".
Bahbar
Great! In hindsight, it's quite obvious. That said, it's more likely for a texture allocation to fail than a client-memory chunk...
Luca