I'm using Qt's QGLFramebufferObject for off-screen rendering.
After rendering to the buffer I read the result back with glReadPixels().
The problem is that the background color I read back is sometimes 0 (transparent black) and sometimes 0xFF000000 (opaque black).
This seems to be related to when the buffer is initialized: if the framebuffer object is a member of my QGLWidget-derived class, I read 0; if it is created anywhere else, I read 0xFF000000.

Does anybody have any idea what's going on here? Which of these is the correct value to expect from this read?

+1  A: 

Do you set the correct clear color (glClearColor) and actually perform a clear (glClear)? Is your color write mask (glColorMask) fully enabled? It affects clears as well. Next, check whether Qt sets up an unusual pixel transfer mode (anything other than the defaults; see glPixelStore, glPixelTransfer and glPixelMap). And are you sure you are getting (and reading into) an RGBA buffer in both cases, not just RGB?

It is also quite possible that the framebuffer object extension is buggy on your particular graphics card/driver combination, even more so if there is no visible primary window; check cards from both vendors if you can. And of course, always check glGetError after every operation that can fail.

After re-reading your question: do you get any other valid rendering in the non-QGLWidget case at all? Does a clear to, say, green read back as green? It may very well be that you never initialized a valid OpenGL context in that case. Framebuffer objects, unlike pbuffers, need an external context.

starmole
A: 

Mystery solved.
It seems that the clear color I set on the QGLWidget is 0, while the default clear color is 0xff000000. Depending on when I initialize the framebuffer object, it picks up whichever clear color is current on the QGLWidget at that time.

shoosh