OK, so I have 4 buffers: 3 FBOs and a render buffer. Let me explain.

I have a view FBO, which will store the scene before I render it to the render buffer.

I have a background buffer, which contains the background of the scene.

I have a user buffer, which the user manipulates.

When the user performs an action, I draw to the user buffer using some blending.

Then, to redraw the whole scene, what I want to do is clear the view buffer, draw the background buffer into the view buffer, change the blending, then draw the user buffer into the view buffer. Finally, render the view buffer to the render buffer.

However, I can't figure out how to draw an FBO into another FBO. What I want to do is essentially merge and blend two FBOs, but I can't figure out how! I'm very new to OpenGL ES, so thanks for all the help.

A:

Set up your offscreen framebuffers to render directly to a texture. This link shows you how:

http://developer.apple.com/iphone/library/documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/WorkingwithEAGLContexts/WorkingwithEAGLContexts.html#//apple_ref/doc/uid/TP40008793-CH103-SW7
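For concreteness, here is a minimal sketch of that render-to-texture setup under OpenGL ES 1.1 (the OES framebuffer extension used on iPhone). The names backgroundTexture and backgroundFramebuffer and the 512x512 size are placeholders, and you'd repeat the same steps for the user layer:

    /* Sketch: create an offscreen framebuffer that renders into a texture
       (OpenGL ES 1.1 with the OES_framebuffer_object extension).
       Names and the 512x512 size are placeholders. */
    GLuint backgroundTexture, backgroundFramebuffer;

    glGenTextures(1, &backgroundTexture);
    glBindTexture(GL_TEXTURE_2D, backgroundTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    /* Allocate storage; data is NULL because we only render into it. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 512, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    glGenFramebuffersOES(1, &backgroundFramebuffer);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, backgroundFramebuffer);
    glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                              GL_TEXTURE_2D, backgroundTexture, 0);

    if (glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES) {
        /* handle the error */
    }
    /* Anything drawn while backgroundFramebuffer is bound now lands in
       backgroundTexture, which can later be drawn like any other texture. */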

Let me take a moment to describe framebuffers and renderbuffers, for my benefit and yours. A framebuffer is like a port that accepts OpenGL rendering commands. It has to be attached to a texture or a renderbuffer before you can see or use the rendering output; you can attach either a texture with glFramebufferTexture2DOES or a renderbuffer with glFramebufferRenderbufferOES.

A renderbuffer is like a raster image that holds the results of rendering, and storage for that raster image is managed by OpenGL. If you want the image to appear on the screen instead of in an offscreen buffer, you use -[EAGLContext renderbufferStorage:fromDrawable:] to back the renderbuffer with storage from the view's CAEAGLLayer. This code is in the OpenGL ES project template.
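As a rough sketch of that onscreen path (the names context, for the existing EAGLContext, and eaglLayer, for the view's CAEAGLLayer, are assumed to exist already):

    /* Sketch of the onscreen framebuffer, as in the Xcode OpenGL ES template. */
    GLuint viewFramebuffer, viewRenderbuffer;
    GLint  backingWidth, backingHeight;

    glGenFramebuffersOES(1, &viewFramebuffer);
    glGenRenderbuffersOES(1, &viewRenderbuffer);

    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);

    /* Let the context allocate the renderbuffer's storage from the layer,
       so presenting this renderbuffer puts pixels on the screen. */
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:eaglLayer];
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                                 GL_RENDERBUFFER_OES, viewRenderbuffer);

    /* Remember the backing size for glViewport later. */
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES,  &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);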

You probably don't need the view framebuffer, since after rendering the scene background and the user layer to textures, you can draw those textures into the renderbuffer (that is, into the framebuffer associated with the onscreen renderbuffer).
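To illustrate, here is a hedged sketch of what that per-frame composite could look like with ES 1.1 fixed-function drawing. It assumes backgroundTexture and userTexture were created as in the render-to-texture sketch above, and that viewFramebuffer, viewRenderbuffer, backingWidth, and backingHeight come from the onscreen setup:

    /* Sketch: composite the two layer textures into the onscreen framebuffer.
       The quad spans clip space (-1..1), so with identity matrices it fills
       the viewport. */
    static const GLfloat quadVertices[]  = { -1, -1,   1, -1,  -1,  1,   1,  1 };
    static const GLfloat quadTexCoords[] = {  0,  0,   1,  0,   0,  1,   1,  1 };

    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glViewport(0, 0, backingWidth, backingHeight);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, quadVertices);
    glTexCoordPointer(2, GL_FLOAT, 0, quadTexCoords);
    glEnable(GL_TEXTURE_2D);

    /* Background layer is opaque, so leave blending off. */
    glDisable(GL_BLEND);
    glBindTexture(GL_TEXTURE_2D, backgroundTexture);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    /* User layer gets blended over the background. */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glBindTexture(GL_TEXTURE_2D, userTexture);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    /* Hand the finished image to Core Animation. */
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];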

codewarrior
Thanks. After looking around forever, I just discovered this answer. For some reason I had just assumed that it would be very slow.
DevDevDev
What I found to be really slow is full screen alpha blending. Blending 4 layers together brought me down to 15 FPS. I had to carefully turn off blending when I didn't need it.
codewarrior
Thanks for the tip. Only small parts will change between frames; is there an easy way to clear and redraw just parts of the frame? For instance, say you have a rectangle-drawing program. On user input (touchesEnded), you compute the rectangular area where the user made changes, clear the pixels there, redraw the background in that rectangle, and then draw the user action. I could do something like that manually, but I would expect OpenGL to have this. Also, is it possible to merge two render buffers together? Sorry for asking so many questions, but most tutorials are quite poorly written...
DevDevDev
That algorithm is sometimes called "dirty rects." It's outside the scope of graphics libraries like OpenGL and usually implemented in layout or view management. You'll have to calculate which rects need to be redrawn based on changes in your application state. Once you've calculated your dirty rects, you can use glScissor to restrict the drawing area and save GPU time.
codewarrior
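In case it helps, here is a small sketch of that scissor approach. The names dirtyX, dirtyY, dirtyW, and dirtyH are placeholders for the changed region your app computes; note that glScissor takes window coordinates with the origin at the lower left:

    /* Sketch: restrict clearing and redrawing to a dirty rectangle. */
    glEnable(GL_SCISSOR_TEST);
    glScissor(dirtyX, dirtyY, dirtyW, dirtyH);

    /* Clears and draws are now clipped to the dirty rectangle, so only
       those pixels of the background and user layers get touched. */
    glClear(GL_COLOR_BUFFER_BIT);
    /* ... redraw the background texture and the user layer here ... */

    glDisable(GL_SCISSOR_TEST);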