views:

28

answers:

1

I'll try to keep this simple.

I want a way to access the normal information of the scene from the framebuffer output (or similar), the same way one can access the depth buffer using glGetTexImage with GL_DEPTH_COMPONENT.
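For reference, the depth readback mentioned above looks roughly like this (a sketch, assuming a current GL context and a depth texture named depthTex that has already been rendered to):

```c
/* Read a depth texture back into client memory.
 * depthTex and the width/height values are assumptions for
 * illustration; they are not defined in the question. */
GLfloat depthData[width * height];
glBindTexture(GL_TEXTURE_2D, depthTex);
glGetTexImage(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, GL_FLOAT, depthData);
```

The question is whether an equivalent readback path exists for normals.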

I know I could set up a fragment shader that outputs the normal information in RGB color space, which could in turn be read back from the rendered image. I'm wondering, however, whether there is a way to do this within the OpenGL API itself.
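The shader approach described above could look something like this minimal sketch (the varying name vNormal is an assumption; it would be set from the vertex shader):

```glsl
// Encode the interpolated normal as a color.
varying vec3 vNormal;

void main()
{
    // Remap components from [-1, 1] into the [0, 1] color range
    // so they survive storage in an unsigned color buffer.
    vec3 n = normalize(vNormal) * 0.5 + 0.5;
    gl_FragColor = vec4(n, 1.0);
}
```

Reading the rendered image back then yields the encoded normals, which can be decoded with n * 2.0 - 1.0.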

I'll clarify anything upon request as best I can. Thank you.

+1  A: 

You already know the solution: Render the normal as RGB. There's no built-in normal buffer you could use. If you don't want to render your scene twice, use framebuffer objects (FBO) with multiple render targets (MRT). Then you can write both color and normal into separate textures in your fragment shader.
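A rough sketch of the FBO/MRT setup described here, assuming a current GL context (GL 3.0 or ARB_framebuffer_object) and two textures colorTex and normalTex that have already been created, both names being illustrative:

```c
/* Attach a color texture and a normal texture to one FBO. */
GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1,
                       GL_TEXTURE_2D, normalTex, 0);

/* Route fragment shader outputs to both attachments. */
GLenum bufs[2] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1 };
glDrawBuffers(2, bufs);
```

In the fragment shader, gl_FragData[0] then receives the color and gl_FragData[1] the encoded normal, so the scene only needs to be rendered once.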

Malte Clasen
Thanks. I was hoping for some kind of OpenGL extension that would output the normal as color from the fragment shader. However, after learning some GLSL, this wasn't hard to do myself, so thanks for the clarification. I don't think I'll need FBOs/MRT, as this is for simulation/research purposes and I don't need the normals every frame. Still, it's nice to know where to improve the speed if needed.
okamiueru