views: 84
answers: 2

I'm learning to use normal maps (per pixel lighting?) in 2D graphics with OpenGL.

New to normal mapping, I managed to wrap my head around the Sobel operator and the generation of normal maps (mostly thanks to this), that is, creating a (2D) array of normals from a (2D) array of pixel data.

(Most of the tutorials and forum threads that I have found were specific to 3D uses and modelling software. I aim to implement this functionality myself, in C++.)

  • What do I do once I've got the normal map?
  • Do I need to register it with OpenGL?
  • Does it need to be associated with the texture? If yes, how is it done?
  • How is it mapped to a 2D textured quad?
  • (Is this something that I can do without shaders / GLSL?)
+3  A: 
  • What do I do once I've got the normal map?
  • Do I need to register it with OpenGL?

Yes, you need to load it as a texture.
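A minimal sketch of that step, assuming the normal map has already been converted to an RGB byte array (the function name and parameters are illustrative, not from the question):

    #include <GL/gl.h>

    // Hypothetical helper: uploads an RGB normal map (each texel is a normal
    // with its x/y/z components remapped from [-1, 1] to [0, 255]) as a texture.
    GLuint createNormalMapTexture(int width, int height, const unsigned char* rgbPixels)
    {
        GLuint tex = 0;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, rgbPixels);
        return tex;
    }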

  • Does it need to be associated with the texture? If yes, how is it done?

If you mean associated with the color texture, then no. You just create a separate texture that holds the normal map so you can use it later with OpenGL.

  • How is it mapped to a 2D textured quad?

Your normal map is just another texture: you bind it and map it like any other texture.
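For illustration, here is roughly what that binding could look like with the shader path described below (the uniform names, and the use of a loader such as GLEW for the GL 2.0 entry points, are assumptions, not part of the question):

    #include <GL/glew.h>   // or any loader that exposes the GL 2.0 entry points

    // Illustrative: color texture on unit 0, normal map on unit 1, and the
    // (hypothetical) sampler uniforms pointed at those units.
    void bindTexturesForNormalMapping(GLuint program, GLuint colorTex, GLuint normalMapTex)
    {
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, colorTex);
        glActiveTexture(GL_TEXTURE1);
        glBindTexture(GL_TEXTURE_2D, normalMapTex);

        glUseProgram(program);
        glUniform1i(glGetUniformLocation(program, "u_colorMap"),  0);
        glUniform1i(glGetUniformLocation(program, "u_normalMap"), 1);
    }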

The normal map stores normals in tangent-space coordinates, so to compute the lighting per pixel you need the position of the light source in the tangent-space coordinate system. This is done by supplying additional per-vertex attributes (normal, tangent, binormal), computing the light-source position in tangent-space coordinates, and interpolating that position across the triangles. In the fragment shader you look up the normal in the normal map and perform the desired lighting calculation based on the interpolated parameters.
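To make the per-vertex step concrete, here is a small sketch of expressing the light direction in tangent space; it uses GLM for the vector math purely as an assumption about your setup, and all names are made up:

    #include <glm/glm.hpp>

    // Illustrative: direction from the vertex to the light, expressed in the
    // vertex's tangent space, so the fragment stage can dot it against the
    // normal fetched from the normal map.
    glm::vec3 lightDirInTangentSpace(const glm::vec3& vertexPos,
                                     const glm::vec3& lightPos,
                                     const glm::vec3& normal,
                                     const glm::vec3& tangent)
    {
        glm::vec3 n = glm::normalize(normal);
        glm::vec3 t = glm::normalize(tangent);
        glm::vec3 b = glm::cross(n, t);             // binormal / bitangent

        glm::vec3 l = glm::normalize(lightPos - vertexPos);

        // Projecting onto t, b and n is the same as multiplying by the
        // transpose of the TBN matrix (valid since t, b, n are orthonormal).
        return glm::vec3(glm::dot(l, t), glm::dot(l, b), glm::dot(l, n));
    }

In the fragment shader you would then expand the normal fetched from the normal map from [0, 1] back to [-1, 1] and compute something like max(dot(N, L), 0.0) for the diffuse term.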

  • (Is this something that I can do without shaders / GLSL?)

Yes, you can use some legacy extensions to program the multi-texture environment combination functions. I've never done it myself, but it looks like hell.
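For completeness, a rough, untested sketch of what that fixed-function setup (ARB_texture_env_combine / ARB_texture_env_dot3, core since OpenGL 1.3) tends to look like; the function and texture names are invented, and it assumes the per-vertex color has been set to the tangent-space light direction, range-compressed to [0, 1] (0.5 * L + 0.5) just like the normal map itself:

    #include <GL/glew.h>   // or any header/loader exposing the GL 1.3 entry points

    // Unit 0 dots the normal-map texel with the light vector stored in the
    // vertex color; unit 1 modulates the result with the color texture.
    void setupDot3Combiners(GLuint normalMapTex, GLuint colorTex)
    {
        glActiveTexture(GL_TEXTURE0);
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, normalMapTex);
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
        glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB,  GL_DOT3_RGB);
        glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB,  GL_TEXTURE);
        glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);
        glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB,  GL_PRIMARY_COLOR);
        glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR);

        glActiveTexture(GL_TEXTURE1);
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, colorTex);
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    }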

ybungalobill
Great, multitexturing is what I had in mind (and yes, I'll go with shaders =) ). Thanks!
iCE-9
+2  A: 

I recommend you look at:

This NVIDIA presentation on bump mapping

I haven't looked at this for a while, but I remember it covering most of the details of implementing a bump-map shader; it should get a few ideas running.

This other NVIDIA tutorial for implementing bump mapping in the Cg shader language

This bump mapping tutorial might also be helpful.

I know all of these are not about full normal mapping, but they're a good start.

Also, while there are differences between shader languages, it shouldn't be too hard to convert the formulas between them if you want to use GLSL.

As ybungalobill said, you can do it without shaders, but unless you are working on an educational project (for your own education) or a particular embedded device, I have no idea why the hell you would want to. If you do need to, though, this is where you want to look; it was written before shaders and later updated to reference them.

Tomas Cokis
Thanks, the NVIDIA presentation and the 3DKingdoms ones are great. I've seen the paulsprojects one, but I was lost when it started on cube maps.
iCE-9