A: 

Are there any overlapping primitives in your scene?

You are aware that you're calling the 3-parameter version of glColor, which sets the alpha to 1.0, right?

It would help if you could post a screenshot, or otherwise describe what happens when, say, you draw two primitives with identical colors and differing alphas. In fact, any code demonstrating the problem would help.
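For instance, a minimal fragment showing the difference between the two glColor variants (assumes an active GL context with blending already enabled):

```cpp
// Assumes: glEnable(GL_BLEND); glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

glColor3f(1.0f, 0.0f, 0.0f);        // alpha implicitly 1.0 -> fully opaque
// ... draw first primitive ...

glColor4f(1.0f, 0.0f, 0.0f, 0.5f);  // explicit alpha -> blended at 50%
// ... draw second primitive ...
```

If both primitives render identically, the alpha value is being lost somewhere upstream (e.g. in the texture).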

Edit:

I'd imagine that using TexImage with GL_RGB (for internalformat, the 3rd parameter) creates a 3-component texture with no alpha (or with alpha implicitly fixed at 1), no matter what kind of pixel data you supply.

GL_BGR is not a valid value for this parameter; perhaps it is tricking your implementation into using a full 4-byte internal format? (Or a 2-byte one, as with GL_LUMINANCE_ALPHA.) Or do you mean passing GL_BGR to your Texture::Load() function? That should not really behave differently from passing GL_RGB.

aib
Please see additional comments, thanks :)
Mark
If it were tricking it into using a 4-byte internal format, there wouldn't be a problem. I think the problem was that GL_RGB, as I originally had it, was only giving me a 3-byte internal format when I really needed 4. I meant that GL_BGR worked in my Load function, even when I used GL_RGB in glTexImage2D. I think what matters is that the correct number of bytes is passed to glTexImage2D. It would probably be less ambiguous if I just passed '4'.
Mark
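For what it's worth, in legacy (compatibility-profile) OpenGL the internalformat argument may indeed be a bare component count from 1 to 4, so passing '4' is valid there. A fragment, assuming an active GL context and the tw/th/data variables from the Load function above:

```cpp
// Legacy-GL fragment: internalformat given as a component count (4 = RGBA).
// Equivalent in effect to GL_RGBA; only accepted in compatibility profiles.
glTexImage2D(GL_TEXTURE_2D, 0, 4, tw, th, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
```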
A: 

I think this should work, but it assumes the image has an alpha channel; if you try to load an image without one, you will get an exception or your application might crash. For images without an alpha channel, use GL_RGB instead of GL_RGBA for the format parameter (the second GL_RGBA in the call, right before GL_UNSIGNED_BYTE). Hope this helps,

void Texture::Load(unsigned char* data)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, _texID);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tw, th, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
    glDisable(GL_TEXTURE_2D);
}

-Alessandro

A: 

For the benefit of others, I'm posting my solution here.

The problem was that although my Load function was correct,

glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, _w, _h, GL_RGBA, GL_UNSIGNED_BYTE, data);

I was passing GL_RGB to this function

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, tw, th, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);

That call also needs to specify the correct number of components (four, i.e. GL_RGBA). From my understanding, you can't use a different number of components for a SubImage, although I think you can use a different format with the same component count (i.e. mixing GL_RGB and GL_BGR is okay, but not GL_RGB and GL_RGBA).

Mark