Is it possible to disable texture colors and use only white as the color? The texture would still be read, so I can't use glDisable(GL_TEXTURE_2D), because I still want to render the alpha channel.

All I can think of now is to make a new texture where all the color data is white, leaving the alpha as it is.

I need to do this without shaders, so is this even possible?

Edit: to clarify: I want to use both versions of the texture: all white, and normal colors.

Edit: APPARENTLY THIS IS NOT POSSIBLE

+1  A: 

What about changing all the texture colors (except alpha) to white after they are loaded and before they are used by OpenGL? If you have them as bitmaps in memory at some point, this should be easy, and you won't need separate texture files.
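That whitening pass is easy to sketch, assuming the bitmaps are tightly packed RGBA8 in memory (the function name and layout here are assumptions, not part of the answer):

```c
#include <stddef.h>
#include <stdint.h>

/* Force every texel to white while preserving its alpha.
   Assumes tightly packed RGBA8 data, one byte per channel. */
void whiten_rgba(uint8_t *pixels, size_t texel_count)
{
    for (size_t i = 0; i < texel_count; i++) {
        pixels[i * 4 + 0] = 255;  /* R */
        pixels[i * 4 + 1] = 255;  /* G */
        pixels[i * 4 + 2] = 255;  /* B */
        /* pixels[i * 4 + 3] (alpha) is left untouched */
    }
}
```

Run this on a copy of the pixel data before uploading it with glTexImage2D, and you get the white version of the texture with the original alpha.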

Lukas Pokorny
But I need to use the original textures too. I want an option to display them white.
Newbie
@Newbie, make a copy of the texture. It will take twice the memory, which I know you are trying to avoid, but at least you only need to load one copy from disk, and it's DRY.
Matthew Crumley
+2  A: 

I'm still not entirely sure I understand correctly what you want, but I'll give it a shot. What I had in mind is that when you call glTexImage2D, you specify the format of the texels you're loading and you specify the "internal format" of the texture you're creating from those texels. In a typical case, you specify (roughly) the same format for both -- e.g. you'll typically use GL_RGBA for both.

There is a reason for specifying both, though: the format (the seventh parameter, near the end of the list) describes the texels you're loading from, but the internal format (the third parameter, near the beginning of the list) specifies the format of the actual texture you create from those texels.

If you want to load some texels, but only actually use the alpha channel from them, you can specify GL_ALPHA for the internal format, and that's all you'll get -- when you map that texture to a surface, it'll affect only the alpha, not the color. This not only avoids making an extra copy of your texels, but (at least usually) reduces the memory consumed by the texture itself as well, since it only includes an alpha channel, not the three color channels.
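Assuming the texels arrive as RGBA bytes, that call might look like this (the variable names are placeholders from a hypothetical loader):

```c
/* Upload RGBA texels, but keep only the alpha channel on the GPU. */
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_ALPHA,             /* internal format: alpha only  */
             width, height, 0,
             GL_RGBA,              /* format of the supplied texels */
             GL_UNSIGNED_BYTE, pixels);
```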

Edit: Okay, thinking about it a bit more, there's a way to do what (I think) you want, using only the one texture. First, you set the blend function to use just the alpha channel; then, when you want to use the color of the texture, you call glTexEnvf with GL_REPLACE, but when you only want to use the alpha channel, you call it with GL_BLEND. For example, let's create a green texture and draw it (twice) over a blue quad, once with GL_REPLACE and once with GL_BLEND. For simplicity, we'll use a solid green texture, with alpha increasing linearly from the top (0) toward the bottom:

/* Solid green texture; alpha ramps linearly down the image
   (row i gets alpha i, so 0 at the top up to 127 at the bottom). */
static GLubyte Image[128][128][4];

for (int i = 0; i < 128; i++)
    for (int j = 0; j < 128; j++) {
        Image[i][j][0] = 0;    /* red   */
        Image[i][j][1] = 255;  /* green */
        Image[i][j][2] = 0;    /* blue  */
        Image[i][j][3] = i;    /* alpha */
    }

I'll skip over most of creating and binding the texture, setting the parameters, etc., and get directly to drawing a couple of quads with the texture:
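The skipped setup might look something like this (a sketch of one plausible version; the filter choices are my assumption, not the answer's actual code):

```c
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 128, 128, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, Image);
glEnable(GL_TEXTURE_2D);
```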

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

glBegin(GL_QUADS);
    glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
    glTexCoord2f(0.0, 0.0); glVertex3f(-1.0f, 1.0f, 0.0f);
    glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
    glTexCoord2f(1.0, 0.0); glVertex3f(0.0f, 1.0f, 0.0f);
    glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
    glTexCoord2f(1.0, 1.0); glVertex3f(0.0f, -1.0f, 0.0f);
    glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
    glTexCoord2f(0.0, 1.0); glVertex3f(-1.0f, -1.0f, 0.0f);
glEnd();

glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_BLEND);

glBegin(GL_QUADS);
    glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
    glTexCoord2f(0.0, 0.0); glVertex3f(0.0f, 1.0f, 0.0f);
    glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
    glTexCoord2f(1.0, 0.0); glVertex3f(1.0f, 1.0f, 0.0f);
    glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
    glTexCoord2f(1.0, 1.0); glVertex3f(1.0f, -1.0f, 0.0f);
    glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
    glTexCoord2f(0.0, 1.0); glVertex3f(0.0f, -1.0f, 0.0f);
glEnd();

This produces:

[Image: Blended Textures — the left quad drawn with GL_REPLACE, the right with GL_BLEND]

So, on the left, where it's drawn with GL_REPLACE, we get the green of the texture; on the right, where it's drawn with GL_BLEND (and glBlendFunc was set to use only the alpha channel), we get the blue quad with its alpha taken from the texture — and we use exactly the same texture for both.

Edit 2: If you decide you really do need a texture that's all white, I'd just create a 1-pixel texture and set GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T to GL_REPEAT. This is still an extra texture, but a truly tiny one, since it's only one pixel: both the time to load it and the memory it consumes are minuscule. I haven't tested it, but you might be better off with something like an 8x8 or 16x16 block of pixels instead. That's still small enough to hardly matter, but those are the block sizes used in JPEG and MPEG respectively, and I can see where the card and (especially) the driver might be optimized for them. It might help, and won't hurt (enough to care about, anyway).
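That suggestion might be sketched like so (the texture name and filter settings are illustrative):

```c
/* A 1x1 all-white, fully opaque texture, repeated across
   whatever geometry it's mapped onto. */
static const GLubyte white[4] = { 255, 255, 255, 255 };
GLuint whiteTex;
glGenTextures(1, &whiteTex);
glBindTexture(GL_TEXTURE_2D, whiteTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1, 1, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, white);
```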

Jerry Coffin
I don't understand how I'm supposed to read only the alpha channel at run time — code, please. I'm using glBindTexture(GL_TEXTURE_2D, tex); to bind the texture; what else do I need to read only alpha? I'm using glTexImage2D(target, 0, format, width, height, 0, format, GL_UNSIGNED_BYTE, pixels); to upload the texture to my GPU. That can't be used at run time — I can't call it a second time or it would create another texture.
Newbie
How do I revert GL_TEXTURE_ENV_MODE back to what it was before my changes? Also, there is one problem: I must make this work both with and without GL_BLEND, so that I can see the white color both without alpha and with it.
Newbie
To get back to GL_REPLACE, you'd call `glTexEnvf` with `GL_REPLACE` again. I don't think I understand the rest of what you want, so I can't really answer. If you just want a white block, without using the colors or the alpha from the texture, why use a texture at all? That's just a quad with the color set to white...
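One way to restore the previous mode without hard-coding GL_REPLACE is to query it first (a sketch):

```c
/* Save the current texture-env mode, switch it, then restore it. */
GLint prevMode;
glGetTexEnviv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, &prevMode);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_BLEND);
/* ... draw the alpha-only pass ... */
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, prevMode);
```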
Jerry Coffin
Ah, you are right — I will disable GL_TEXTURE_2D for those. I'll see if I can get the other code to work. By reverting GL_TEXTURE_ENV_MODE I meant: how do I set it back to what it was before my edits, so the rest of my scene is rendered normally?
Newbie
Doesn't seem to work — it makes my texture colors negative. o_O
Newbie
Your "Edit 2" doesn't make sense: I am using the alpha channel to create shapes, so if I remove the alpha channel, I lose all the shapes for my sprites.
Newbie
@Newbie: Perhaps you should have mentioned that in the first place...
Jerry Coffin