My OpenGL application, which was working fine on an ATI card, stopped working when I put in an NVIDIA Quadro card. Textures simply don't work at all! I've reduced my program to a single display function which doesn't work:

void glutDispCallback()
{
    //ALLOCATE TEXTURE
    unsigned char * noise = new unsigned char [32 * 32 * 3];
    memset(noise, 255, 32 * 32 * 3);

    glEnable(GL_TEXTURE_2D);
    GLuint textureID;
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
    glGenTextures(1, &textureID);
    glBindTexture(GL_TEXTURE_2D, textureID);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 32, 32, 0, GL_RGB, GL_UNSIGNED_BYTE, noise);
    delete [] noise;

    //DRAW
    glDrawBuffer(GL_BACK);
    glViewport(0, 0, 1024, 1024);
    setOrthographicProjection();
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glDisable(GL_BLEND);
    glDisable(GL_LIGHTING);
    glBindTexture(GL_TEXTURE_2D, textureID);
    glColor4f(0, 0, 1, 0);
    glBegin(GL_QUADS);
        glTexCoord2f(0, 0);
        glVertex2f(-0.4, -0.4);

        glTexCoord2f(0, 1);
        glVertex2f(-0.4, 0.4);

        glTexCoord2f(1, 1);
        glVertex2f(0.4, 0.4);

        glTexCoord2f(1, 0);
        glVertex2f(0.4, -0.4);
    glEnd();
    glutSwapBuffers();

    //CLEANUP
    GL_ERROR();
    glDeleteTextures(1, &textureID);
}

The result is a blue quad (or whatever color is specified by glColor4f()), not the white quad the texture should produce. I have followed the FAQ on the OpenGL site. I have disabled blending in case the texture was being blended out. I have disabled lighting. I have checked glGetError() - no errors. I've also tried glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE); as well as GL_DECAL - same result. I've also tried different polygon winding - CW and CCW.

Anyone else encounter this?
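For reference, GL_ERROR() is just my error-checking helper; its exact definition isn't shown, but a minimal sketch of what it does (assuming it simply drains glGetError() and prints anything it finds) would be:

#include <cstdio>

// Hypothetical sketch of the GL_ERROR() helper used above: it drains
// glGetError() and reports any errors. The real macro isn't shown in this post.
#define GL_ERROR()                                                      \
    do {                                                                \
        for (GLenum err = glGetError(); err != GL_NO_ERROR;             \
             err = glGetError())                                        \
            std::fprintf(stderr, "GL error 0x%04X at %s:%d\n",          \
                         err, __FILE__, __LINE__);                      \
    } while (0)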

+1  A: 

Can you try using GL_REPLACE in glTexEnvi? It could be a bug in the NV driver.

Matias Valdenegro
I have tried that. Just modified my post. I've also tried GL_MODULATE.
Budric
+1  A: 

Your code is correct and does what it should.

memset(noise, 255, 32*32*3); makes the texture white, but you call glColor4f(0,0,1,0); so the final color will be (1,1,1)*(0,0,1) = (0,0,1) = blue.

What is the behavior you would like to have?
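If the modulation is what's biting you, either of the following keeps the texture color intact (just a sketch, assuming the texture environment is the default GL_MODULATE):

// Sketch: two ways to stop the vertex color from tinting the texture,
// assuming the default GL_MODULATE texture environment.

// Option 1: keep GL_MODULATE but draw with a white vertex color,
// so each texel is multiplied by (1,1,1,1) and stays unchanged.
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);

// Option 2: switch to GL_REPLACE so the texel replaces the vertex color entirely.
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);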

Calvin1602
Given that GL_TEXTURE_ENV_MODE is set to GL_REPLACE and GL_BLEND is disabled, I expect the blue color to be ignored and the texture color to be used in its place. Also, I don't see why your color multiplication is correct. Under what mode does OpenGL do that? It's not any blending I know of.
Budric
Yes, that's the problem: with GL_REPLACE the fragment color should be the texture color directly.
Matias Valdenegro
+1  A: 

I found the error. Somewhere else in my code I had initialized a GL_TEXTURE_3D object and had not called glDisable(GL_TEXTURE_3D);

I had expected glBindTexture(GL_TEXTURE_2D, textureID); to bind the 2D texture as the current texture and use it - this code always worked on ATI cards. Apparently the NVIDIA driver wasn't doing that - it was using the 3D texture instead. Adding glDisable(GL_TEXTURE_3D); fixed the problem and everything works as expected.
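In other words, the fix boils down to something like this (a sketch of what worked for me; only GL_TEXTURE_3D needed disabling in my case):

// Sketch of the fix: make sure the higher-priority 3D target isn't left
// enabled before drawing with the 2D texture.
glDisable(GL_TEXTURE_3D);   // this was the actual culprit
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, textureID);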

Thanks all who tried to help.

Budric
For what it's worth, the NVIDIA driver does the correct thing here. If you have multiple targets enabled (here 3D and 2D), the rule is that 3D takes precedence over 2D: Cube > 3D > 2D > 1D. See http://stackoverflow.com/questions/2784937/what-is-the-correct-behavior-when-both-a-1d-and-a-2d-texture-are-bound-in-opengl/2786948#2786948 for the full details.
Bahbar
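A quick way to catch this kind of leaked state is to query which fixed-function texture targets are still enabled right before the draw call. The helper below is a diagnostic sketch, not from any of the posts above; dumpEnabledTextureTargets is a hypothetical name:

#include <cstdio>

// Diagnostic sketch (not from the original post): report which fixed-function
// texture targets are enabled. When several are enabled at once, cube map
// takes precedence over 3D, 3D over 2D, and 2D over 1D.
static void dumpEnabledTextureTargets()
{
    if (glIsEnabled(GL_TEXTURE_CUBE_MAP)) std::fprintf(stderr, "cube map target enabled\n");
    if (glIsEnabled(GL_TEXTURE_3D))       std::fprintf(stderr, "3D target enabled\n");
    if (glIsEnabled(GL_TEXTURE_2D))       std::fprintf(stderr, "2D target enabled\n");
    if (glIsEnabled(GL_TEXTURE_1D))       std::fprintf(stderr, "1D target enabled\n");
}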