I was learning how to use textures with GLSL (in LWJGL) to create a simple fragment shader that blurs a texture. The first attempt (for testing purposes) was a very simple shader that just passes the original color values through:
uniform sampler2D rectTexture;

void main() {
    vec4 color = texture2D(rectTexture, gl_FragCoord.st);
    gl_FragColor = color;
}
The shader compiles fine. After compilation I link the program and start using it; up to this point everything works and no errors are reported. Then I try to set the uniform variable for the texture:
uniformTextureAddr = ARBShaderObjects.glGetUniformLocationARB(programObject, "rectTexture");
ARBMultitexture.glActiveTextureARB(ARBMultitexture.GL_TEXTURE0_ARB);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, texture);
ARBShaderObjects.glUniform1iARB(uniformTextureAddr, 0);
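For reference, the compile/link/use step mentioned above looks roughly like this (a sketch, not my code verbatim; fragmentSource holds the shader source shown earlier and error checking is omitted):

// Sketch of the program setup: compile the fragment shader, link it into
// a program object and start using it.
int fragShader = ARBShaderObjects.glCreateShaderObjectARB(
        ARBFragmentShader.GL_FRAGMENT_SHADER_ARB);
ARBShaderObjects.glShaderSourceARB(fragShader, fragmentSource);
ARBShaderObjects.glCompileShaderARB(fragShader);

int programObject = ARBShaderObjects.glCreateProgramObjectARB();
ARBShaderObjects.glAttachObjectARB(programObject, fragShader);
ARBShaderObjects.glLinkProgramARB(programObject);
ARBShaderObjects.glUseProgramObjectARB(programObject);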
Then I just draw a quad with normal texture coordinates (0.0f to 1.0f in both dimensions), but the texture doesn't show up. The texture itself is not the problem; I draw a second quad next to the first one without my fragment shader and it shows up as you would expect. The basic approach is taken from NeHe: GLSL - An Introduction. uniformTextureAddr is not -1, and if I use an even simpler shader that just turns every pixel red, I get a red quad, as expected. So the bug has to be somewhere in the sampler2D handling. This also rules out trivial mistakes such as the quad simply being out of frame.
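For reference, that all-red test shader was essentially just:

void main() {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}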
And yes, after drawing without the shader I call glUseProgramObjectARB(programObject) to switch back to my shader again.
By the way, this is running on Windows XP SP3 with an ATI Radeon (Catalyst 10.6 driver) and LWJGL version 2.4.2.
UPDATE: I think something might be wrong with the program object itself. When I add another uniform to the shader:
uniform sampler2D secondTexture;
uniform sampler2D rectTexture;

void main() {
    vec4 color = texture2D(rectTexture, gl_FragCoord.st);
    gl_FragColor = color;
}
and then call glGetUniformLocationARB(programObject, "secondTexture"), it just returns -1, even though the uniform should be there. The info log still only says:
Info log: Fragment shader was successfully compiled to run on hardware.
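For context, that info log is fetched with a small helper along these lines (a sketch; the helper name is mine, and it uses java.nio buffers plus org.lwjgl.BufferUtils):

// Sketch: read and print the info log of a shader or program object.
private static void printLogInfo(int obj) {
    IntBuffer length = BufferUtils.createIntBuffer(1);
    ARBShaderObjects.glGetObjectParameterARB(obj,
            ARBShaderObjects.GL_OBJECT_INFO_LOG_LENGTH_ARB, length);
    int logLength = length.get(0);
    if (logLength > 1) {
        ByteBuffer log = BufferUtils.createByteBuffer(logLength);
        ARBShaderObjects.glGetInfoLogARB(obj, length, log);
        byte[] bytes = new byte[logLength];
        log.get(bytes);
        System.out.println("Info log: " + new String(bytes));
    }
}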
UPDATE 2:
The actual texture is copied from the backbuffer. A simple white line is drawn in a smaller viewport and then copied to a texture:
GL11.glViewport(0, 0, 256, 256);
GL11.glDisable(GL11.GL_TEXTURE_2D);
GL11.glColor3f(1.0f, 1.0f, 1.0f);
GL11.glBegin(GL11.GL_LINES);
GL11.glVertex3f(0.0f, 0.0f, 0.0f);
GL11.glVertex3f(256.0f, 256.0f, 0.0f);
GL11.glEnd();
GL11.glEnable(GL11.GL_TEXTURE_2D);
GL11.glCopyTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_LUMINANCE, 0, 0, 256, 256, 0);
GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
GL11.glViewport(0, 0, WINDOW_WIDTH, WINDOW_HEIGHT);
But like I said, I don't think the texture is the actual problem, since it shows up fine on the second quad without my custom shader. Also, don't worry about binding; this is the only texture in the program and is only bound once at the very beginning.
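For completeness, the one-time texture setup at startup looks roughly like this (a sketch; the exact filter settings are an assumption on my part):

// Sketch of the one-time texture creation; the texture is bound here and
// never rebound afterwards.
texture = GL11.glGenTextures();
GL11.glBindTexture(GL11.GL_TEXTURE_2D, texture);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR);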
Here is my drawing code:
// First quad (left, x = 0 to 256): drawn with the custom fragment shader
ARBShaderObjects.glUseProgramObjectARB(programObject);
GL11.glBegin(GL11.GL_QUADS);
GL11.glTexCoord2f(0.0f, 1.0f);
GL11.glVertex3f(0.0f, 0.0f, 0.0f);
GL11.glTexCoord2f(1.0f, 1.0f);
GL11.glVertex3f(256.0f, 0.0f, 0.0f);
GL11.glTexCoord2f(1.0f, 0.0f);
GL11.glVertex3f(256.0f, 256.0f, 0.0f);
GL11.glTexCoord2f(0.0f, 0.0f);
GL11.glVertex3f(0.0f, 256.0f, 0.0f);
GL11.glEnd();
ARBShaderObjects.glUseProgramObjectARB(0);

// Second quad (right, x = 256 to 512): drawn with the fixed-function pipeline for comparison
GL11.glBegin(GL11.GL_QUADS);
GL11.glTexCoord2f(0.0f, 1.0f);
GL11.glVertex3f(256.0f, 0.0f, 0.0f);
GL11.glTexCoord2f(1.0f, 1.0f);
GL11.glVertex3f(512.0f, 0.0f, 0.0f);
GL11.glTexCoord2f(1.0f, 0.0f);
GL11.glVertex3f(512.0f, 256.0f, 0.0f);
GL11.glTexCoord2f(0.0f, 0.0f);
GL11.glVertex3f(256.0f, 256.0f, 0.0f);
GL11.glEnd();