I have a very odd problem, hopefully easily solved.
I am drawing a quad as a triangle strip. The strip contains just two triangles, and I apply a texture with an alpha channel to it.
For one of the triangles the alpha looks fine, exactly as I want it. For the first triangle, however, the alpha looks as though it is either 1.0 or 0.0 rather than the correct in-between values.
What could be causing this? Am I even right that this is the problem? I'll attach an image so people can see what I mean:
The weird thing is that both triangles are drawn from the same vertex array with the same texture, so I am unsure how any setting or texture state could affect one triangle differently from the other. I thought maybe one triangle was being drawn with a different winding, but they are both anti-clockwise.
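The only winding-related test I can think of is to set the front face explicitly and turn off back-face culling while debugging, just to rule that theory out (a sketch of a temporary debug change, not my real setup):
//Temporary debug state to rule out winding/culling problems (sketch only)
gl.glFrontFace(GL10.GL_CCW);     //both triangles are specified anti-clockwise
gl.glDisable(GL10.GL_CULL_FACE); //disable back-face culling entirely while testing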
Code-wise it is drawn like so:
//Bind the texture for this quad
gl.glBindTexture(GL10.GL_TEXTURE_2D, texturePointer);
//Enable the vertex array for rendering
gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
//Specify the location and data format of the vertex coordinates
gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertexBuffer);
//Enable the texture coordinate array
gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureBuffer);
//Draw the strip using the index buffer
gl.glDrawElements(GL10.GL_TRIANGLE_STRIP, indices.length, GL10.GL_UNSIGNED_SHORT, indexBuffer);
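For context, I'm assuming something equivalent to the standard ES 1.0 alpha-blending setup is active during this draw; the snippet below is just a sketch of those standard calls, not copied from my init code:
//Standard alpha blending (sketch; assumed to be set somewhere during init)
gl.glEnable(GL10.GL_BLEND);
gl.glBlendFunc(GL10.GL_SRC_ALPHA, GL10.GL_ONE_MINUS_SRC_ALPHA);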
So I am a little stumped if I'm honest.
Edit: Looking at this a bit more, it's as though one of the triangles is being textured twice, so the alpha effectively gets applied on top of itself. This probably means I'm doing something wrong with my triangle strip. I build the strip like so (I started writing it out by hand to check):
//Top left
vertices[vertPlace++] = 0.0f;
vertices[vertPlace++] = height;
vertices[vertPlace++] = 0.0f;
//Bottom left
vertices[vertPlace++] = 0.0f;
vertices[vertPlace++] = 0.0f;
vertices[vertPlace++] = 0.0f;
//Top right
vertices[vertPlace++] = width;
vertices[vertPlace++] = height;
vertices[vertPlace++] = 0.0f;
//Bottom right
vertices[vertPlace++] = width;
vertices[vertPlace++] = 0.0f;
vertices[vertPlace++] = 0.0f;
//Now set the indices
indices[indiPlace++] = (short)0;
indices[indiPlace++] = (short)1;
indices[indiPlace++] = (short)2;
indices[indiPlace++] = (short)3;
//Top left
textureCoords[textPlace++] = 0.0f;
textureCoords[textPlace++] = 1.0f;
//Bottom left
textureCoords[textPlace++] = 0.0f;
textureCoords[textPlace++] = 0.0f;
//Top right
textureCoords[textPlace++] = 1.0f;
textureCoords[textPlace++] = 1.0f;
//Bottom right
textureCoords[textPlace++] = 1.0f;
textureCoords[textPlace++] = 0.0f;
Somehow I'm doing something wrong there; I just can't see it.
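As far as I understand it, indices 0, 1, 2, 3 should give exactly two triangles with no overlap, so the only sanity check I can think of is confirming the index array really holds just four entries when it reaches glDrawElements (a sketch using android.util.Log; the tag name is arbitrary):
//Expected decomposition of the strip, as I understand it:
//  triangle 1: indices 0, 1, 2 -> top left, bottom left, top right
//  triangle 2: indices 1, 2, 3 -> bottom left, top right, bottom right (GL flips the winding automatically)
if (indices.length != 4) {
    Log.w("Quad", "Unexpected index count for a single quad: " + indices.length);
}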