Q:
I'm fairly new to OpenGL, and I seem to be experiencing some difficulties. I've written a simple GLSL shader that is supposed to transform vertices by given joint matrices, allowing simple skeletal animation. Each vertex has a maximum of two bone influences, stored as the x and y components of a vec2: indices and corresponding weights that refer into an array of transformation matrices. The indices and weights are declared as attribute variables in my shader and set using glVertexAttribPointer.

Here's where the problem arises: I've managed to set the uniform array of matrices properly; when I check those values in the shader, they are all imported correctly and contain the right data. However, when I attempt to set the joint-index attribute, the vertices are multiplied by arbitrary transformation matrices! They jump to seemingly random positions in space (different every time), so I'm assuming the indices are set incorrectly and my shader is reading past the end of the joint matrix array into the memory that follows. I'm not sure why, because every example I could find on the subject uses the same (or very similar) code, and it seemed to work there.

I have been trying to solve this for quite some time now and it's really beginning to get on my nerves. I know the matrices are correct: when I hard-code the index in the shader to an arbitrary integer, it reads the correct matrix and transforms all the vertices by it, the way it should. But when I use the code I wrote to set the attribute variables, it does not work.

The code I am using to set the variables is as follows...

// this works properly... (GL_TRUE transposes the row-major matrices on upload)
GLint boneMatLoc = glGetUniformLocation([[[obj material] shader] programID], "boneMatrices");
glUniformMatrix4fv( boneMatLoc, matCount, GL_TRUE, currentBoneMatrices );

// test data: two bone indices (one vec2) per vertex
GLfloat testBoneIndices[8] = {1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0};

// this however, does not...
GLint boneIndexLoc = glGetAttribLocation([[[obj material] shader] programID], "boneIndices");
glEnableVertexAttribArray( boneIndexLoc );
glVertexAttribPointer( boneIndexLoc, 2, GL_FLOAT, GL_FALSE, 0, testBoneIndices );

And my vertex shader looks like this...

// this shader is supposed to transform the vertices by a skeleton, with a
// maximum of two bones per vertex and varying weights...

uniform mat4 boneMatrices[32];  // matrices for the bones
attribute vec2 boneIndices;  // x for the first bone, y for the second

//attribute vec2 boneWeight;  // the blend weights between the two bones

void main(void)
{
 gl_TexCoord[0] = gl_MultiTexCoord0; // just set up the texture coordinates...

 // transform by the first bone's matrix (second bone commented out while debugging)
 vec4 vertexPos1 = 1.0 * boneMatrices[ int(boneIndices.x) ] * gl_Vertex;
 //vec4 vertexPos2 = 0.5 * boneMatrices[ int(boneIndices.y) ] * gl_Vertex;

 gl_Position = gl_ModelViewProjectionMatrix * (vertexPos1);
}
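
For reference, this is the full two-bone blend I'm working toward once the attributes are set correctly. It's only a sketch: it assumes the boneWeight attribute (currently commented out above) is uploaded the same way as the indices, with the two weights summing to 1.0 per vertex.

uniform mat4 boneMatrices[32];
attribute vec2 boneIndices;  // x for the first bone, y for the second
attribute vec2 boneWeight;   // assumed: the blend weights between the two bones

void main(void)
{
 gl_TexCoord[0] = gl_MultiTexCoord0;

 // weight each bone's transform and sum the results
 vec4 pos1 = boneWeight.x * (boneMatrices[ int(boneIndices.x) ] * gl_Vertex);
 vec4 pos2 = boneWeight.y * (boneMatrices[ int(boneIndices.y) ] * gl_Vertex);

 gl_Position = gl_ModelViewProjectionMatrix * (pos1 + pos2);
}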

This is really beginning to frustrate me, and any and all help will be appreciated,

-Andrew Gotow

A: 

OK, I've figured it out. With glDrawArrays, OpenGL reads every 9 position values as one triangle (3 vertices with 3 components each). Because of this, vertices are repeated between triangles: if two adjacent triangles share a vertex, it appears twice in the array. So my cube, which I originally thought had 8 vertices, actually has 36!

Six sides, two triangles per side, three vertices per triangle: it all multiplies out to 36 independent vertices instead of 8 shared ones.

The entire problem was that I was specifying too few values: the attribute array needs an entry for every vertex drawn, not every unique vertex. As soon as I extended my test array to cover all 36 vertices, it worked perfectly.
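
For example, here's a minimal sketch of the fixed setup (the array and count names are illustrative, not my actual code; boneIndexLoc is the attribute location fetched earlier):

// a cube drawn with glDrawArrays(GL_TRIANGLES, ...) needs one attribute
// entry per *drawn* vertex: 6 faces * 2 triangles * 3 vertices = 36
#define CUBE_VERTEX_COUNT 36

GLfloat cubeBoneIndices[CUBE_VERTEX_COUNT * 2]; // one vec2 (two bone indices) per vertex

for (int i = 0; i < CUBE_VERTEX_COUNT; i++) {
 cubeBoneIndices[i * 2 + 0] = 1.0f; // first bone index
 cubeBoneIndices[i * 2 + 1] = 1.0f; // second bone index
}

glEnableVertexAttribArray( boneIndexLoc );
glVertexAttribPointer( boneIndexLoc, 2, GL_FLOAT, GL_FALSE, 0, cubeBoneIndices );
glDrawArrays( GL_TRIANGLES, 0, CUBE_VERTEX_COUNT );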

Andrew Gotow