This issue originally came up in a related question of mine where I was having trouble reading some bit of code. The answer turned out to be that this line
&((GLushort *)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]
evaluates to a pointer. It's used in
glDrawElements(GL_TRIANGLES, i32Tris * 3, GL_UNSIGNED_SHORT, &((unsigned short*)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]);
where it is interpreted as an offset to draw a subset of vertex indices.
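To convince myself of what that expression actually computes, I wrote the little standalone sketch below. It's only my understanding of the pointer arithmetic (the GL typedef is replaced with a plain unsigned short and the offset value is made up), not anything from the real code base:

    #include <cstdio>
    #include <cstddef>

    typedef unsigned short GLushort;   // stand-in for the real GL typedef

    int main()
    {
        const unsigned k = 3 * 7;      // made-up stand-in for 3 * mesh.sBoneBatches.pnBatchOffset[batchNum]

        // The idiom from above: index a null GLushort pointer and take the
        // address. No memory is dereferenced; the "pointer" just carries a number.
        const GLushort *p = &((GLushort *)0)[k];

        // On the implementations I've tried, that number is k * sizeof(GLushort)
        // bytes, which as far as I understand is how glDrawElements treats it
        // when an element array buffer is bound.
        std::printf("%zu == %zu\n", (std::size_t)p, k * sizeof(GLushort));
    }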
My code currently requires me to do by hand some of what OpenGL is doing in glDrawElements, and I can't figure out how to use the pointer as an offset. glDrawElements uses an array of indices (named vertexIndices in my code), so I tried something like this:
vertexIndices[&((GLushort *)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]]
but that obviously failed, since the subscript is a pointer rather than an integer.
EDIT 1:
I just tried this and it compiles... still not sure whether it's correct, though:

vertexIndices + (uint) &((GLushort *)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]
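To make the comparison concrete, this is how I currently picture the two candidate offsets (again just a sketch with made-up numbers and a local stand-in for vertexIndices):

    #include <cstdio>
    #include <cstddef>

    typedef unsigned short GLushort;   // stand-in for the real GL typedef

    int main()
    {
        GLushort vertexIndices[64] = {};     // placeholder for my real index array
        const unsigned batchOffset = 7;      // made-up pnBatchOffset[batchNum]
        const unsigned k = 3 * batchOffset;  // offset counted in indices

        // The cast from the line above turns the fake pointer into a byte count...
        const std::size_t bytes = (std::size_t)&((GLushort *)0)[k];

        // ...so these two expressions end up pointing at different elements,
        // which is exactly the part I'm unsure about:
        const GLushort *a = vertexIndices + k;      // advance k elements
        const GLushort *b = vertexIndices + bytes;  // advance k * sizeof(GLushort) elements

        std::printf("%td vs %td\n", a - vertexIndices, b - vertexIndices);
    }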