This issue originally came up in a related question of mine where I was having trouble reading some bit of code. The answer turned out to be that this line

&((GLushort *)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]

evaluates to be a pointer. And it's used in

glDrawElements(GL_TRIANGLES, i32Tris * 3, GL_UNSIGNED_SHORT, &((unsigned short*)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]);

where it is interpreted as an offset to draw a subset of vertex indices.

My code currently requires me to do by hand some of what OpenGL does inside glDrawElements, and I can't figure out how to use the pointer as an offset. glDrawElements uses an array of indices (named vertexIndices in my code), so I tried something like this:

vertexIndices[&((GLushort *)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]]

but that obviously failed.

EDIT 1: I just tried this and it compiles... still not sure if it's correct, though:

vertexIndices + (uint) &((GLushort *)0)[3 * mesh.sBoneBatches.pnBatchOffset[batchNum]]

+4  A: 
Heath Hunnicutt
worked like a charm, thank you
spencewah
You're welcome. Happy drawing! :)
Heath Hunnicutt
+2  A: 

What's being passed to glDrawElements is a pointer, not an offset as such. Your mysterious line of code is equivalent to:

GLushort *indices = (GLushort *)0 + 3 * mesh.sBoneBatches.pnBatchOffset[batchNum];
glDrawElements(GL_TRIANGLES, i32Tris * 3, GL_UNSIGNED_SHORT, indices);

In other words, glDrawElements() doesn't take its fourth parameter as an offset directly; the parameter is a pointer. This is clear from the OpenGL documentation, which gives the following signature for glDrawElements:

  void glDrawElements( GLenum mode,
         GLsizei count,
         GLenum type,
         const GLvoid *indices );

And the explanation:

indices - Specifies a *pointer* to the location where the indices are stored.

(Emphasis mine.)

Aside: The offset calculation is a little weird, but valid for the reasons outlined in the answer to your other question. It's clearer and more stylistically normal to write that as:

size_t offset = sizeof(GLushort) * 3 * mesh.sBoneBatches.pnBatchOffset[batchNum];
JSBangs
Ah, sorry, I should have mentioned that I was using Vertex Buffer Objects. In that particular case (because the lookup happens server-side), OpenGL uses pointers AS offsets, instead of dereferencing the actual pointer as it would for client-side arrays (as with glDrawArrays)
spencewah
From opengl.org: RESOLVED: When the default buffer object (object zero) is bound, all pointers behave as real pointers. When any other object is bound, all pointers are treated as offsets. Conceptually, one can imagine that buffer object zero is a buffer object sitting at base NULL and with an extent large enough that it covers all of the system's virtual address space. Note that this approach essentially requires that binding points be client (not server) state.
spencewah
A: 

Calculations based on an offset of zero can work with some compilers, but are not guaranteed to. ANSI C provides the offsetof macro in stddef.h to let you calculate the offsets of struct members portably.

Tim