First, some context: I'm using OpenGL/ES exclusively for 2D drawing (e.g. glOrtho(0, width, 0, height, -1, 1);), drawing arrays of vertices with glVertexPointer()/glDrawArrays(). On OS X, vertex coordinates given as GLfloat represent exact pixel coordinates. On iOS, I have to "scale" those same GLfloats using the following function (recommended here):

/* iOS: map an integer pixel coordinate down by 1/65536
   (the GLfixed scale factor) before handing it to GL. */
GLfloat glfscale(int x)
{
    const GLfloat scale = 1.0f / 65536.0f;
    return scale * (GLfloat)x;
}

To enable drawing-code reuse, the same function is defined on OS X as:

/* OS X: pixel coordinates pass through unchanged. */
GLfloat glfscale(int x)
{
    return (GLfloat)x;
}
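
For reference, here's roughly what the shared drawing path looks like with glfscale() in place. This is a minimal sketch, not my exact code: it assumes a current GL context, and draw_quad and the specific coordinates are just illustrative.

/* OS X variant; on iOS (ES 1.x) include <OpenGLES/ES1/gl.h>
   and use glOrthof() instead of glOrtho(). */
#include <OpenGL/gl.h>

void draw_quad(int width, int height)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, width, 0, height, -1, 1);   /* pixel-aligned 2D projection */

    /* Vertex coordinates in pixel units, run through glfscale()
       (defined above) so the same drawing code compiles on both platforms. */
    GLfloat verts[] = {
        glfscale(10),  glfscale(10),
        glfscale(100), glfscale(10),
        glfscale(100), glfscale(100),
        glfscale(10),  glfscale(100),
    };

    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, verts);
    glDrawArrays(GL_TRIANGLE_FAN, 0, 4);
    glDisableClientState(GL_VERTEX_ARRAY);
}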

Is there an OpenGL flag being set (perhaps by a default NSOpenGLView method) that I can duplicate in my iOS code to reconcile the two behaviors?

And a related question: is there a way to peek inside the black box that is the OpenGL state machine to better diagnose issues like this, other than already knowing which flag you're interested in and calling glGet()?
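
For concreteness, this is the kind of glGet() poking I mean: dumping a whole block of state, such as the current projection matrix, rather than one flag at a time. A sketch, assuming a current context and the GL headers above (dump_projection is just an illustrative name):

#include <stdio.h>

void dump_projection(void)
{
    GLfloat m[16];
    glGetFloatv(GL_PROJECTION_MATRIX, m);   /* column-major, per the spec */
    for (int row = 0; row < 4; row++)
        printf("%10.4f %10.4f %10.4f %10.4f\n",
               m[row], m[row + 4], m[row + 8], m[row + 12]);
}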


Update: figured out the answer to my first question. I was using glOrthox() on iOS (which played nicely with the GLfixed vertex coordinates I was using before switching to GLfloat). Changing to glOrthof() on iOS reconciles the behavior across the two platforms.
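
For anyone hitting the same thing: glOrthox() interprets its arguments as GLfixed, i.e. 16.16 fixed-point, so plain integer pixel sizes come out 65536x too small; that's exactly what the 1/65536 factor in glfscale() was compensating for. A sketch of the corrected iOS setup (setup_projection is just an illustrative name):

#include <OpenGLES/ES1/gl.h>

void setup_projection(int width, int height)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    /* Float variant: pixel units pass straight through. */
    glOrthof(0.0f, (GLfloat)width, 0.0f, (GLfloat)height, -1.0f, 1.0f);
}

With this in place, the iOS glfscale() can collapse to the same identity version used on OS X.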