I have a render method like this:
[EAGLContext setCurrentContext:context];

// Bind the framebuffer and match the viewport to the backing store.
glBindFramebufferOES(GL_FRAMEBUFFER_OES, defaultFramebuffer);
glViewport(0, 0, backingWidth, backingHeight);

// 1:1 pixel projection (top-left origin), shifted half a pixel so
// integer vertex coordinates land on pixel centers.
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrthof(0, backingWidth, backingHeight, 0, -1.0f, 1.0f);
glTranslatef(0.5f, 0.5f, 0.0f);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

// Clear to green, then draw the grid as one point per vertex.
glClearColor(0.0f, 1.0f, 0.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);

glVertexPointer(2, GL_FLOAT, 0, Vertices);
glEnableClientState(GL_VERTEX_ARRAY);
glColorPointer(4, GL_UNSIGNED_BYTE, 0, Colors);
glEnableClientState(GL_COLOR_ARRAY);
glDrawArrays(GL_POINTS, 0, [grid resolution]);

// Present the color renderbuffer.
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];
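For context, defaultFramebuffer, colorRenderbuffer, backingWidth and backingHeight come from the usual CAEAGLLayer-backed view setup (roughly the standard template, where the backing dimensions are read back from the renderbuffer after it is sized to the layer):

- (BOOL)resizeFromLayer:(CAEAGLLayer *)layer
{
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:layer];
    // Read back the actual pixel dimensions the renderbuffer was given.
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);
    return glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) == GL_FRAMEBUFFER_COMPLETE_OES;
}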
This works fine, plotting the points on screen, whenever the resolution is even in both dimensions. But if, say, my X resolution is 233, pixels that should come out pure white or pure black are instead rendered as intermediate greys, which isn't what I want. It looks as though the available points are being stretched/resampled to fit an even resolution, and I have no idea why.
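In case it matters, the vertex grid is built along these lines (a simplified sketch, not my exact code; gridWidth and gridHeight are placeholder names): one vertex per pixel at integer coordinates, so the half-pixel translate above should center each GL_POINT on a pixel.

// Simplified sketch; gridWidth/gridHeight are placeholders.
int gridWidth  = 233;   // e.g. the odd X resolution
int gridHeight = 100;
GLfloat *verts = malloc((size_t)gridWidth * gridHeight * 2 * sizeof(GLfloat));
int i = 0;
for (int y = 0; y < gridHeight; y++) {
    for (int x = 0; x < gridWidth; x++) {
        verts[i++] = (GLfloat)x;   // pixel column
        verts[i++] = (GLfloat)y;   // pixel row
    }
}
// drawn with glDrawArrays(GL_POINTS, 0, gridWidth * gridHeight)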