This is a question about some OpenGL ES spookiness on iPhone.
I've noticed that the color of pixels on the device and the color of pixels in the simulator vary slightly. For example, a green pixel might be (0, 241, 0) in the simulator and (0, 239, 0) on the device. Normally this wouldn't be a big issue (to the naked eye they look exactly the same), but I'm using pixel data to encode some information, so color matching needs to be exact.
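To show what I mean by encoding information in pixel colors, the idea is roughly like this (a simplified sketch, not my actual code; the ID packing scheme and function name are just an example):

// Illustrative only: pack a 24-bit object ID into an exact RGB triple
// and use it as a flat vertex color for the hit-detection pass.
static void setPickColorForID(GLuint objectID)
{
    GLubyte r = (GLubyte)((objectID >> 16) & 0xFF);
    GLubyte g = (GLubyte)((objectID >> 8)  & 0xFF);
    GLubyte b = (GLubyte)( objectID        & 0xFF);
    glColor4ub(r, g, b, 255);   // exact 8-bit values, no float conversion
}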
I assume I'm overlooking some limitation of the GLES implementation... My framebuffer that receives the rendering is set up using the following commands:
// Generate an offscreen framebuffer/renderbuffer for rendering the hit detection polys

// Create framebuffer
glGenFramebuffersOES(1, &hitDetectionFbo);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, hitDetectionFbo);

// Create colorbuffer
glGenRenderbuffersOES(1, &colorBuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorBuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGBA8_OES, SCREEN_WIDTH, SCREEN_HEIGHT);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, colorBuffer);

// Create depthbuffer
glGenRenderbuffersOES(1, &depthBuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, depthBuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT24_OES, SCREEN_WIDTH, SCREEN_HEIGHT);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, depthBuffer);
NSAssert(glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) == GL_FRAMEBUFFER_COMPLETE_OES, @"Framebuffer is not ready for rendering");
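The readback side works roughly like this (again a simplified sketch; it reuses hitDetectionFbo from above and assumes the same example ID packing as in the earlier sketch):

// Read one pixel back from the hit-detection FBO and reassemble the ID.
// GL_RGBA / GL_UNSIGNED_BYTE is the format/type combination that
// OpenGL ES guarantees glReadPixels will accept.
static GLuint readPickIDAtPoint(GLint x, GLint y)
{
    GLubyte pixel[4] = {0, 0, 0, 0};
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, hitDetectionFbo);
    glReadPixels(x, y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixel);
    return ((GLuint)pixel[0] << 16) | ((GLuint)pixel[1] << 8) | (GLuint)pixel[2];
}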
I'd appreciate any help or thoughts. Everything works perfectly in the simulator (colors match as expected), but I start losing precision on the device.
Thanks, -S