views: 74
answers: 2
I've written an iPhone application that takes input from a user and updates a bitmap context based on that input. After the update, it refreshes the display of a UIView; I do this by implementing drawRect in a UIView subclass.

Currently I am using Quartz 2D. Since the updates occur multiple times per second, I have noticed that while the app performs well in the simulator, it is much slower (practically unusable) on an actual device. I would like to rewrite drawRect to use OpenGL instead, as I have heard this is much faster. If you have other ideas, I am all ears.

I have browsed around the web for samples/tutorials on how to simply display a PNG using OpenGL, but most are overkill for what I need. I am looking for a short and sweet sample for quickly taking a PNG file or UIImage and rendering it within a UIView using OpenGL (is that even achievable? does it conceptually make sense?) - or a pointer to documentation on speeding up image rendering with the existing Quartz APIs.

Before you decide to reply with an RTFM response - please don't waste my time or yours. I'm not looking for a shortcut, and I don't mind doing the homework; I'm just looking for simple guidance that fits my needs, which has been hard to come by. I am sure someone who is well versed in OpenGL could explain what needs to be done and crank out this code in 5 minutes. If that's you, it would make my day :).

Any help, guidance, etc. is always appreciated. Thanks.

A: 

A faster way to display an updated bitmap may be to create an image from the bitmap context and assign it directly to the contents of a UIView's CALayer (using a subview sized and/or transformed to your desired destination rect if required). No drawRect or OpenGL frame render needed.
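In code, that might look roughly like the following (a minimal sketch; ctx stands for the CGBitmapContextRef you draw into and view for the UIView being updated - both are placeholder names, and QuartzCore must be imported to access the layer directly):

#import <QuartzCore/QuartzCore.h>

CGImageRef image = CGBitmapContextCreateImage(ctx);  // snapshot the bitmap
view.layer.contents = (id)image;                     // the layer retains it; no drawRect needed
CGImageRelease(image);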

hotpaw2
Thanks. What would this look like, specifically? I've tried working with CALayer before, without success.
sanderb
+1  A: 

Create a bitmap context with memory you have allocated yourself

void* data = malloc(width*height*4);   // 4 bytes per pixel: RGBA, 8 bits per component

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(data, width, height, 8, width*4,
                                         colorSpace, kCGImageAlphaPremultipliedLast);   // RGBA8888, matching the GL upload below
CGColorSpaceRelease(colorSpace);

and render whatever you want into that bitmap context - for example, the questioner's PNG or UIImage could be drawn straight into it, as sketched below.
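A one-line sketch (assumes img stands in for a UIImage loaded elsewhere, e.g. with imageNamed:):

CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), img.CGImage);

Next, create a texture from the data: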

glGenTextures(1, &texref);    // generate a texture name
glBindTexture(GL_TEXTURE_2D, texref);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);   // the default filter expects mipmaps; without this the texture won't render
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);

Now just follow any tutorial on rendering this texture with GL, using Apple's example code for rendering GL into a UIView (see the sketch below). Do be careful to use RGBA byte order in the bitmap context (or use Apple's BGRA texture-format extension in the glTexImage2D call - GL ES has no ARGB format). Also, remember that OpenGL ES 1.x only accepts textures whose width and height are powers of two.
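To give a concrete idea, drawing that texture as a full-screen quad with OpenGL ES 1.x might look something like this sketch (assumes an EAGL-backed view with its framebuffer already set up and the texture above still bound; the vertex and texture-coordinate arrays are illustrative):

static const GLfloat vertices[]  = { -1, -1,   1, -1,   -1, 1,   1, 1 };
static const GLfloat texCoords[] = {  0,  1,   1,  1,    0, 0,   1, 0 };  // flipped vertically; adjust if your bitmap isn't upside down

glEnable(GL_TEXTURE_2D);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, vertices);
glTexCoordPointer(2, GL_FLOAT, 0, texCoords);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);   // two triangles covering the whole view

After the draw call, present the renderbuffer as in Apple's EAGLView template code.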

With the iPhone's shared memory architecture, the pixel data isn't being copied to a separate graphics card each frame, so this shouldn't incur much of a performance penalty.

Hosiers
Even though the system may use a shared memory architecture, benchmarks show glTexImage2D to be far slower than any memory-to-memory copy, possibly due to a hidden conversion from the bitmap to a swizzled or tiled texture format.
hotpaw2
Thanks! Is texref a void* in the sample above? I don't see a declaration. Also, I had read that you can work around the 2^n x 2^n requirement by replacing the first parameter, GL_TEXTURE_2D, with GL_TEXTURE_RECTANGLE_ARB - is this true? (Stack Overflow post: http://stackoverflow.com/questions/2713890/opengl-es-rendering-texture-created-from-cgbitmapcontext)
sanderb
@user407680: texref is a GLuint - it is a GL texture name, not a pointer to data. @hotpaw2: I'm pretty sure the iOS GL ES implementation only allows read-only access when you map to graphics memory.
Hosiers