I'm grabbing frame images from the iPhone's camera at a rate of 25 fps, at a resolution of 192 x 144, in 420v/BGRA format.

I'm converting the CVImageBufferRefs into UIImages and then calling UIImageJPEGRepresentation(image, compressionQuality) to get a compressed JPEG version of each image.
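For reference, here's roughly what my conversion path looks like (a simplified sketch: the helper name is mine, it assumes a BGRA buffer, and error handling is omitted):

```
#import <UIKit/UIKit.h>
#import <CoreVideo/CoreVideo.h>

static NSData *JPEGDataFromPixelBuffer(CVImageBufferRef imageBuffer, CGFloat quality)
{
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    void  *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width       = CVPixelBufferGetWidth(imageBuffer);
    size_t height      = CVPixelBufferGetHeight(imageBuffer);

    // The Core Graphics step: wrap the raw bytes in a bitmap context and
    // copy them out as a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    // This call is where the profiler shows the time going.
    NSData *jpegData = UIImageJPEGRepresentation(image, quality);

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    return jpegData;
}
```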

Using the Time Profiler in Instruments, I can see that 75% of my CPU time is spent getting the JPEG representation of the image, which slows down the other operations I need to perform in the app.

The time fluctuates a little: it drops if I set the compression quality to 1.0 (i.e., least compression) and rises if I set it to 0.0 (i.e., maximum compression).

Is there a more efficient way to get a JPEG representation of an image from the iPhone's camera?

Can I get a JPEG representation without converting the CVImageBufferRef to a UIImage (thereby cutting out a rather expensive Core Graphics drawing operation)?
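For what it's worth, the direction I've been considering is going through ImageIO's CGImageDestination with a CGImage built directly over the buffer's bytes. This is an untested sketch (the helper name is mine, it assumes a BGRA buffer and ARC, and error handling is omitted), and I don't know whether it actually avoids the expensive drawing work:

```
#import <CoreVideo/CoreVideo.h>
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h> // kUTTypeJPEG

static NSData *JPEGDataViaImageIO(CVImageBufferRef imageBuffer, CGFloat quality)
{
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    size_t width       = CVPixelBufferGetWidth(imageBuffer);
    size_t height      = CVPixelBufferGetHeight(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    void  *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // No bitmap context and no UIImage: the CGImage reads the pixels in place.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, baseAddress,
                                                              bytesPerRow * height, NULL);
    CGImageRef cgImage = CGImageCreate(width, height, 8, 32, bytesPerRow, colorSpace,
                                       kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst,
                                       provider, NULL, false, kCGRenderingIntentDefault);

    NSMutableData *jpegData = [NSMutableData data];
    CGImageDestinationRef destination =
        CGImageDestinationCreateWithData((__bridge CFMutableDataRef)jpegData, kUTTypeJPEG, 1, NULL);
    NSDictionary *options =
        @{ (__bridge id)kCGImageDestinationLossyCompressionQuality : @(quality) };
    CGImageDestinationAddImage(destination, cgImage, (__bridge CFDictionaryRef)options);
    CGImageDestinationFinalize(destination); // forces the encode before we unlock

    CFRelease(destination);
    CGImageRelease(cgImage);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    return jpegData;
}
```

Is something along those lines the right direction, or does the JPEG encoder itself dominate regardless of how the CGImage is produced?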