Hi, I'm trying to use Cocoa to grab images from a webcam. I'm able to get the image in RGBA format using QTKit and the didOutputVideoFrame delegate callback, converting the CVImageBuffer to a CIImage and then to an NSBitmapImageRep.
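Roughly what I'm doing now (a minimal sketch; the signature is the QTCaptureDecompressedVideoOutput delegate method, and `videoFrame` is the CVImageBufferRef it hands me):

    // QTCaptureDecompressedVideoOutput delegate callback.
    - (void)captureOutput:(QTCaptureOutput *)captureOutput
      didOutputVideoFrame:(CVImageBufferRef)videoFrame
         withSampleBuffer:(QTSampleBuffer *)sampleBuffer
           fromConnection:(QTCaptureConnection *)connection
    {
        // Wrap the frame in a CIImage, then convert to an RGBA bitmap rep.
        CIImage *ciImage = [CIImage imageWithCVImageBuffer:videoFrame];
        NSBitmapImageRep *rep =
            [[[NSBitmapImageRep alloc] initWithCIImage:ciImage] autorelease];
        // ... display `rep` ...
    }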

I know my camera grabs natively in YUV; what I want is to get the YUV data directly from the CVImageBuffer and process the YUV frame before displaying it.

My question is: How can I get the YUV data from the CVImageBuffer?

Thanks.

+1  A: 

You might be able to create a CIImage from the buffer using +[CIImage imageWithCVImageBuffer:] and then render that CIImage into a CGBitmapContext with the desired pixel format.

Note that I have not tested this solution.
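Untested, but something along these lines (assuming `videoFrame` is the CVImageBufferRef from your capture callback; one caveat is that CGBitmapContext only supports RGB-family layouts, so this controls the channel order and alpha of the output rather than producing YUV directly):

    // Render the CIImage into a CGBitmapContext whose layout you control.
    CIImage *image = [CIImage imageWithCVImageBuffer:videoFrame];
    CGRect extent  = [image extent];
    size_t width   = (size_t)extent.size.width;
    size_t height  = (size_t)extent.size.height;

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef bitmap = CGBitmapContextCreate(NULL, width, height,
                                                8,          // bits per component
                                                width * 4,  // bytes per row
                                                colorSpace,
                                                kCGImageAlphaPremultipliedLast);

    CIContext *ciContext = [CIContext contextWithCGContext:bitmap options:nil];
    [ciContext drawImage:image
                  inRect:CGRectMake(0, 0, width, height)
                fromRect:extent];

    // Raw pixel data in the context's format, ready for processing.
    void *pixels = CGBitmapContextGetData(bitmap);
    // ... process `pixels` ...

    CGContextRelease(bitmap);
    CGColorSpaceRelease(colorSpace);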

Barry Wark
I'll try this out...
jslap