Hi, I'm trying to use Cocoa to grab images from a webcam. I'm able to get frames in RGBA format using QTKit and the didOutputVideoFrame delegate callback, converting the CVImageBuffer to a CIImage and then to an NSBitmapImageRep.
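For context, here's roughly what my working RGBA path looks like (simplified from my actual code; the delegate method belongs to a QTCaptureDecompressedVideoOutput):

```objc
#import <QTKit/QTKit.h>
#import <QuartzCore/QuartzCore.h>

// QTCaptureDecompressedVideoOutput delegate method -- this is where
// I currently receive each frame and convert it for display.
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:videoFrame];
    NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCIImage:ciImage];
    // ...hand rep off to the view for display (RGBA at this point)...
    [rep release];
}
```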
I know my camera captures natively in YUV; what I want is to get the YUV data directly from the CVImageBuffer and process each YUV frame before displaying it.
My question is: How can I get the YUV data from the CVImageBuffer?
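In case it helps, here's my untested guess at the approach: ask the capture output for a packed Y'CbCr pixel format, then read the raw bytes with the CVPixelBuffer APIs. The format constant and the `decompressedVideoOutput` variable (my QTCaptureDecompressedVideoOutput instance) are assumptions on my part:

```objc
#import <QTKit/QTKit.h>
#import <CoreVideo/CoreVideo.h>

// 1. When setting up the session, request a packed YUV format.
//    kCVPixelFormatType_422YpCbCr8 is '2vuy'; my camera's actual
//    native format may be something else -- just a guess here.
NSDictionary *attrs = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_422YpCbCr8]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[decompressedVideoOutput setPixelBufferAttributes:attrs];

// 2. In didOutputVideoFrame:, treat the CVImageBuffer as a
//    CVPixelBuffer and lock it to get at the raw bytes.
CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)videoFrame;
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
size_t bytesPerRow   = CVPixelBufferGetBytesPerRow(pixelBuffer);
size_t height        = CVPixelBufferGetHeight(pixelBuffer);
// ...process bytesPerRow * height bytes of Y'CbCr data here...
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
```

Is setting pixelBufferAttributes the right way to keep QTKit from handing me converted RGBA frames, or does that conversion happen regardless?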
Thanks.