core-video

Getting PIX_FMT_YUYV422 out of libswscale

Hi, I'm trying to learn to use the different FFmpeg libraries with Cocoa, and I'm trying to get frames to display with the help of Core Video. It seems I have gotten the CV callbacks working, and they deliver frames which I try to put into a CVImageBufferRef that I later draw with Core Image. The problem is I'm trying to get PIX_FMT_YUYV422 to work wit...
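A conversion like this is usually done with an `SwsContext`. The sketch below (using the modern `AV_PIX_FMT_YUYV422` name for the old `PIX_FMT_YUYV422` constant) is one plausible shape, assuming a decoded source `AVFrame`; the function name and reuse of a cached scaler context are illustrative, not from the question. Packed YUYV is the layout Core Video calls `kCVPixelFormatType_422YpCbCr8_yuvs`.

```objc
#include <libswscale/swscale.h>
#include <libavutil/frame.h>

// Convert a decoded frame to packed YUYV422. *sws may start out NULL;
// sws_getCachedContext reuses it across frames of the same geometry.
static AVFrame *convert_to_yuyv422(struct SwsContext **sws,
                                   const AVFrame *src,
                                   int width, int height)
{
    *sws = sws_getCachedContext(*sws,
                                width, height, src->format,
                                width, height, AV_PIX_FMT_YUYV422,
                                SWS_BILINEAR, NULL, NULL, NULL);
    if (!*sws)
        return NULL;

    AVFrame *dst = av_frame_alloc();
    dst->format = AV_PIX_FMT_YUYV422;
    dst->width  = width;
    dst->height = height;
    if (av_frame_get_buffer(dst, 0) < 0) {
        av_frame_free(&dst);
        return NULL;
    }

    // sws_scale walks the source slice and writes packed Y0 U Y1 V bytes.
    sws_scale(*sws, (const uint8_t *const *)src->data, src->linesize,
              0, height, dst->data, dst->linesize);
    return dst;
}
```

The resulting buffer can then be copied into a CVPixelBuffer created with the matching `kCVPixelFormatType_422YpCbCr8_yuvs` format.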

Hardware acceleration / performance and linkage of different Mac OS X graphics APIs, frameworks and layers

The more I read about the different types of views/contexts/rendering backends, the more confused I get. According to http://en.wikipedia.org/wiki/Quartz_%28graphics_layer%29, Mac OS X offers Quartz (Extreme) as a rendering backend, which itself is part of Core Graphics. In the Apple docs, and in some books too, they say that in any case somehow y...

Movie time from QTVisualContext given CVTimeStamp in CAOpenGLLayer rendering method?

I'm using the standard CoreVideo Display Link + QTVisualContext to render a QuickTime movie into an NSOpenGLView subclass. I would now like to synchronize a timeline view with movie playback. The timeline view is implemented as a layer hosting view, hosting a CAOpenGLLayer subclass that renders the timeline. I chose this architecture bec...

Use Core Image in 3D

Hello, I have a working Core Video setup (a frame captured from a USB camera via QTKit), and the current frame is rendered as a texture on an arbitrary plane in 3D space in a subclassed NSOpenGLView. So far so good, but I would like to use some Core Image filters on this frame. I now have the basic code set up, and it renders my unprocessed v...

I need to upload a video file to a web server (iPhone SDK)

I need to upload a video file to a web server using the iPhone SDK. It is just a QuickTime movie. ...

How do I convert a CGImage to CMSampleBufferRef?

Hello! I'd like to convert a CGImage to a CMSampleBufferRef and append it to an AVAssetWriterInput using the appendSampleBuffer: method. I've managed to get the CMSampleBufferRef using the following code, but appendSampleBuffer: simply returns NO when I supply the resulting CMSampleBufferRef. What am I doing wrong? - (void) appendCGIma...
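One common cause of appendSampleBuffer: returning NO is a sample buffer with no valid presentation timestamp (or a writer whose session hasn't been started). A hedged sketch of one way to build such a buffer: draw the CGImage into a CVPixelBuffer, then wrap it with explicit timing. The function name and BGRA format choice are assumptions, not taken from the question's code.

```objc
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Wrap a CGImage in a CMSampleBuffer carrying presentation time `pts`.
// Caller releases the returned buffer.
static CMSampleBufferRef SampleBufferFromCGImage(CGImageRef image, CMTime pts)
{
    size_t width  = CGImageGetWidth(image);
    size_t height = CGImageGetHeight(image);

    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);

    // Render the CGImage directly into the pixel buffer's backing store.
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(
        CVPixelBufferGetBaseAddress(pixelBuffer), width, height, 8,
        CVPixelBufferGetBytesPerRow(pixelBuffer), cs,
        kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), image);
    CGContextRelease(ctx);
    CGColorSpaceRelease(cs);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    // A format description and timing info are required; without a valid
    // presentationTimeStamp the writer input will reject the buffer.
    CMVideoFormatDescriptionRef desc = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault,
                                                 pixelBuffer, &desc);
    CMSampleTimingInfo timing = { .duration = kCMTimeInvalid,
                                  .presentationTimeStamp = pts,
                                  .decodeTimeStamp = kCMTimeInvalid };

    CMSampleBufferRef sampleBuffer = NULL;
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer,
                                       true, NULL, NULL, desc, &timing,
                                       &sampleBuffer);
    CFRelease(desc);
    CVPixelBufferRelease(pixelBuffer);
    return sampleBuffer;
}
```

Also verify that startWriting and startSessionAtSourceTime: were called on the AVAssetWriter before the first append.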

iPhone: Real-time video color info, focal length, aperture?

Is there any way, using AVFoundation and Core Video, to get color info, aperture, and focal length values in real time? Let me explain. Say when I am shooting video, I want to sample the color in a small portion of the screen and output that as RGB values to the screen. Also, I would like to show what the current aperture is set at. Does ...

How can I convert a UIImage or CGImage to a CVPixelBuffer?

I know how to do it the other way around, but I cannot figure out how to get a Core Video pixel buffer from an image. Bonus points for a code sample showing the use of pixel buffer pools with an AVAssetWriterInputPixelBufferAdaptor. I have not found any samples or any real use case in the docs. ...
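A sketch of the pool-based route, under stated assumptions: `writerInput`, `width`, `height`, `image`, and `pts` are placeholders you'd already have, and the adaptor's `pixelBufferPool` only becomes non-nil after the writer has started its session. The idea is to lease buffers from the adaptor's pool rather than allocating one per frame, then draw the CGImage into the leased buffer.

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

// Create the adaptor once; its attributes define the pool's buffer format.
NSDictionary *attrs = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
    (id)kCVPixelBufferWidthKey  : @(width),
    (id)kCVPixelBufferHeightKey : @(height)
};
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
        sourcePixelBufferAttributes:attrs];

// Per frame (after startWriting/startSessionAtSourceTime:):
CVPixelBufferRef buffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                   adaptor.pixelBufferPool, &buffer);

// Draw the UIImage's CGImage into the pooled buffer's memory.
CVPixelBufferLockBaseAddress(buffer, 0);
CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(
    CVPixelBufferGetBaseAddress(buffer), width, height, 8,
    CVPixelBufferGetBytesPerRow(buffer), cs,
    kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), image.CGImage);
CGContextRelease(ctx);
CGColorSpaceRelease(cs);
CVPixelBufferUnlockBaseAddress(buffer, 0);

[adaptor appendPixelBuffer:buffer withPresentationTime:pts];
CVPixelBufferRelease(buffer);
```

Reusing the adaptor's pool keeps allocations steady during long encodes, which is the main reason to prefer it over calling CVPixelBufferCreate per frame.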