My hardware (a video capture card) gives me images in YV12 (YUV 4:2:0) format and I am trying to generate a video from them. I am using C++ under Windows and I would like to generate an MPEG-4 VBR video from that stream, but I don't know where to start... (I need it to be VBR because it is a security camera and there will be a lot of repeated frames.) Is there any library that does something like this?
A:
ffmpeg will do this for you. Check out this part of the documentation where they talk about encoding raw YUV 420P frames. You can use ffmpeg's built-in MPEG-4 encoder, or interface with other libraries such as Xvid and x264 through it.
The easiest way to handle this is to use the command-line ffmpeg executable and call it from your C++ program. Another option is to use libavformat and libavcodec (the ffmpeg libraries). This requires more work, but gives you more control over the process (for instance, if you need to do any processing of the video data).
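As a starting point, here is a minimal sketch of the command-line route (not tested against your setup: the resolution, frame rate, file names, and `-qscale` value are placeholder assumptions, and note that YV12 stores the V plane before the U plane, so you may need to swap the chroma planes to match ffmpeg's `yuv420p` order):

```cpp
// Sketch: dump raw frames to a file, then shell out to the ffmpeg executable.
// -qscale gives quality-based VBR (lower value = higher quality, range 1-31).
#include <cstdio>
#include <cstdlib>
#include <vector>

int main() {
    const int width = 640, height = 480;            // assumed capture size
    const int frameSize = width * height * 3 / 2;   // one YUV 4:2:0 frame

    // 1) Append each frame from the capture card to a raw file.
    //    A grey dummy frame stands in for real capture data here.
    std::vector<unsigned char> frame(frameSize, 128);
    FILE* raw = std::fopen("capture.yuv", "wb");
    for (int i = 0; i < 100; ++i)
        std::fwrite(frame.data(), 1, frame.size(), raw);
    std::fclose(raw);

    // 2) Let ffmpeg encode the dump as VBR MPEG-4.
    return std::system(
        "ffmpeg -y -f rawvideo -pix_fmt yuv420p -s 640x480 -r 25 "
        "-i capture.yuv -vcodec mpeg4 -qscale 5 out.avi");
}
```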
Jason
2010-04-27 15:23:07
I can store all the frames in a raw YUV file and then compress it with ffmpeg, but that is not exactly what I wanted. I would like to stream the YUV frames to the library without having to store them on a physical drive first. I guess that is not easy to do in real time... Thanks
zitronic
2010-04-29 07:16:10
The ffmpeg docs say you can use pipes as input using `-f yuv4mpegpipe` as an option on the command line, but I have never tried it. Another option is to use libavcodec and libavformat directly. See this example: http://cekirdek.pardus.org.tr/~ismail/ffmpeg-docs/api-example_8c-source.html, specifically the `video_encode_example()` function. Look for the comment `/*Prepare dummy image*/` and you can replace that with your own data.
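For example, a rough, untested sketch of the pipe approach on Windows, which avoids the intermediate file by feeding raw frames straight to the ffmpeg executable's stdin (frame size, rate, and quality are assumptions, and it uses `-f rawvideo` with `-i -` rather than `yuv4mpegpipe`):

```cpp
// Sketch: spawn ffmpeg with _popen and write raw yuv420p frames to its stdin.
#include <cstdio>
#include <vector>

int main() {
    const int width = 640, height = 480;
    const int frameSize = width * height * 3 / 2;   // YUV 4:2:0 frame size

    // "wb" opens the pipe in binary mode, which matters on Windows.
    FILE* pipe = _popen(
        "ffmpeg -y -f rawvideo -pix_fmt yuv420p -s 640x480 -r 25 -i - "
        "-vcodec mpeg4 -qscale 5 out.avi", "wb");
    if (!pipe)
        return 1;

    std::vector<unsigned char> frame(frameSize, 128); // dummy grey frame
    for (int i = 0; i < 100; ++i) {
        // In the real program this buffer would come from the capture card.
        if (std::fwrite(frame.data(), 1, frame.size(), pipe) != frame.size())
            break;
    }
    return _pclose(pipe);   // closing stdin lets ffmpeg finalize the file
}
```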
Jason
2010-04-29 22:32:49