views:

84

answers:

1

Does anyone know a convenient method to capture video to a file or stream from an OpenGL app on an Android device?

For example, can we capture video from a view, specifically an OpenGL view?

I just found out the following:

1) We can get frames using glReadPixels (so no video at this step, just raw frames?) (see the sketch right after this list).

2) MediaRecorder can encode video, but how can we feed it our raw frames, if that is possible at all?

3) Are there any working ports of ffmpeg (for example) or other encoding libraries? There are some tutorials on porting ffmpeg for use with the NDK. So, having raw frames and a working port of ffmpeg, we can create the video? Any issues at this step? Has anyone managed to port an encoding library successfully? Which components do I need from ffmpeg?
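Here is roughly what I had in mind for step 1, just to make it concrete. This is only a minimal sketch, assuming a GLSurfaceView.Renderer; handOffToEncoder is a made-up placeholder for whatever encoder ends up consuming the raw frames.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import javax.microedition.khronos.opengles.GL10;

public class CapturingRenderer implements android.opengl.GLSurfaceView.Renderer {
    private int width, height;
    private ByteBuffer pixelBuf;   // reused every frame to avoid reallocating

    @Override
    public void onSurfaceCreated(GL10 gl, javax.microedition.khronos.egl.EGLConfig config) { }

    @Override
    public void onSurfaceChanged(GL10 gl, int w, int h) {
        width = w;
        height = h;
        pixelBuf = ByteBuffer.allocateDirect(w * h * 4).order(ByteOrder.nativeOrder());
        gl.glViewport(0, 0, w, h);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        // ... draw the scene as usual ...

        // Read back the color buffer (RGBA, bottom-up row order, so rows need flipping later).
        pixelBuf.rewind();
        gl.glReadPixels(0, 0, width, height, GL10.GL_RGBA, GL10.GL_UNSIGNED_BYTE, pixelBuf);

        // handOffToEncoder(pixelBuf, width, height);  // hypothetical: pass the raw frame on
    }
}
```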

Preferred formats are mp4 or flv. Target devices are Android 2.1 and above. Using the NDK is not a problem.
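For reference, the only way I know to use MediaRecorder is the standard camera setup below. As far as I can tell the 2.x API has no public call to push raw frames into it, which is exactly the gap in step 2, so treat this as a sketch of what is there, not a solution (the output path and codec choices are just examples).

```java
import java.io.IOException;
import android.media.MediaRecorder;

public class RecorderHelper {
    // Standard camera-based recording; note there is no setter that accepts raw GL frames.
    public static MediaRecorder startCameraRecording(String outputPath) throws IOException {
        MediaRecorder recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);   // only camera-backed video sources exist on 2.x
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);   // fall back to H263/MPEG_4_SP if unsupported
        recorder.setOutputFile(outputPath);
        // A real app also needs recorder.setPreviewDisplay(surface) before prepare() on most devices.
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```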

A: 

There is no easy answer to your questions, but here are some pointers to at least get you going in the right direction.

For capturing video frames from a non-GL view, such as a generic MediaPlayer, you would need hardware support, and by that I mean a device driver. Nowadays most hardware vendors build h/w accelerators such as a DSP into their chips in order to speed up video and camera use cases. On top of that there is the user space support (media framework and apps) that lets you access the device driver, so a media framework would be your best bet for capturing video. I would recommend studying some open source projects such as the gst-plugins port for Android (http://gitorious.org/rowboat/external-gst-plugins-android/trees/master); that will definitely give you an idea of how things stack up.

For the gl-view case, you would need some sort of Texture Streaming support in the GL driver. Right now there's no standard way of doing this (i.e., current drivers may support it only through a proprietary extension). The use of glReadPixels for capturing frames is not very efficient, as there's a flush involved and that will hurt your frame rate.
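If you do go the glReadPixels route anyway, about the only mitigation I can think of is to read back at a reduced rate and keep everything except the readback itself off the GL thread. A rough sketch along those lines, with FrameGrabber being a made-up helper, not an Android API:

```java
import java.nio.ByteBuffer;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FrameGrabber {
    private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<byte[]>(4);
    private final int captureEvery;
    private int frameCount;

    public FrameGrabber(int captureEvery) {
        this.captureEvery = captureEvery;
        Thread worker = new Thread(new Runnable() {
            public void run() {
                try {
                    while (true) {
                        byte[] frame = queue.take();
                        // Flip the rows, convert the pixel format and feed the encoder here,
                        // off the GL thread.
                    }
                } catch (InterruptedException ignored) { }
            }
        }, "frame-grabber");
        worker.setDaemon(true);
        worker.start();
    }

    /** Check this in onDrawFrame() before calling glReadPixels, so most frames skip the readback. */
    public boolean shouldCapture() {
        return ++frameCount % captureEvery == 0;
    }

    /** Hand off a buffer that glReadPixels has just filled; drops the frame if the worker is behind. */
    public void submit(ByteBuffer pixels, int sizeBytes) {
        byte[] copy = new byte[sizeBytes];
        pixels.rewind();
        pixels.get(copy);
        queue.offer(copy);
    }
}
```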

csanta