I am planning to write an application (C/C++/Objective-C) that will play media files in its own (private) container format. The files will contain: multiple video streams, encoded with a video codec (such as XviD or H.264; it is assumed that components capable of decoding these video formats are present in the system); multiple audio streams in some compressed formats (it is assumed that decoding will be performed by a system component or by my own code).
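
To make this concrete, here is a minimal sketch of how I currently picture the stream information stored in the container; all type and field names are hypothetical placeholders of my own and are not tied to any existing toolkit:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical codec identifiers; the real set depends on the container spec.
enum class CodecId { XVID, H264, AAC, MP3, PCM, OTHER };

struct VideoStreamInfo {
    CodecId  codec;      // e.g. XviD or H.264, decoded by a system component
    uint32_t width;
    uint32_t height;
    double   frameRate;
};

struct AudioStreamInfo {
    CodecId  codec;      // decoded by a system component or by my own code
    uint32_t sampleRate;
    uint32_t channels;
};

struct ContainerInfo {
    std::vector<VideoStreamInfo> videoStreams;  // multiple video streams
    std::vector<AudioStreamInfo> audioStreams;  // multiple audio streams
};
```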

So it seems I need to implement the following scheme (a rough sketch of the pipeline follows the list):

1) Implement a container demuxer (possibly in the form of a media handler component).

2) Pass video frames to a video decoder component and mix the decompressed frames (using my own rules).

3) Pass audio data to an audio decoder component, or decompress the audio with my own code, and mix the decoded audio data.

4) Render video frames to a window.

5) Pass the mixed audio data to the selected audio output device.
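
To illustrate what I mean, here is a rough C++ sketch of the pipeline corresponding to steps 1)-5). Again, all class and method names are hypothetical placeholders of my own; the actual decoding and rendering would be delegated to system components or a toolkit:

```cpp
#include <cstdint>
#include <vector>

struct Packet      { int streamIndex; std::vector<uint8_t> data; int64_t pts; };
struct VideoFrame  { int width, height; std::vector<uint8_t> pixels; int64_t pts; };
struct AudioBuffer { int sampleRate, channels; std::vector<int16_t> samples; };

// 1) Container demuxer: splits the private container into per-stream packets.
class Demuxer {
public:
    virtual ~Demuxer() = default;
    virtual bool readPacket(Packet& out) = 0;   // returns false at end of file
};

// 2) Video path: decode each stream, then mix decoded frames by my own rules.
class VideoDecoder {
public:
    virtual ~VideoDecoder() = default;
    virtual bool decode(const Packet& in, VideoFrame& out) = 0;
};
class VideoMixer {
public:
    virtual ~VideoMixer() = default;
    virtual VideoFrame mix(const std::vector<VideoFrame>& frames) = 0;
};

// 3) Audio path: decode (system component or own code) and mix.
class AudioDecoder {
public:
    virtual ~AudioDecoder() = default;
    virtual bool decode(const Packet& in, AudioBuffer& out) = 0;
};
class AudioMixer {
public:
    virtual ~AudioMixer() = default;
    virtual AudioBuffer mix(const std::vector<AudioBuffer>& buffers) = 0;
};

// 4) Render mixed video frames to a window.
class VideoRenderer {
public:
    virtual ~VideoRenderer() = default;
    virtual void present(const VideoFrame& frame) = 0;
};

// 5) Send mixed audio to the selected output device.
class AudioOutput {
public:
    virtual ~AudioOutput() = default;
    virtual void play(const AudioBuffer& buffer) = 0;
};
```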

Could anybody provide tips for any of the above stages, that is: toolkits I should use, useful samples, perhaps names of functions to call, or possibly improvements to the scheme...?