views: 859
answers: 2

I want to write an object tracking program that needs to play back the tracked result. Since GStreamer is a good multimedia framework, I want to use it in my demo program, but I don't know how to implement the video display in detail. Can anyone help?

The avidemux plugin can separate the audio and video streams of an AVI file, but what should I do next?

If I open an uncompressed AVI file, does it work to link the output pad of the avidemux plugin directly to a video sink?

By the way, which video sink is best in terms of efficiency? I looked up the plugin reference and found a lot of video sinks: glimagesink, osxvideosink, sdlvideosink, ximagesink, xvimagesink, dfbvideosink, fbdevsink, gconfvideosink. Does autovideosink always work well? My platform is Ubuntu 9.04.

TIA~

+2  A: 

I would recommend using playbin (reference page, documentation) or decodebin (reference page, documentation). These greatly simplify the process of creating GStreamer pipelines for different types of video files. With either, you can use a video sink that implements the GstXOverlay interface. See "Embedding the video window in your application" for more details.
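For example, here is a minimal sketch (not a complete program) of playing a file with playbin and rendering into your own X window through GstXOverlay, assuming GStreamer 0.10 as shipped with Ubuntu 9.04 and xvimagesink as the sink:

/* Minimal sketch: play a file with playbin and render into an existing
 * X window via the GstXOverlay interface (GStreamer 0.10).
 * Build: gcc embed.c -o embed $(pkg-config --cflags --libs \
 *          gstreamer-0.10 gstreamer-interfaces-0.10) -lX11
 */
#include <gst/gst.h>
#include <gst/interfaces/xoverlay.h>
#include <X11/Xlib.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);
  if (argc < 2) {
    g_printerr ("Usage: %s file:///path/to/video.avi\n", argv[0]);
    return 1;
  }

  /* Create a plain X window to render into. */
  Display *dpy = XOpenDisplay (NULL);
  Window win = XCreateSimpleWindow (dpy, DefaultRootWindow (dpy),
                                    0, 0, 640, 480, 0, 0, 0);
  XMapWindow (dpy, win);
  XSync (dpy, FALSE);

  /* playbin picks the demuxer, decoders and sinks automatically;
   * we force xvimagesink so we can hand it our window. */
  GstElement *play = gst_element_factory_make ("playbin", "play");
  GstElement *vsink = gst_element_factory_make ("xvimagesink", "vsink");
  g_object_set (play, "uri", argv[1], "video-sink", vsink, NULL);

  /* Tell the sink to draw into our window instead of opening its own. */
  gst_x_overlay_set_xwindow_id (GST_X_OVERLAY (vsink), win);

  gst_element_set_state (play, GST_STATE_PLAYING);

  /* Block until an error or end-of-stream. */
  GstBus *bus = gst_element_get_bus (play);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (play, GST_STATE_NULL);
  gst_object_unref (play);
  XDestroyWindow (dpy, win);
  XCloseDisplay (dpy);
  return 0;
}

In a real GUI application you would get the window ID from your toolkit (e.g. GDK_WINDOW_XID in GTK+) and hand it to the sink when you receive the "prepare-xwindow-id" sync message, which is what the "Embedding the video window in your application" section walks through.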

Nick Haddad
thanks a lot, I will give it a try~
KOFKS
+4  A: 

As Nick Haddad suggests, playbin or decodebin is a good place to start. For experimenting, I suggest using gst-launch, e.g.:

gst-launch filesrc location=video.avi ! \
    decodebin2 name=dec ! queue ! ffmpegcolorspace ! autovideosink \
    dec. ! queue ! audioconvert ! audioresample ! autoaudiosink

I'm using autoaudiosink and autovideosink because they usually work. When you've found a pipeline that works, try building the same pipeline in code (see the sketch below). If you don't need audio, just skip that part of the pipeline. For video display, your best bet is xvimagesink (on UNIX, at least).
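As a starting point, here is a rough sketch of the same pipeline built in C with gst_parse_launch (GStreamer 0.10; the input file video.avi is hard-coded to keep it short):

/* Sketch: build the gst-launch pipeline above in C with gst_parse_launch().
 * Build: gcc play.c -o play $(pkg-config --cflags --libs gstreamer-0.10)
 */
#include <gst/gst.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GError *error = NULL;
  GstElement *pipeline = gst_parse_launch (
      "filesrc location=video.avi ! decodebin2 name=dec "
      "! queue ! ffmpegcolorspace ! autovideosink "
      "dec. ! queue ! audioconvert ! audioresample ! autoaudiosink",
      &error);
  if (!pipeline) {
    g_printerr ("Parse error: %s\n", error->message);
    g_error_free (error);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Wait for an error or end-of-stream, then shut down. */
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}

Once that works, you can replace gst_parse_launch with gst_element_factory_make and gst_element_link calls if you need finer control, e.g. to connect to decodebin2's dynamically created pads yourself.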

foolip