I am not able to get RTSP to work in Android with MediaPlayer/VideoView, so I wrote my own client to interact with the RTSP server, and that part succeeded: I can receive the video/audio frames from the RTSP server (MySpace) in Android. Now I want to play those frames. I have searched the OpenCore APIs for a way to play them, but didn't find any.
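For reference, by "using MediaPlayer/VideoView" I mean the standard approach sketched below (the URL and activity name are placeholders); this is what does not play the stream for me, since it depends entirely on the built-in player's codec and transport support.

    import android.app.Activity;
    import android.net.Uri;
    import android.os.Bundle;
    import android.widget.VideoView;

    public class RtspPlaybackActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Hand the RTSP URL straight to VideoView; whether it plays
            // depends on what the built-in player engine supports.
            VideoView videoView = new VideoView(this);
            setContentView(videoView);
            videoView.setVideoURI(Uri.parse("rtsp://example.com/stream"));
            videoView.start();
        }
    }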

My investigation so far: there is a class PlayerDriver (PlayerDriver.c). It creates two sinks, one for audio and the other for video:

handleSetVideoSurface
handleSetAudioSink

Two objects of type PVPlayerDataSinkPVMFNode are created. I suspect this class has a way to take the stream as input, but I cannot find the definition of this class.

Can you suggest which class I should look into?

A: 

PVPlayerDataSinkPVMFNode is defined in external/opencore/engines/player/include/pv_datasinkpvmfnode.h, but I don't think it is what you need. The PlayerDriver class uses it to configure the video output, i.e. the surface to display to. The input is still either a stream URL or a file.
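To illustrate what "the input is still either a stream URL or file" means at the SDK level, here is a minimal MediaPlayer sketch (the RTSP URL is a placeholder). The surface passed to setDisplay() is the "sink" side that PlayerDriver presumably configures via handleSetVideoSurface; the source side only takes a URL or file descriptor, not raw frames.

    import java.io.IOException;

    import android.app.Activity;
    import android.media.MediaPlayer;
    import android.os.Bundle;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;

    public class SurfacePlaybackActivity extends Activity
            implements SurfaceHolder.Callback, MediaPlayer.OnPreparedListener {

        private MediaPlayer player;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            SurfaceView surfaceView = new SurfaceView(this);
            setContentView(surfaceView);
            surfaceView.getHolder().addCallback(this);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            try {
                player = new MediaPlayer();
                // Output side: the display surface (the video "sink").
                player.setDisplay(holder);
                // Input side: still just a URL (or file), not decoded frames.
                player.setDataSource("rtsp://example.com/stream");
                player.setOnPreparedListener(this);
                player.prepareAsync();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }

        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start();
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            if (player != null) {
                player.release();
                player = null;
            }
        }
    }

So if you want to feed in frames you have already fetched and demuxed yourself, that would have to happen below this Java API, because MediaPlayer only accepts a URL or file descriptor on the input side.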

Ignas Limanauskas