Hi, I want to create an FLV stream generated from screenshots taken in my DirectX application, to end up on a webpage.

My current plan is (and has been) to send the screenshots as JPEGs from the DX app to a client running on Linux. That client turns the JPEGs into an MJPEG stream, and ffmpeg converts the MJPEG stream to FLV, which ends up in Flash Player in the browser.

Something like:

  1. run the DX app on the Windows machine; it listens for a connection to send the screenshot JPEGs to
  2. on the Linux machine: ./jpg_to_mjpeg_client | ffmpeg -f mjpeg -i - output.flv (a sketch of the client is just below)
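
For reference, here is roughly what the client does, as a simplified sketch (the real code reads from the TCP socket instead of stdin, and the 4-byte little-endian length prefix is just the ad-hoc framing I describe further down). It strips my framing and writes the bare JPEG bytes back-to-back, since as far as I can tell a raw MJPEG stream for ffmpeg's -f mjpeg input is nothing more than concatenated JPEG images:

    // jpg_to_mjpeg_client.cpp -- simplified sketch, no real error handling.
    // Reads [4-byte little-endian size][JPEG bytes] per frame (my own ad-hoc
    // framing, not any standard) and writes the bare JPEG bytes back-to-back
    // to stdout, which is then piped into ffmpeg as in step 2 above.
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<unsigned char> buf;
        for (;;) {
            uint32_t size = 0;                        // length prefix
            if (std::fread(&size, sizeof size, 1, stdin) != 1)
                break;                                // sender closed the stream
            buf.resize(size);
            if (std::fread(buf.data(), 1, size, stdin) != size)
                break;                                // truncated frame
            std::fwrite(buf.data(), 1, size, stdout); // raw JPEG, no extra headers
            std::fflush(stdout);                      // hand the frame to ffmpeg immediately
        }
        return 0;
    }

Flushing after every frame matters to me for latency, since I want each frame to reach the browser as soon as possible.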

I thought the plan was good, but I'm stuck now. ffmpeg doesn't seem to handle the MJPEG stream coming from the client correctly. I used some code I found on the net for creating the MJPEG stream from the JPEGs, and I understand that there is no real specification for the MJPEG format, so maybe that code and ffmpeg simply don't agree on what MJPEG means.

Right now I'm sending [size of JPEG buffer], [JPEG buffer] for every frame from the DX app. I guess I could do some encoding on that side too somehow, but on the other hand I don't want to waste too much CPU on the rendering machine either.
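
Concretely, the per-frame send on the Windows side looks something like this (a sketch only: socket setup and error handling are omitted, send_frame is just a hypothetical helper name, and the 4-byte little-endian prefix is my own invention, matching what the client sketch above expects):

    // Sketch of the per-frame send in the DX app. "sock" is an
    // already-connected TCP socket (WSAStartup/connect omitted;
    // link with ws2_32.lib).
    #include <winsock2.h>
    #include <cstdint>

    // jpeg points at one encoded JPEG screenshot, jpegSize is its length.
    void send_frame(SOCKET sock, const char* jpeg, uint32_t jpegSize) {
        // 4-byte little-endian length prefix: my own ad-hoc framing,
        // which the Linux client strips off again.
        send(sock, reinterpret_cast<const char*>(&jpegSize),
             static_cast<int>(sizeof jpegSize), 0);
        send(sock, jpeg, static_cast<int>(jpegSize), 0);
    }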

How would you do it? Any tips are highly appreciated! Libraries/APIs to use, other solutions... I don't have much experience with video encoding at all, but I know my way around general programming pretty well.

C or C++ is preferred, but Java or Python might be OK too. I want it to be pretty fast though: it has to be created in real time, and a frame from the DX app should end up in the browser as soon as possible. :-)

Oh, and in the future the plan is for it to be interactive, so that I could communicate with/control the DX app from the web app in the browser. That might be good information to add too. Think of it as a web-based VCR whose movie is rendered in real time by the DX app.

Thanks,