views: 272
answers: 1
I would like to simulate some kind of camera on a UAV. The camera should provide a live stream, and send the stream over a network connection to a server. The server should be able to play the stream on the fly for me to see.

I was thinking the client (the UAV) could just read a movie file and send it to the server. But how can the server show the file on the fly? I suppose the simplest way would be to use xine or mplayer to show the movie, but how?

This is to be done in Python on GNU/Linux. The client and server are both located on the same machine.

The main issue is to get the server to play the file on the fly, before it has the whole file available. Any ideas?

EDIT: The server and client are connected over a standard TCP/IP connection. The video feed is not the only traffic on that connection.

Orjanp

A: 

Try 'webcam':

sudo apt-get install webcam

on Debian.

It will grab images from a USB camera and write them to a JPEG file in /var/www/. Then you make an HTML page that auto-refreshes as fast as possible and points to the image file.

I know it's not a very elegant solution, but it's the only one I know of.
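
For the auto-refresh page, something as small as this would do. It is only a sketch: the webcam.jpg filename and the one-second refresh are assumptions, so check where the 'webcam' tool actually writes its output.

    # Writes a page that reloads itself every second and shows the latest frame.
    # Assumes the 'webcam' tool keeps overwriting /var/www/webcam.jpg (check its config).
    PAGE = """<html>
      <head><meta http-equiv="refresh" content="1"></head>
      <body><img src="webcam.jpg" alt="latest frame"></body>
    </html>
    """

    with open("/var/www/index.html", "w") as f:
        f.write(PAGE)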

If you really are bent on writing it yourself, you will need to read data from /dev/video0 (probably), possibly encode it in a standard video format, open a socket between the two processes, and write the video data to it. There are rules for the proper way to stream data over a socket, though.
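
On the "play it on the fly" part: mplayer can read a movie from stdin, so the server can simply pipe whatever arrives on the socket into it. The sketch below assumes the video is the only thing on its socket (the question says the real connection carries other traffic too, so a real version would need some framing to separate the streams); the host, port, and chunk size are arbitrary.

    import socket
    import subprocess

    HOST, PORT = "127.0.0.1", 9999   # both ends on the same machine, as in the question
    CHUNK = 4096

    def serve():
        # Server side: accept one connection and pipe everything that arrives
        # straight into mplayer, which plays from stdin when given "-".
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        player = subprocess.Popen(["mplayer", "-cache", "1024", "-"],
                                  stdin=subprocess.PIPE)
        while True:
            data = conn.recv(CHUNK)
            if not data:
                break
            player.stdin.write(data)
        player.stdin.close()
        player.wait()

    def send(path):
        # Client (UAV) side: push an already-encoded movie file down the socket.
        sock = socket.create_connection((HOST, PORT))
        with open(path, "rb") as movie:
            while True:
                chunk = movie.read(CHUNK)
                if not chunk:
                    break
                sock.sendall(chunk)
        sock.close()

Run serve() in one process and send() with some movie file in another; mplayer starts playing as soon as enough data has arrived to fill its cache.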

Nathan
Check out http://en.wikipedia.org/wiki/Real-time_Streaming_Protocol
Nathan
This is also very informative: http://www.jejik.com/articles/2007/01/streaming_audio_over_tcp_with_python-gstreamer/
Nathan
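
That article streams audio with python-gstreamer; a rough sketch of the same idea for video could look like the following. The pipeline strings and the gst-python 0.10 API are assumptions (not taken from the article), and note that with these elements the sender is the TCP server, which is the reverse of the question's setup.

    import time

    import gobject
    import pygst
    pygst.require("0.10")
    import gst

    gobject.threads_init()

    # Sender: a test source stands in for the simulated camera (swap in v4l2src
    # for a real one); frames are Theora-encoded, Ogg-muxed and pushed over TCP.
    sender = gst.parse_launch(
        "videotestsrc ! ffmpegcolorspace ! theoraenc ! oggmux "
        "! tcpserversink host=127.0.0.1 port=5000")
    sender.set_state(gst.STATE_PLAYING)
    time.sleep(1)  # crude: give tcpserversink a moment to start listening

    # Receiver: pull the stream back in and display it as it arrives.
    receiver = gst.parse_launch(
        "tcpclientsrc host=127.0.0.1 port=5000 ! oggdemux ! theoradec "
        "! ffmpegcolorspace ! autovideosink")
    receiver.set_state(gst.STATE_PLAYING)

    gobject.MainLoop().run()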
I guess using stationary images would be easier, and I could just use feh to show the current image while the next one is downloaded. If I were to use a video, it would already be encoded, so I would not have to encode anything.
Orjanp
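
A sketch of that stationary-image approach, in case it helps: the URL and filenames here are made up, and it relies on feh's --reload option re-reading the file on a timer, so overwriting the same file with each new frame is enough.

    import subprocess
    import time
    import urllib

    URL = "http://localhost/webcam.jpg"   # wherever the frames end up being published
    LOCAL = "/tmp/latest.jpg"

    urllib.urlretrieve(URL, LOCAL)        # fetch one frame so feh has something to open
    viewer = subprocess.Popen(["feh", "--reload", "1", LOCAL])

    try:
        while True:
            urllib.urlretrieve(URL, LOCAL)   # keep overwriting with the newest frame
            time.sleep(1)
    finally:
        viewer.terminate()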
I decided to go for the webcam solution, even though looking at myself all day long was not all that interesting.
Orjanp