I can get individual frames from the iPhone's cameras just fine. What I need is a way to package them up with sound for streaming TO the server. Sending the files once I have them isn't much of an issue; it's the generation of the files for streaming that I'm having problems with. I've been trying to get FFmpeg to work without much luck.

Anyone have any ideas on how I can pull this off? I would like a known working API, or instructions on getting FFmpeg to compile properly in an iPhone app.

A: 

I'm not quite sure what you mean. Do you mean that the server can watch the video from your iPhone? Or is there another user who watches your iPhone's video via your server?

Tim van Elsloo
I think he wants the phone's video to be streamed to his web server, and then the web server will do what it needs to do with said files once they are received.
Tegeril
Correct, Tegeril.
iHorse
Okay, why don't you create a socket stream to your server (using AsyncSocket.h) and stream all the bytes from the file to your server? On your server you create a socket too (in Java, C, PHP, or something) that the iPhone can connect to. Then you're able to save the bytes on the server :)
Tim van Elsloo
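
A minimal Swift sketch of Tim's suggestion, assuming the CocoaAsyncSocket library (GCDAsyncSocket is the GCD-based successor to the AsyncSocket class he references); the host, port, and class name here are placeholders, not anything from the thread:

    import Foundation
    import CocoaAsyncSocket

    // Sketch: push a finished file's bytes to a server socket.
    // "example.com" and port 9000 are placeholder values.
    final class Uploader: NSObject, GCDAsyncSocketDelegate {
        private lazy var socket = GCDAsyncSocket(delegate: self,
                                                 delegateQueue: .main)

        func send(fileAt url: URL) throws {
            try socket.connect(toHost: "example.com", onPort: 9000)
            let data = try Data(contentsOf: url)  // fine for short clips
            socket.write(data, withTimeout: -1, tag: 0)
            socket.disconnectAfterWriting()       // close once the write flushes
        }
    }

On the server side, anything that accepts a TCP connection and writes the received bytes to disk will do.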
+2  A: 

You could divide your recording into separate files with a length of, say, 10 seconds, then send them separately. If you use AVCaptureSession's beginConfiguration and commitConfiguration methods to batch your output change, you shouldn't drop any frames between the files. This has many advantages over frame-by-frame upload (a rough sketch follows the list):

  • The files can be directly used for HTTP Live Streaming without any server-side processing.
  • The gaps between data transfers allow the antennas to sleep in between if the connection is fast enough, saving battery life.
  • Conversely, if the connection is slow enough that uploading lags behind recording, managing the delayed upload of a set of files is much easier than managing a stream of bytes.
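
A simplified Swift sketch of the segmenting idea (the answer predates Swift; the class name, file naming, and 10-second length are illustrative). Rather than swapping outputs inside the begin/commit pair, it uses AVCaptureFileOutput's maxRecordedDuration to end each segment and restarts recording in the delegate callback, which may cost a frame or two at each boundary:

    import AVFoundation
    import CoreMedia

    // Sketch: record ~10 s segments to separate files and hand each
    // finished file off for upload. `session` is assumed to already
    // have camera and microphone inputs attached.
    final class SegmentRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
        private let output = AVCaptureMovieFileOutput()
        private var segmentIndex = 0

        init(session: AVCaptureSession) {
            super.init()
            output.maxRecordedDuration = CMTime(seconds: 10,
                                                preferredTimescale: 600)
            session.beginConfiguration()   // batch the output change
            if session.canAddOutput(output) { session.addOutput(output) }
            session.commitConfiguration()
        }

        func startNextSegment() {
            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent("segment-\(segmentIndex).mov")
            segmentIndex += 1
            output.startRecording(to: url, recordingDelegate: self)
        }

        // Fires when a segment hits maxRecordedDuration (or on stop);
        // the finished file is valid even when the "max duration
        // reached" error is reported.
        func fileOutput(_ output: AVCaptureFileOutput,
                        didFinishRecordingTo outputFileURL: URL,
                        from connections: [AVCaptureConnection],
                        error: Error?) {
            startNextSegment()             // roll straight into the next file
            // hand outputFileURL to the uploader here
        }
    }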
Mo
Since iHorse found a solution to his issue, enjoy your added rep; it proved to be good info for me :)
Tegeril