Hi.

This is more of a general question than a specific one, but I'm looking for any input and advice.

I'm looking to spoof an IIDC camera: packetize and stream uncompressed 2vuy video data over a FireWire/1394 port to another machine. I've been able to do something similar to stream DV packets, but using higher-level libraries (I'm on OS X, so I'm able to use Apple's provided libraries and the sample code from the FireWire SDK as a basis*). I'm not much of a low-level programmer (more a graphics/GL programmer), so all of this lower-level stuff is a bit new to me.

Why do I want to do this, and what's my goal? I want to spoof a camera so I can send video from applications over a DCAM/IIDC stream, uncompressed, in 2vuy (4:2:2 Y'CbCr, i.e. "component YUV"), from OpenGL to another computer, so that it's seen as a valid video input / camera the other machine can ingest and do things with. I'm a programmer and a VJ, I write open-source video effects software for the Mac, and this could be a cheap, portable, and easy way to mix video between computers.**
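For reference, this is the pixel layout I'm targeting. Here's a rough sketch of the packing (not production code, assuming BGRA bytes straight out of a glReadPixels readback and BT.601 video-range coefficients): each pixel pair becomes Cb, Y'0, Cr, Y'1.

```c
#include <stdint.h>

/* Convert one row of BGRA pixels (as read back from OpenGL) into the
 * packed 2vuy layout: Cb, Y'0, Cr, Y'1 per pixel pair.
 * BT.601 video-range integer approximation; width assumed even. */
void bgra_row_to_2vuy(const uint8_t *bgra, uint8_t *out, int width)
{
    for (int x = 0; x < width; x += 2) {
        const uint8_t *p0 = bgra + 4 * x;
        const uint8_t *p1 = p0 + 4;
        int b0 = p0[0], g0 = p0[1], r0 = p0[2];
        int b1 = p1[0], g1 = p1[1], r1 = p1[2];

        /* Luma per pixel: Y = 16 + (66R + 129G + 25B + 128) >> 8 */
        int y0 = ((66 * r0 + 129 * g0 + 25 * b0 + 128) >> 8) + 16;
        int y1 = ((66 * r1 + 129 * g1 + 25 * b1 + 128) >> 8) + 16;

        /* Chroma is shared by the pair; average the two RGB samples.
         * The 32768 folds in the +128 offset so the shift stays non-negative. */
        int ra = (r0 + r1) / 2, ga = (g0 + g1) / 2, ba = (b0 + b1) / 2;
        int cb = (32768 - 38 * ra - 74 * ga + 112 * ba + 128) >> 8;
        int cr = (32768 + 112 * ra - 94 * ga - 18 * ba + 128) >> 8;

        out[2 * x + 0] = (uint8_t)cb;  /* Cb  */
        out[2 * x + 1] = (uint8_t)y0;  /* Y'0 */
        out[2 * x + 2] = (uint8_t)cr;  /* Cr  */
        out[2 * x + 3] = (uint8_t)y1;  /* Y'1 */
    }
}
```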

I've looked for examples of writing IIDC camera streams, but have found none. I've seen quite a few libraries for reading various IIDC camera inputs and getting a pixel/image buffer out of them, but I want to go the other direction. I'm curious if anyone has any information on how to go about this.

I know I could probably do a shit ton of work and essentially reverse-engineer something like libdc1394, but part of the issue would be writing the proper FireWire packets, advertising camera capabilities, etc., none of which those libraries do (to my knowledge). So I'm curious if something exists out there that could help bootstrap this project.
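For context, the reading side that those libraries do cover looks roughly like this (a sketch against the libdc1394 v2 API, not something I've run as part of this). It's the camera side of this exchange, the register map, capability advertisement, and isochronous packet layout that the host is poking at here, that I can't find any helper for.

```c
#include <dc1394/dc1394.h>
#include <stdio.h>

int main(void)
{
    dc1394_t *ctx = dc1394_new();
    dc1394camera_list_t *list;
    if (dc1394_camera_enumerate(ctx, &list) != DC1394_SUCCESS || list->num == 0) {
        fprintf(stderr, "no IIDC cameras found\n");
        return 1;
    }
    dc1394camera_t *cam = dc1394_camera_new(ctx, list->ids[0].guid);
    dc1394_camera_free_list(list);

    /* The host configures the camera by writing its CSR registers --
     * exactly the registers a spoofed camera would have to expose and honor. */
    dc1394_video_set_iso_speed(cam, DC1394_ISO_SPEED_400);
    dc1394_video_set_mode(cam, DC1394_VIDEO_MODE_640x480_YUV422); /* 4:2:2, 2vuy-style */
    dc1394_video_set_framerate(cam, DC1394_FRAMERATE_30);
    dc1394_capture_setup(cam, 4, DC1394_CAPTURE_FLAGS_DEFAULT);
    dc1394_video_set_transmission(cam, DC1394_ON);

    /* Pull one frame off the isochronous channel. */
    dc1394video_frame_t *frame;
    dc1394_capture_dequeue(cam, DC1394_CAPTURE_POLICY_WAIT, &frame);
    printf("got %ux%u frame, %llu bytes\n", frame->size[0], frame->size[1],
           (unsigned long long)frame->total_bytes);
    dc1394_capture_enqueue(cam, frame);

    dc1394_video_set_transmission(cam, DC1394_OFF);
    dc1394_capture_stop(cam);
    dc1394_camera_free(cam);
    dc1394_free(ctx);
    return 0;
}
```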

If anyone has any pointers, or is familiar with this sort of endeavor, I'd be super appreciative of any information. I did get a note from one person saying they do something like this as a debug setup to test their company's digitizers, but all of their code was proprietary and not available to the public :(.

Thanks again for any info - super curious about this :)

*I was actually fortunate enough to go to WWDC this year and was able to ask the Apple FireWire team about this. I got odd looks, but also confirmation that it's possible, and that it would be a total "roll it yourself" situation, with little in the available high-level SDK to help.

** There are no truly cheap, portable, HD-capable video mixers or capture cards without issues for VJs. I'm aware of just about all of them, and they all have gotchas. While this is a software solution and does have issues due to requiring OpenGL readback, it's doable and can be fast assuming you double-buffer PBO downloads (which, yes, adds latency, but it's worth it for the resolution and FPS), anyway!
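The readback approach I mean is roughly this (a sketch, assuming legacy desktop GL on OS X and a BGRA fast path; the 2vuy conversion would happen on the CPU afterwards): two PBOs ping-ponged so glReadPixels returns immediately and you map the buffer that was filled a frame earlier.

```c
#include <OpenGL/gl.h>   /* OS X; <GL/gl.h> plus a loader elsewhere */
#include <string.h>

#define NUM_PBOS 2

static GLuint pbos[NUM_PBOS];
static int    frame_index = 0;

void init_readback(int width, int height)
{
    glGenBuffers(NUM_PBOS, pbos);
    for (int i = 0; i < NUM_PBOS; i++) {
        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbos[i]);
        /* 4 bytes per pixel for BGRA; allocate once up front. */
        glBufferData(GL_PIXEL_PACK_BUFFER, width * height * 4, NULL, GL_STREAM_READ);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

/* Call once per rendered frame; dst receives the PREVIOUS frame's pixels,
 * so the copy never waits on the readback that was just issued.
 * (The very first call returns whatever was in the untouched buffer.) */
void readback_frame(int width, int height, void *dst)
{
    int write_idx = frame_index % NUM_PBOS;
    int read_idx  = (frame_index + 1) % NUM_PBOS;

    /* Kick off an asynchronous readback into the "write" PBO. */
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbos[write_idx]);
    glReadPixels(0, 0, width, height, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, 0);

    /* Map the PBO filled on the previous frame; this should not stall. */
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbos[read_idx]);
    void *src = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);
    if (src) {
        memcpy(dst, src, (size_t)width * height * 4);
        glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);

    frame_index++;
}
```

The ping-pong is what hides the download cost: the GPU is filling one PBO while the CPU packs last frame's pixels into 2vuy, at the price of one frame of extra latency.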