Background
I am capturing video in real time with a C program using the video4linux2 (v4l2) API. I also have a Java frontend that can run both locally and remotely. The remote side was easy: I compress the images to JPEG and serve them from a mini HTTP server; the client decompresses them and draws them on the screen.
When running locally, I would like some form of IPC that connects directly to that memory so I can access the images from Java, then blit them to the screen using as little CPU as possible. This is a "surveillance" type system, so I could have 8-16 camera feeds running at a time.
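For example, one possible approach would be to have the C side mirror each frame into a file under /dev/shm and map that file from Java with NIO, so the Java side reads the frame without copying it through a socket. A minimal sketch (the path and frame size are placeholders, and this assumes the C program writes complete frames to that file):

```java
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class SharedFrameReader {
    // Hypothetical shared file the C capture program would write into.
    private static final String SHM_PATH = "/dev/shm/camera0";
    // YUV420P is 1.5 bytes per pixel.
    private static final int FRAME_SIZE = 640 * 480 * 3 / 2;

    public static MappedByteBuffer mapFrameBuffer() throws Exception {
        try (RandomAccessFile file = new RandomAccessFile(SHM_PATH, "r")) {
            FileChannel ch = file.getChannel();
            // Map the frame region read-only; no copy into the Java heap.
            return ch.map(FileChannel.MapMode.READ_ONLY, 0, FRAME_SIZE);
        }
    }
}
```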
Question
What is the most efficient way to move the image data (YUV420P) from the v4l2 mmap buffer into my Java app and display it on the screen? Please show code or point me to an API/spec if one is available.
Answer
In the interest of time, I decided to just use plain sockets and send the frames as raw RGB. This drastically improved performance when the Java client runs on the same machine. I'm still sending JPEGs over the network when the client runs remotely. Next, I'll need to find an optimized JPEG decoder.
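A minimal sketch of the local path on the Java side, assuming a fixed frame size and a protocol of back-to-back raw RGB frames over a localhost socket (the port and dimensions below are placeholders):

```java
import java.awt.Graphics;
import java.awt.image.BufferedImage;
import java.io.DataInputStream;
import java.net.Socket;
import javax.swing.JPanel;

// Reads fixed-size raw RGB frames from a local socket and blits them.
public class LocalFramePanel extends JPanel {
    private static final int WIDTH = 640, HEIGHT = 480; // placeholder size
    private final BufferedImage image =
            new BufferedImage(WIDTH, HEIGHT, BufferedImage.TYPE_INT_RGB);
    private final int[] pixels = new int[WIDTH * HEIGHT];

    public void readLoop() throws Exception {
        try (Socket sock = new Socket("localhost", 9999); // placeholder port
             DataInputStream in = new DataInputStream(sock.getInputStream())) {
            byte[] frame = new byte[WIDTH * HEIGHT * 3]; // 3 bytes per pixel
            while (true) {
                in.readFully(frame); // block until one complete frame arrives
                // Pack R,G,B bytes into the TYPE_INT_RGB pixel layout.
                for (int i = 0, p = 0; i < pixels.length; i++, p += 3) {
                    pixels[i] = ((frame[p] & 0xFF) << 16)
                              | ((frame[p + 1] & 0xFF) << 8)
                              |  (frame[p + 2] & 0xFF);
                }
                image.setRGB(0, 0, WIDTH, HEIGHT, pixels, 0, WIDTH);
                repaint();
            }
        }
    }

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        g.drawImage(image, 0, 0, null); // blit the latest frame
    }
}
```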
By the way, these are not two separate clients; a single CameraStream widget reads and parses both stream types.