Hi, I'm experimenting with the iPhone's sensors. At the moment I visualize the accelerometer data by logging it to a file, downloading the file via Xcode, and loading it into my Ruby-Processing app. As I'm playing around a lot, this process is getting tedious. It would be cool to have a live stream of the data on the Mac and see what happens in real time.
So what would be the easiest / fastest way to build a prototype for playing around with the data in real time?
I imagine I could just send the data over the network and have a Cocoa / OpenGL app read and visualize it.
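Something like this is what I'm picturing on the phone side: read the accelerometer with CoreMotion and push each sample to the Mac over UDP. This is just a sketch, assuming Apple's Network framework is available; `192.168.1.10:9000` is a placeholder for my Mac's LAN address:

```swift
import Foundation
import CoreMotion
import Network

// Placeholder host/port for the Mac on the local network.
let connection = NWConnection(host: "192.168.1.10", port: 9000, using: .udp)
connection.start(queue: .main)

let motion = CMMotionManager()
motion.accelerometerUpdateInterval = 1.0 / 60.0  // ~60 Hz

// In a real app, keep `motion` and `connection` alive while streaming
// (e.g. as properties on a view controller).
motion.startAccelerometerUpdates(to: .main) { data, _ in
    guard let a = data?.acceleration else { return }
    // One plain-text "x y z" line per sample; trivial to parse on the Mac.
    let line = String(format: "%f %f %f\n", a.x, a.y, a.z)
    connection.send(content: line.data(using: .utf8),
                    completion: .contentProcessed { _ in })
}
```

And on the Mac side, the visualizer would listen on that port and parse the incoming lines, roughly like this (again just a sketch that prints instead of drawing):

```swift
import Foundation
import Network

// Minimal Mac-side receiver: prints each incoming sample line.
let listener = try NWListener(using: .udp, on: 9000)
listener.newConnectionHandler = { conn in
    conn.start(queue: .main)
    func receiveNext() {
        conn.receiveMessage { data, _, _, _ in
            if let data = data, let line = String(data: data, encoding: .utf8) {
                print(line, terminator: "")  // hand this to the drawing code instead
            }
            receiveNext()
        }
    }
    receiveNext()
}
listener.start(queue: .main)
dispatchMain()  // keep the command-line process running
```

What do you think? Could you give me some clues on how to do that? Thanks!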