Hi, I'm experimenting with the sensors of the iPhone. At the moment I visualize the accelerometer data by logging it to a file, downloading the file via Xcode, and loading it into my Ruby-Processing app. Since I'm playing around a lot, this process is getting annoying. It would be cool to have a live stream of the data on the Mac and see what happens in real time.

So what would be the easiest / fastest way to build a prototype for playing around with the data in real time?

I imagine I could just send the data over the network and have a Cocoa / OpenGL app read and visualize it. What do you think? Could you give me some clues on how to do that? Thanks!
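For reference, here is a minimal sketch of what the phone side of that idea could look like, assuming a modern Swift setup with CoreMotion and the Network framework. The Mac address and port (192.168.1.10:9999) are placeholders for whatever your listener actually uses, and the comma-separated text format is just an arbitrary choice for this example.

// Sketch of the iPhone side: stream accelerometer samples over UDP.
// Assumes the Mac is listening on 192.168.1.10:9999 (placeholder values).
import CoreMotion
import Network

let motion = CMMotionManager()
let connection = NWConnection(host: "192.168.1.10", port: 9999, using: .udp)
connection.start(queue: .main)

motion.accelerometerUpdateInterval = 1.0 / 60.0   // sample at 60 Hz
motion.startAccelerometerUpdates(to: .main) { data, _ in
    guard let a = data?.acceleration else { return }
    // One sample per datagram, as a simple comma-separated line: x,y,z
    let line = String(format: "%f,%f,%f\n", a.x, a.y, a.z)
    connection.send(content: line.data(using: .utf8),
                    completion: .contentProcessed({ _ in }))
}

On the Mac, anything that can read UDP datagrams (your Ruby-Processing app included) can then parse those lines and plot them as they arrive.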

+1  A: 

Take a look at NSNetService and Bonjour. There are some good examples, and using Bonjour greatly simplifies the discovery and communication process. The hardest part is the socket fu you use to set up your "server," but the examples provide you with all the code you need.
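Roughly, the discovery piece could look like the sketch below, written in Swift with the renamed NetService / NetServiceBrowser classes. The service type string "_accelstream._udp." and the port are arbitrary choices for this example, not anything the frameworks require.

import Foundation

// Mac side: advertise the listening port via Bonjour so the phone can find it.
let service = NetService(domain: "local.",
                         type: "_accelstream._udp.",
                         name: "AccelViz",
                         port: 9999)
service.publish()

// iPhone side: browse for that service type and resolve its address.
class Finder: NSObject, NetServiceBrowserDelegate, NetServiceDelegate {
    let browser = NetServiceBrowser()
    var found: NetService?

    func start() {
        browser.delegate = self
        browser.searchForServices(ofType: "_accelstream._udp.", inDomain: "local.")
    }

    func netServiceBrowser(_ browser: NetServiceBrowser,
                           didFind service: NetService, moreComing: Bool) {
        found = service              // keep a strong reference while resolving
        service.delegate = self
        service.resolve(withTimeout: 5)
    }

    func netServiceDidResolveAddress(_ sender: NetService) {
        // sender.hostName / sender.port now tell you where to send the data.
        print("Resolved \(sender.hostName ?? "?"):\(sender.port)")
    }
}

Note that the delegate callbacks only fire while a run loop is running (e.g. RunLoop.main.run() in a command-line tool). Once the phone has resolved the service's host and port, it can open a plain UDP or TCP socket to that address and stream the samples.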

Neil Mix