tags:
views: 304
answers: 3

Hello guys,

I saw at least 6 apps in the App Store that take photos when they detect motion (i.e., a kind of spy stuff). Does anybody know the general way to do such a thing using the iPhone SDK?

I guess their apps take photos every X seconds and compare the current image with the previous one to determine whether there is any difference (read: "motion"). Any better ideas?

Thank you!

A: 

You could have the phone detect light changes using the ambient light sensor at the top front of the phone. I just don't know how you would access that part of the phone.

Jaba
A: 

You could probably also use the microphone to detect noise. That's actually how many security system motion detectors work - but they listen in on ultrasonic sound waves. The success of this greatly depends on the iPhone's mic sensitivity and what sort of API access you have to the signal. If the mic's not sensitive enough, listening for regular human-hearing-range noise might be good enough for your needs (although this isn't "true" motion-detection).
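The noise-listening idea can be sketched independently of any iPhone API. This is a hypothetical Python illustration, not AVFoundation code: audio windows are assumed to be plain lists of normalized float samples in [-1.0, 1.0], and the `ratio` and `floor` thresholds are made-up values you would tune against the actual mic.

```python
import math

def rms(window):
    """Root-mean-square level of one window of audio samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def detect_noise(windows, ratio=4.0, floor=0.01):
    """Return indices of windows whose RMS level jumps well above
    the quietest level heard so far (with a small absolute floor,
    so silence doesn't make the trigger infinitely sensitive)."""
    events = []
    baseline = rms(windows[0])
    for i, w in enumerate(windows):
        level = rms(w)
        if level > max(floor, ratio * baseline):
            events.append(i)          # loud enough: treat as an event
        else:
            baseline = min(baseline, level)  # track the ambient floor
    return events
```

Tracking a rolling baseline rather than a fixed threshold is one way to cope with the unknown mic sensitivity mentioned above: the trigger adapts to however quiet the room actually is.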

As for images - look into using some sort of string-edit-distance algorithm, but for images. Something that takes a picture every X amount of time, and compares it to the previous image taken. If the images are too different (edit distance too big), then the alarm sounds. This will account for slow changes in daylight, and will probably work better than taking a single reference image at the beginning of the surveillance period and then comparing all other images to that.
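A minimal sketch of that consecutive-frame comparison, in Python for illustration only (the iPhone side would need an actual image library): frames are assumed to be flat lists of 0-255 grayscale intensities, and both threshold names and values are made up. A per-pixel tolerance plus a changed-pixel fraction is one simple stand-in for a full image edit distance.

```python
def motion_detected(prev, curr, pixel_tol=12, changed_frac=0.05):
    """Compare two equally sized grayscale frames.

    pixel_tol:    per-pixel intensity difference small enough to ignore
    changed_frac: fraction of pixels that must change to count as motion
    """
    assert len(prev) == len(curr), "frames must be the same size"
    changed = sum(1 for a, b in zip(prev, curr) if abs(a - b) > pixel_tol)
    return changed / len(prev) > changed_frac
```

Because each new frame is compared to the one just before it, a slow change (daylight fading) never exceeds `pixel_tol` in any single step, while an object entering the scene flips a large block of pixels at once.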

If you combine these two methods (image and sound), it may get you what you need.

Mike Cialowicz
Quick note: be flexible with your image edit distance. All digital cameras have sensor noise, which makes two seemingly identical photos rather different at the bit level. If there isn't much available light, the camera will automatically increase the ISO sensitivity, creating even more noise. If you're not wary of this, your app may work well in the daytime but trigger false alarms at night.
Mike Cialowicz
A: 

I think you've about got it figured out: those apps probably keep images where the delta between image B and image A is over some predefined threshold.

You'd have to find an image-processing library usable from Objective-C in order to do the analysis.

Dave Swersky
Does anybody know of such a library?
coneybeare