views:

1008

answers:

5

Hi!

I am trying to make an application that would detect what kind of shape you made with your iPhone using the accelerometer. For example, if you draw a circle in the air with your hand holding the iPhone, the app would be able to redraw it on the screen. This could also work with squares, or even more complicated shapes. The only application I've seen doing such a thing is AirPaint (http://vimeo.com/2276713), but it doesn't seem to be able to do it in real time.

My first try was to apply a low-pass filter to the X and Y values from the accelerometer, and to move a pointer toward those values, proportionally to the size of the screen. But this is clearly not enough: the accuracy is very low, and if I shake the device the pointer moves as well...

Any ideas? Do you think accelerometer data is enough to do this? Or should I consider using other data, such as the compass?

Thanks in advance!

+2  A: 

You need to look up how acceleration relates to velocity and velocity to position. My mind is having a wee fart at the moment, but I am sure it's the integral... you want to integrate acceleration with respect to time. Wikipedia should help you with the maths, and I am sure there is a good library somewhere that can help you out.

Just remember, though, that the accelerometers are not perfect, nor polled fast enough. Really sudden movements may not be picked up that well. But for gently drawing in the air, it should work fine.

thecoshman
OK, I'm going to work on this. Thanks !
super_tomtom
You need to integrate *twice* to get from acceleration to displacement.
Paul R
+3  A: 

OK, I have found something that seems to work, but I still have some problems. Here is how I proceed (assuming the device is held vertically):

1 - I have my default x, y, and z values.
2 - I extract the gravity vector from this data using a low-pass filter.
3 - I subtract the normalized gravity vector from each of x, y, and z, and get the movement acceleration.
4 - Then I integrate this acceleration value with respect to time, so I get the velocity.
5 - I integrate this velocity again with respect to time, and find a position.

All of the code below is in the accelerometer:didAccelerate: delegate method of my controller. I am trying to make a ball move according to the position I found. Here is my code:

NSTimeInterval interval = 0;
NSDate *now = [NSDate date];
if (previousDate != nil)
{
    interval = [now timeIntervalSinceDate:previousDate];
}
[previousDate release];
previousDate = [now retain];

//Isolating the gravity vector with a low-pass filter
gravity.x = acceleration.x * kFilteringFactor + gravity.x * (1.0 - kFilteringFactor);
gravity.y = acceleration.y * kFilteringFactor + gravity.y * (1.0 - kFilteringFactor);
gravity.z = acceleration.z * kFilteringFactor + gravity.z * (1.0 - kFilteringFactor);
float gravityNorm = sqrt(gravity.x * gravity.x + gravity.y * gravity.y + gravity.z * gravity.z);

//Removing the normalized gravity vector from the raw acceleration
filteredAcceleration.x = acceleration.x - gravity.x / gravityNorm;
filteredAcceleration.y = acceleration.y - gravity.y / gravityNorm;
filteredAcceleration.z = acceleration.z - gravity.z / gravityNorm;

//Integrating acceleration over the time interval to get velocity
velocity.x = velocity.x + filteredAcceleration.x * interval;
velocity.y = velocity.y + filteredAcceleration.y * interval;
velocity.z = velocity.z + filteredAcceleration.z * interval;

//Integrating velocity again to get position (scaled to the screen)
position.x = position.x + velocity.x * interval * 160;
position.y = position.y + velocity.y * interval * 230;

If I execute this, I get fairly good values; I can see the acceleration going positive or negative according to the movements I make. But when I apply that position to my ball view, I can see it moving, but with a propensity to go more in one direction than the other. For example, if I draw circles with my device, I see the ball describing curves drifting towards the top-left corner of the screen. Something like this: http://img685.imageshack.us/i/capturedcran20100422133.png/

Do you have any ideas about what is happening? Thanks in advance!

super_tomtom
+2  A: 

The problem is that you can't integrate acceleration twice to get position. Not without knowing the initial position and velocity. Remember the +C term that you added in school when learning about integration? Well, by the time you get to position it is a ct+k term, and it is significant. That's before you consider that the acceleration data you're getting back is quantised and averaged, so you're not actually integrating the actual acceleration of the device. Those errors end up being large when integrated twice.

Watch the AirPaint demo closely and you'll see exactly this happening: the shapes rendered are significantly different from the shapes moved.

Even devices that have some position and velocity sensing (a Wiimote, for example) have trouble doing gesture recognition. It is a tricky problem that folks pay good money (to companies like AILive, for example) to solve for them.

Having said that, you can probably quite easily distinguish between certain types of gesture if their large-scale characteristics are different. A circle could be detected if the device has received accelerations in each of six angle ranges, for example. You could likewise distinguish between swiping the iPhone through the air and shaking it.

Telling the difference between a circle and a square is going to be much more difficult.

Ian
A: 

Hi Ian,

Thanks for your reply.

I wonder if I really got what you explained. You mean that I need to know the initial position and velocity of the device before starting to integrate accelerometer values. In my code, I decided that the device would start with an approximate velocity and position of zero (this is how I initialized my values, simply with zero). It is probably not a good idea, but the fact is that even if I leave my iPhone lying flat on a table, I still get some movement information...

With this calculation, I get an approximate idea of what shape has been drawn by the user, and that's enough for what I'm trying to do. I just wonder how I could filter the accelerometer values to get rid of this "default movement behaviour" and keep the positions I find centered.

super_tomtom
That's what I mean. You can't filter out the 'default movement' from an accelerometer, because you have no way of knowing what it might be. AirPaint makes the assumption that the motion will end at its starting point, so it can do some error correction that way. But a general solution to the problem isn't possible on acceleration data at all. The motion you're getting when the device is stationary is also partly caused by inaccuracies in the accelerometer, and when you come to integrate that twice, those inaccuracies become very large.
Ian
You will also get errors caused by slight twisting of the phone during motion. Because you consider low-frequency acceleration to be gravity, changes in tilt will register as dramatic changes in acceleration. If I tilt, then tilt back, it is equivalent to a big swing.
Ian
+2  A: 

It seems like you are normalizing your gravity vector before subtracting it from the instantaneous acceleration. This keeps the relative orientation but discards the scale. The latest device I tested (admittedly not an iDevice) returned gravity at roughly -9.8, which is probably calibrated to m/s². Assuming no other acceleration, if you normalize this and then subtract it from the filtered pass, you end up with a current accel of -8.8 instead of 0.0.

Two options:

- You can just subtract out the gravity vector after the filter pass.
- Capture the initial accel vector length, normalize the accel and gravity vectors, then scale the accel vector by the dot product of the accel and gravity normals.

Also worth remembering to take the orientation of the device into account.

ScottP