tags:
views: 1060
answers: 4

I am working on gestures using acceleration values (x, y, z) from a device.

If I hold the device in my hand in a resting position, the values are (x, y, z) = (0, 0, 0). But if I change the orientation of the device (still at rest), the values change to something like (766, 766, 821), since all three axes have shifted relative to their original orientations.

Is there any way (trigonometric function OR other) to resolve this issue?

+1  A: 

I find your question unclear. What exactly do you measure and what do you expect?

In general, an accelerometer will, if held in fixed position, measure the gravity of the earth. This is displayed as acceleration upwards, which might sound strange at first but is completely correct: as the gravity is accelerating "down" and the device is in a fixed position some force in the opposite direction, i.e. "up" needs to be applied. The force you need to hold the device in a fixed position is this force, which has a corresponding acceleration in the "up" direction.

Depending on your device this gravity acceleration might be subtracted before you get the values on the PC. But if you turn the accelerometer, the gravity acceleration is still around and still points in the same "up" direction. If "up" corresponded to x before turning the accelerometer, it will correspond to a different axis after a 90° turn, say y. Thus, both the measured accelerations on the x and y axes will change.

So to answer your question it's necessary to know how your accelerometer presents the values. I doubt that in a resting position the acceleration values measured are (0, 0, 0).

bluebrother
I agree. Rotating an accelerometer from one fixed position to another should result in different output values.
Key
Thanks. The values of the accelerometer at the resting position are x = 766; y = 766; z = 821, where x is horizontal and z is vertical and under the influence of gravity.
Madni
+1  A: 

Your comment makes your question clearer. What you need to do is calibrate your accelerometer every time the orientation changes. There is no getting around this. You could make it a UI element in your application or if it fits with your uses, recalibrate to 0 if the acceleration is relatively constant for some amount of time (won't work if you measure long accelerations).

Calibration is either built into the device's api (check the documentation) or something you have to do manually. To do it manually, you have to read the current acceleration and store those 3 values. Then whenever you take a reading from the device, subtract those 3 values from each read value.

colithium
Thanks. How can we do the calibration? I have 3 values when the device is not moving (i.e. x = 766; y = 766; z = 821). I subtract them from current events to make them +/- values. The issue is that changing the orientation of the device changes the (x, y, z) values. How do I make the values orientation-free?
Madni
You have to recalibrate every time you drastically change the orientation. You can make this automatic using the technique I suggested. As far as I know, most standard accelerometers don't dynamically adjust for orientation.
colithium
+5  A: 

The acceleration due to gravity will always be present. It appears you are subtracting that value from one of the axes when the device is in a particular orientation.

What you will need to do to detect gestures is to detect the tiny difference that momentarily appears from the acceleration due to gravity as the devices begins moving. You won't be able to detect if the device is stationary or moving at a constant velocity, but you will be able to determine if it is turning or being accelerated.

The (x,y,z) values give you a vector, which gives the direction of the acceleration. You can compute the (square of the) length of this vector as x^2 + y^2 + z^2. If this is the same as when the device is at rest, then you know the device is unaccelerated, but in a certain orientation. (Either at rest, or moving at a constant velocity.)

To detect movement, you need to notice the momentary change in the length of this vector as the device begins to move, and again when it is brought to a stop. This change will likely be tiny compared to gravity.

You will need to compare the orientation of the acceleration vector during the movement to determine the direction of the motion. Note that you won't be able to distinguish every gesture. For example, moving the device forward (and stopping there) has the same effect as tilting the device slightly, and then bringing it back to the same orientation.

The easier gestures to detect are those which change the orientation of the device. Other gestures, such as a punching motion, will be harder to detect. They will show up as a change in the length of the acceleration vector, but the amount of change will likely be tiny.

EDIT:

The above discussion is for normalized values of x, y, and z. You will need to determine the values to subtract from the readings to get the vector. From a comment above, it looks like 766 are the "zero" values to subtract. But they might be different for the different axes on your device. Measure the readings with the device oriented in all six directions. That is, get the maximum and minimum values for x, y, and z. The central values should be halfway between the extremes (and hopefully 766).

Certain gestures will have telltale signatures.

Dropping the device will reduce the acceleration vector momentarily, then increase it momentarily as the device is brought to a stop.

Raising the device will increase the vector momentarily, before decreasing it momentarily.

A forward motion will increase the vector momentarily, but tilt it slightly forward, then increase it again momentarily, but tilted backward, as the device is brought to a stop.

Most of the time the length of the vector will equal the acceleration due to gravity.

UncleO
A: 

If the device is not compensating automatically for the gravitational acceleration, you need to subtract the (0, 0, ~9.8 m/s²) vector from the output of the device.

However, you will also need to have the orientation of the device (Euler angle or Rotation Matrix). If your device isn't providing that it's basically impossible to tell if the signaled acceleration is caused by actually moving the device (linear acc) or by simply rotating it (gravity changing direction).

Your compensated acceleration will become:

 OutputAcc = InputAcc x RotMat - (0,0,9.8)

This way your OutputAcc vector will always be in a local coord frame (i.e. Z is always up).

Radu094
Thank you! I think it will make the device orientation-free. Do you know of any implementation of RotMat in C? Regards
Madni
implementation of RotMat in C / C++
Madni
Google for Vector and Matrix implementation classes in C++ . Getting the code to multiply a vector and a matrix shouldn't be a problem. Getting the actual rotational values from the device is the tricky part.
Radu094
Thank you! Will you please write the full equation, as I have some ambiguities about OutputAcc = InputAcc x RotMat - (0,0,9.8)? Let's say only the equation for the X axis acceleration.
Madni
Is it like this for rotation along the X axis?
Madni
X' = X * [1 0 0; 0 cos(alpha) sin(alpha); 0 -sin(alpha) cos(alpha)] - [0, 0, 9.8]
Madni
where alpha = atan(z / y)
Madni