I have a device which can give me GPS coordinates, and I can define the time interval between readings. I want to use it to calculate the average speed while driving or travelling by car. Currently I use an orthodrome formula to calculate the distance between two points and then divide it by the given time interval. For the implementation I followed this formula (http://de.wikipedia.org/wiki/Orthodrome#Genauere_Formel_zur_Abstandsberechnung_auf_der_Erde). Unfortunately I could only find a German link, but I think the formula should be understandable in any language ;)

Unfortunately, using this formula with a time interval of 1 second gives very imprecise results: the calculated walking speed ranges between 1 km/h and 20 km/h.

So I wonder if there is a general reference on how to implement distance calculation between two GPS coordinates (I found something similar on SO) and, in particular, what the best time interval is for updating the GPS coordinates.
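For reference, a minimal sketch of the distance-then-divide step. This uses the simpler spherical great-circle (haversine) formula, not the more accurate ellipsoidal formula from the linked article, and the class and method names are just illustrative:

    // Sketch: great-circle (haversine) distance between two GPS fixes.
    // Spherical approximation; R is the mean Earth radius in meters.
    public final class Geo {
        private static final double R = 6371000.0; // mean Earth radius, meters

        public static double distanceMeters(double lat1, double lon1,
                                            double lat2, double lon2) {
            double phi1 = Math.toRadians(lat1);
            double phi2 = Math.toRadians(lat2);
            double dPhi = Math.toRadians(lat2 - lat1);
            double dLam = Math.toRadians(lon2 - lon1);
            double a = Math.sin(dPhi / 2) * Math.sin(dPhi / 2)
                     + Math.cos(phi1) * Math.cos(phi2)
                       * Math.sin(dLam / 2) * Math.sin(dLam / 2);
            return R * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
        }
    }

Speed in m/s is then distanceMeters(...) divided by the interval in seconds (multiply by 3.6 for km/h).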

+1  A: 

GPS systems can yield instantaneous velocity directly, without interpolating positions. I read somewhere that the velocity reading is actually more accurate than the position reading. What device/system/OS are you using?

On Android, try the android.location.Location.getSpeed() method (along with hasSpeed()) in your LocationListener implementation.
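For illustration, a minimal sketch of such a listener; getSpeed() and hasSpeed() are the real android.location API, but the 1000 ms update interval and the surrounding Activity context are assumptions:

    import android.content.Context;
    import android.location.Location;
    import android.location.LocationListener;
    import android.location.LocationManager;
    import android.os.Bundle;

    // Inside an Activity (or anything with a Context):
    LocationListener listener = new LocationListener() {
        @Override
        public void onLocationChanged(Location location) {
            if (location.hasSpeed()) {                  // speed reported by the receiver
                float kmh = location.getSpeed() * 3.6f; // getSpeed() returns m/s
                // use kmh ...
            }
        }
        @Override public void onStatusChanged(String provider, int status, Bundle extras) {}
        @Override public void onProviderEnabled(String provider) {}
        @Override public void onProviderDisabled(String provider) {}
    };

    LocationManager lm = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
    lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 0, listener); // 1 s, 0 m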

Marcelo Cantos
I'm using Android.
Roflcoptr
Thanks, I'll test it tonight and report the result here ;)
Roflcoptr
@Marcelo: the velocity reading may or may not be more accurate than the position reading, but you're right, it is a *different* measurement and is not based simply on calculations using the positional measurements. GPS receivers actually use the Doppler shifts of the satellite signals to calculate velocities.
MusiGenesis
+7  A: 

I assume that you're testing this by walking at a constant speed (I think ~5 kph is a normal walking speed) while measuring your GPS position once per second.

The variation that you're seeing in instantaneous speed (the distance between each measured point divided by 1 second) is either due to random variation in the measured GPS position or else you aren't taking your measurements exactly one second apart (or it could be both of these things).

I'm going to assume your measurements are being taken precisely one second apart. Hand-held GPS devices are much less accurate than advertised. While it's often claimed that the devices are accurate to within 10 ft. of the true position, this simply isn't so.

The best way to measure and report the accuracy of a GPS device is to leave it in a place where it can see the satellites and not be rained on, and record a few days' worth of data. You can then use Google Maps to plot the points - I've done this around my house and around the office, which is a good way to give you a sense of scale.
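One way to do the plotting, sketched under the assumption that you've logged the fixes as latitude/longitude pairs: write them to a KML file and load it in Google Maps or Google Earth (the class name and the {lat, lon} array layout are made up for the example):

    import java.io.PrintWriter;
    import java.util.List;
    import java.util.Locale;

    // Sketch: dump logged fixes to KML so Google Maps/Earth can plot them.
    public class KmlDump {
        public static void write(List<double[]> fixes, String path) throws Exception {
            PrintWriter out = new PrintWriter(path, "UTF-8");
            out.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
            out.println("<kml xmlns=\"http://www.opengis.net/kml/2.2\"><Document>");
            for (double[] f : fixes) {                  // f = {lat, lon}
                out.printf(Locale.US,
                    "<Placemark><Point><coordinates>%f,%f</coordinates></Point></Placemark>%n",
                    f[1], f[0]);                        // KML order is lon,lat
            }
            out.println("</Document></kml>");
            out.close();
        }
    }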

Obviously, if the devices were perfectly accurate, you would see all your measured points in one spot. Or, if the 10 ft. accuracy thing were true, you would see all the points in a little cluster inside a 20 ft. diameter circle.

What you see instead (with every GPS-enabled device I've ever tested) is a combination of relatively small positional scattering (on the order of a few tens of feet) occurring on a scale of a few seconds, and a longer-term "random walk" of the average position which might move 200 or 300 ft. in the course of a day or two. When plotted over your own house, for example, it might look like your PDA wandered over to the neighbor's house, then across the street, then down the street two houses, back towards you etc., all while jittering around 5 or 10 feet here or there like it drank too much coffee.

GPS can be more accurate than this. Surveyors use devices with much more powerful receiver sets (so they get a much more accurate read on the satellite signals), and they leave them in place for days at a time to average successive measurements. Handheld devices have cheap receiver chips and cheap antennas and have to deal with all kinds of signal interference anyway.

Your best bet is to do a running average to calculate your instantaneous speed. Instead of dividing the distance between the current point and the previous point by 1 second, take the last 5 distances between points and divide by 5 seconds (or whatever number of seconds you use). It's important not to just take the difference between the current point and the point 5 seconds ago and divide this distance by 5, as that would miss any non-linear movement.
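A sketch of that running average, assuming one fix per second; the window size of 5 and the helper distanceMeters() (any great-circle implementation, e.g. the one sketched in the question) are placeholders:

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Sketch: average speed over the last N one-second segments.
    // Summing successive segment distances keeps non-linear movement,
    // unlike the straight-line distance to the point N seconds ago.
    public class SpeedAverager {
        private final int window;                         // e.g. 5 segments
        private final Deque<Double> segs = new ArrayDeque<Double>();
        private double sum = 0.0;
        private Double lastLat, lastLon;

        public SpeedAverager(int window) { this.window = window; }

        /** Feed one fix per second; returns m/s, or NaN until the window fills. */
        public double addFix(double lat, double lon) {
            if (lastLat != null) {
                double d = Geo.distanceMeters(lastLat, lastLon, lat, lon);
                segs.addLast(d);
                sum += d;
                if (segs.size() > window) sum -= segs.removeFirst();
            }
            lastLat = lat;
            lastLon = lon;
            return segs.size() == window ? sum / window : Double.NaN;
        }
    }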

Update: I noticed in a comment that you're using an Android device. Do you know if it has a built-in GPS receiver? Many (most?) Android devices don't, which means their GPS is not the triangulate-on-the-satellites version of GPS, but the guess-where-I-am-based-on-the-signal-from-the-cell-towers version. This is much less accurate positionally, as I'm sure you could tell from the snarkiness of my description. :)

MusiGenesis
Yes, thanks for this great description. I'm using an old G1, which has a built-in GPS receiver AFAIK, but it's not very accurate. I'm also using an HTC Desire; there the result is better, but still not good enough ;) I think with your advice I'll be able to improve my algorithm.
Roflcoptr
I just tried to research these devices, and I have to say honestly that I have no idea whether they have built-in GPS receivers or not. It's weird, because either they do or they don't, and it wouldn't be at all hard for the manufacturers to state "yes, our phone has a built-in Sirf-III chipset, and here's why the receiver doesn't need a visible antenna". My guess is this: they made the G1 with a receiver, but quickly found out it didn't work very well and/or drained the battery too quickly, so they just stopped mentioning it (and maybe even stopped putting the receiver in in later models).
MusiGenesis
Regardless, as I said in my answer, the best way to measure a device's accuracy is to let it record its GPS coordinates for a few days and then plot all the points. I would actually appreciate it if you did this with your two devices and then let me know what your results are.
MusiGenesis
*"Instead of dividing the distance between the current point and the previous point by 1 second, take the last 5 distances between points and divide by 5 seconds..."* How is that different? The goal is to calculate average speed.
Beta
"Obviously, if the devices were perfectly accurate, you would see all your measured points in one spot. " when did the military stop adding daily changing random offsets to the civilian signal?
Pete Kirkham
@Pete: that (Selective Availability) was turned off permanently in May 2000. It had been turned off temporarily during the Gulf war (1991), and I guess they realized that the world didn't stop spinning and turned it off for good. These offsets were on the order of a few hundred feet, so the accuracy problem was massively worse back then than it is now. But it's still pretty bad.
MusiGenesis
@Beta: you don't see the difference between taking one measurement and taking 5 measurements with averaging?
MusiGenesis
Surveyors usually take advantage of the fact that the "random walk" you described produces very similar error vectors for GPS devices which are close to each other, so the difference vector between two GPSes is *much* more accurate than their absolute positions. A base GPS station at a known point + the difference vector produces errors in the range of 1-2 meters.
Rafał Dowgird
@Rafal: that's interesting and it makes total sense. It seems like it must apply only to high-end receivers, though. I wrote software for GPS-enabled PDAs (with real receivers) for one client, and I began the project assuming GPS was accurate to within 3 meters. It was immediately obvious that this wasn't the case, and to prove this to the client we all went out into a field with our PDAs (Garmin M3s) and stood in the same spot and let them record for a while, then went back inside and plotted all the data. The same devices in the same spot were measuring all over the place ...
MusiGenesis
... and some PDAs appeared to be as much as 200 feet from each other at the same moment (this was with WAAS enabled, too).
MusiGenesis
When the goal is to get the average over, say, 500 measurements, no, I don't see the difference between A) taking the average of the 500 and B) taking the average of every 5 and then averaging the 100 averages. You're describing a decent way to get instantaneous speed (the thing you say it's important not to do would be better) but that's not what Sebi asked for.
Beta