Hi, I have an application which receives GPS data from a mobile device; as well as co-ordinate data, it also provides signal strength from the GSM network. I am trying to plot the points on a map to display areas of good signal strength and areas of poor signal strength.

When I have a few points it all works well: the points are retrieved from the database, a square is built around each point with the top-left corner 0.5 km from the point, and the squares are displayed on the Virtual Earth map, colour-coded by signal strength. The problem is that there may be thousands and thousands of readings, so I need a way to average out those readings that are less than 0.5 km from each other, or I need to build the square (or perhaps circle) in SQL Server and average out the intersections.

I have no idea where to begin with this, so any pointers to decent articles or some tips would be much appreciated. Thanks.

A: 

You might want to look into Delaunay triangulation, which lets you plot X, Y, Z coordinates into a graph. It might be possible, not knowing exactly what you have for points, to use X, Y for the location and plot Z as the signal strength to create a spike graph. I've only seen C++ examples (there's a CodePlex sample), but it might be something you can write a SQL function for.

SomeMiscGuy
+1  A: 

One simple and somewhat inaccurate way to do this would be to decrease the granularity of your data. It might not even be inaccurate, depending on how accurate your x, y measurements are.

Let's say we have the following data:

x    y   signal_strength
10.2 5.1 10
10.1 5.3 12
10.3 5.5 8

If we floor the x and y values, we get:

x    y   signal_strength
10   5   10
10   5   12
10   5   8

Then we can average those values by the floored x and y to show that we have an average signal strength of (10 + 12 + 8) / 3 = 10 in the rectangle from (10, 5) to (11, 6).

Here's the SQL:

select 
 floor(x) as rectangle_xmin,               -- lower-left corner of the 1 x 1 cell
 floor(y) as rectangle_ymin, 
 floor(x) + 1 as rectangle_xmax,           -- upper-right corner of the cell
 floor(y) + 1 as rectangle_ymax, 
 avg(signal_strength) as signal_strength   -- average of all readings that fall in the cell
from readings                              -- placeholder name for your table of raw readings
group by floor(x), floor(y);
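
On the three sample rows above (assuming they live in a table named readings, as in the query), this should return a single row along these lines:

rectangle_xmin rectangle_ymin rectangle_xmax rectangle_ymax signal_strength
10             5              11             6              10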

Now, admittedly, you'd ideally want to group data points by the distance from point to point, whereas this floors them into rectangular blocks, where the maximum distance between grouped points varies from 1 (along a side) up to square_root(2) =~ 1.41 (along the diagonal). So it's less than ideal. But it may work well enough for you, especially if the flooring/grouping is smaller than the error in your measurement of position.

If floor() is not granular enough, you can use floor(x * someweight) / someweight to adjust it to the granularity you want. And of course you can use ceil() or round() to do the same thing.
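
For example, here is a rough sketch with someweight = 100 (so each cell is 0.01 units on a side), using the same placeholder table and column names as above and assuming x and y are floating-point:

select 
 floor(x * 100) / 100 as cell_xmin,         -- lower-left corner of the 0.01-wide cell
 floor(y * 100) / 100 as cell_ymin, 
 (floor(x * 100) + 1) / 100 as cell_xmax,   -- upper-right corner of the cell
 (floor(y * 100) + 1) / 100 as cell_ymax, 
 avg(signal_strength) as signal_strength
from readings
group by floor(x * 100), floor(y * 100);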

The whole point is to collapse a bunch of nearby measurements to one "measurement", and then take the average of the collapsed values.

tpdi
This isn't a bad idea at all... like you say, it's not the ideal solution, but it may work. As these are GPS co-ordinates, floor(x) would not be granular enough, but floor(x * 100) might be. I'll give it a go and see what the data looks like.
Phill
OK, it works. There is some accuracy lost, but it greatly reduces the amount of data, especially when the devices aren't moving, so Virtual Earth can keep up. Thanks! I've altered it slightly, so I'll post it here for anyone who comes across this.
Phill
A: 
SELECT
    -- Build a geography point from the cell's corner; WKT order is 'POINT(longitude latitude)'
    geography::STPointFromText('POINT(' + CONVERT(varchar, AvgSignalReadings.lng / 100) + ' ' + CONVERT(varchar, AvgSignalReadings.lat / 100) + ')', 4326) as Location,
    AvgSignalReadings.lat / 100 as Latitude,
    AvgSignalReadings.lng / 100 as Longitude,
    AvgSignalReadings.SignalStrength
FROM
(
    -- Collapse the last month of readings into 0.01-degree cells and average the signal strength per cell
    SELECT 
        FLOOR(l.Latitude * 100) as lat, 
        FLOOR(l.Longitude * 100) as lng, 
        AVG(l.SignalStrength) as SignalStrength,
        COUNT(*) as NumberOfReadings
    FROM SignalLog l
    WHERE l.SignalStrength IS NOT NULL 
        AND l.SignalStrength <> 0 
        AND l.Location IS NOT NULL 
        AND l.[Timestamp] > DATEADD(month, -1, GETDATE())
    GROUP BY FLOOR(l.Latitude * 100), FLOOR(l.Longitude * 100)
) AS AvgSignalReadings
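
If you want each averaged reading drawn as a circle rather than a point (the question mentioned squares or circles built roughly 0.5 km around each point), one option is the geography type's STBuffer method, which for SRID 4326 buffers by a distance in metres. A sketch, assuming the query above has been saved as a view with the made-up name AvgSignalReadingsView:

SELECT
    Location.STBuffer(500) as CoverageCircle,   -- circle of radius 500 m around the averaged point
    SignalStrength
FROM AvgSignalReadingsView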
Phill