views:

57

answers:

1

We are using reverse-geocoding in a rails webservice, and have run into quota problems when using the Google reverse geocoder through geokit. We are also implementing the simple-geo service, and I want to be able to track how many requests per minute/hour we are making.

Any suggestions for tracking our reverse-geocoding calls?

Our code will look something like the following. Would you do any of these?

  • Add a custom logger and process in the background daily
  • Use a super-fantastic gem that I don't know about that does quotas and rating easily
  • Insert a record into the database for each call and run queries there

Note: I don't need the data in real-time, just want to be able to know in an hourly period, what's our usual and max requests per hour. (and total monthly requests)

def use_simplegeo(lat, lng)
  SimpleGeo::Client.set_credentials(SIMPLE_GEO_OAUTHTOKEN, SIMPLE_GEO_OAUTHSECRET)
  # maybe do logging/tracking here?
  nearby_address = SimpleGeo::Client.get_nearby_address(lat, lng)

  located_location = LocatedLocation.new
  located_location.city = nearby_address[:place_name]
  located_location.county = nearby_address[:county_name]
  located_location.state = nearby_address[:state_code]
  located_location.country = nearby_address[:country]
  return located_location

end
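For the custom-logger option, a minimal sketch of what each tracked call could emit. `geocode_log_line` is a hypothetical helper (not part of geokit or SimpleGeo); one timestamped line per reverse-geocode call is enough to aggregate hourly/monthly counts later:

```ruby
require 'time'

# Hypothetical helper: build one timestamped log line per
# reverse-geocode call, suitable for later batch aggregation.
def geocode_log_line(provider, lat, lng, at = Time.now.utc)
  "#{at.iso8601} geocode provider=#{provider} lat=#{lat} lng=#{lng}"
end

# In use_simplegeo you would write this line to a dedicated logger
# before (or after) calling SimpleGeo::Client.get_nearby_address.
geocode_log_line(:simplegeo, 29.76, -95.36)
```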

Thanks!

+1  A: 

The first part here doesn't answer the question you asked, but it may be helpful if you haven't considered it before.

Have you looked at not doing your reverse geocoding using your server (i.e. through Geokit) but instead having this done by the client? In other words some Javascript loaded into the user's browser making Google geocoder API calls on behalf of your service.

If your application can support this approach, then it has a number of advantages:

  • You get around the quota problem because your distributed users each have their own daily quota and don't consume yours
  • You don't expend server resources of your own doing this

If you still would like to log your geocoder queries and you are concerned about the performance impact to your primary application database then you might consider one of the following options:

  1. Just create a separate database (or databases) for logging (which is write-intensive) and write to it synchronously. It could be relational, but MongoDB or Redis might work just as well
  2. Log to the file system (with a custom logger) and then cron these in batches into structured, queryable storage later. The storage could be external, such as Amazon's S3, if that works better.
  3. Just write a record into SimpleGeo each time you do a Geocode and add custom meta-data to those records to tie them back to your own model(s)
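For option 2, the batch step could be as simple as parsing the timestamped log lines back into per-hour counts. A sketch (assuming each line starts with an ISO 8601 timestamp; the helper name is made up for illustration):

```ruby
require 'time'

# Hypothetical batch-aggregation step: take log lines that each start
# with an ISO 8601 timestamp and count how many fall in each hour.
def hourly_counts(lines)
  lines.each_with_object(Hash.new(0)) do |line, counts|
    t = Time.iso8601(line.split.first)      # timestamp is the first field
    counts[t.strftime('%Y-%m-%d %H:00')] += 1
  end
end

lines = [
  "2010-06-01T12:05:00Z geocode lat=29.76 lng=-95.36",
  "2010-06-01T12:40:00Z geocode lat=30.26 lng=-97.74",
  "2010-06-01T13:01:00Z geocode lat=32.78 lng=-96.80"
]
hourly_counts(lines)
# => {"2010-06-01 12:00"=>2, "2010-06-01 13:00"=>1}
```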
bjg
Regarding the javascript -- I'd love to, but this is an API, so the iPhones are sending along their lat/lng and can't do the reverse geocoding themselves. Since we're on Heroku, I don't think I can do the file-system log; so maybe MongoDB might be the way to go.
Jesse Wolgamott
So it's an iPhone app and not a browser running there, and I'm guessing that you don't control how the lat/lng stuff is gathered or sent? And you're right about the filesystem on Heroku. But you could use S3 and only pay for the storage (not bandwidth), or use their recently launched MongoHQ add-on
bjg
Right on all accounts! Leaning towards a MongoHQ implementation... Any thoughts on grouping by time period? (getting a requests per second)? would you just Request.all.group_by the second/hour/whatever?
Jesse Wolgamott
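For the grouping question in the last comment: with the records in memory, Ruby's `group_by` works directly on the timestamps once you truncate them to the period you care about. A sketch with plain `Time` objects standing in for whatever your `Request` model would return (e.g. `Request.all.map(&:created_at)`):

```ruby
# Sketch: group request timestamps by hour, then compute per-hour
# counts and the busiest hour. The `requests` array is stand-in data.
requests = [
  Time.utc(2010, 6, 1, 12, 5),
  Time.utc(2010, 6, 1, 12, 40),
  Time.utc(2010, 6, 1, 13, 1)
]

by_hour = requests.group_by { |t| t.strftime('%Y-%m-%d %H:00') }
counts  = by_hour.transform_values(&:size)
max_hour, max_count = counts.max_by { |_, n| n }
```

One caveat: `Request.all.group_by` loads every record into application memory, so for large volumes you'd push the aggregation into the datastore (e.g. a Mongo group/map-reduce query) rather than grouping in Ruby.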