We have a pretty large website, which handles a couple million visitors a month, and we're currently developing an API for some of our partners. In the near future, we hope to expose the API to all of our visitors.
As we want to limit the number of requests to, say, 100,000 per day and 1,000 per minute, I need to log the total number of requests per API key and verify that the API user doesn't exceed those limits. We won't check in real time whether a limit has been exceeded at this point, but afterwards in the site's control panel. We also need to display a timeline per user in the control panel, so we need a quick per-day or per-hour overview when required.
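To give an idea of what I mean by checking afterwards: the control panel would run something like the sketch below once a day (or on demand) to flag keys that went over the daily limit. It assumes the per-key, per-hour counter table from my first idea further down; every table, column, and class name here is made up just for this example.

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

public static class DailyLimitReport
{
    // T-SQL that sums each key's hourly counters for one UTC day and keeps only
    // the keys that went over the limit. Table and column names are placeholders.
    private const string OverLimitSql = @"
        SELECT ApiKey, SUM(CAST(RequestCount AS bigint)) AS TotalRequests
        FROM dbo.ApiRequestsPerHour
        WHERE HourBucket >= @DayStart AND HourBucket < DATEADD(day, 1, @DayStart)
        GROUP BY ApiKey
        HAVING SUM(CAST(RequestCount AS bigint)) > @DailyLimit
        ORDER BY TotalRequests DESC;";

    // Returns apiKey -> total requests for every key over the daily limit on that day.
    public static IDictionary<string, long> KeysOverDailyLimit(
        string connectionString, DateTime dayStartUtc, long dailyLimit = 100000)
    {
        var overLimit = new Dictionary<string, long>();

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(OverLimitSql, connection))
        {
            command.Parameters.AddWithValue("@DayStart", dayStartUtc.Date);
            command.Parameters.AddWithValue("@DailyLimit", dailyLimit);
            connection.Open();

            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    overLimit[reader.GetString(0)] = reader.GetInt64(1);
                }
            }
        }

        return overLimit;
    }
}
```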
My first idea was to build the following app:
API user => webserver => posts a message with the API key to a message queue => a service picks up the message => updates the database, which holds one row per user-hour combination (key | hour | count). That should be quite fast, but we lose quite a bit of useful information (the actual queries, requests per minute, etc.). Saving each and every request as a separate record in the database would likely generate millions of records a day and would (I guess, I'm not much of a DBA) be quite slow when generating a chart, even with the correct indexes.
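To make that idea a bit more concrete, here is roughly what the service draining the queue could do per message: an upsert into the per-(key, hour) counter table. Again, the table, column, and method names are placeholders I made up for this sketch, and the MERGE assumes the counters live in SQL Server 2008 or later.

```csharp
using System;
using System.Data.SqlClient;

public static class HourlyRequestCounter
{
    // Upsert into a counter table with one row per (API key, hour). Assumed schema
    // (names are placeholders):
    //   CREATE TABLE dbo.ApiRequestsPerHour (
    //       ApiKey       varchar(64) NOT NULL,
    //       HourBucket   datetime    NOT NULL,  -- request time truncated to the hour
    //       RequestCount int         NOT NULL,
    //       CONSTRAINT PK_ApiRequestsPerHour PRIMARY KEY (ApiKey, HourBucket));
    // MERGE needs SQL Server 2008+; HOLDLOCK keeps concurrent upserts from racing.
    private const string UpsertSql = @"
        MERGE dbo.ApiRequestsPerHour WITH (HOLDLOCK) AS target
        USING (SELECT @ApiKey AS ApiKey, @HourBucket AS HourBucket) AS source
            ON target.ApiKey = source.ApiKey AND target.HourBucket = source.HourBucket
        WHEN MATCHED THEN
            UPDATE SET RequestCount = target.RequestCount + @Count
        WHEN NOT MATCHED THEN
            INSERT (ApiKey, HourBucket, RequestCount)
            VALUES (@ApiKey, @HourBucket, @Count);";

    // Called by the service draining the message queue: once per message, or once
    // per small in-memory batch if the service pre-aggregates before writing.
    public static void Increment(string connectionString, string apiKey,
                                 DateTime requestedAtUtc, int count = 1)
    {
        var hourBucket = new DateTime(requestedAtUtc.Year, requestedAtUtc.Month,
                                      requestedAtUtc.Day, requestedAtUtc.Hour, 0, 0,
                                      DateTimeKind.Utc);

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(UpsertSql, connection))
        {
            command.Parameters.AddWithValue("@ApiKey", apiKey);
            command.Parameters.AddWithValue("@HourBucket", hourBucket);
            command.Parameters.AddWithValue("@Count", count);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```

The control-panel timeline would then just be a range query on (ApiKey, HourBucket) for one key, which is what makes this approach fast.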
Our platform consists of around ten webservers, ten front-end SQL servers, a stats server, and some other servers for processing large tasks. Everything runs Windows (except our EMC) with MS SQL. The dev platform is ASP.Net WCF.