I have a site with millions of URLs. Each time a URL is clicked, a database row corresponding to that URL is updated with the timestamp of that click. I would like to estimate the number of clicks per hour each URL receives, using additional columns if need be, but without inserting a distinct row for every click. Some ideas include storing a handful of timestamps aligned to the most recent second, minute, 15-minute, and hour boundaries (though I'm fuzzy on how that actually produces the estimate I want), or the nastier solution of serializing a "log" of time deltas into a single row.
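To make the bucketing idea slightly less fuzzy, here is the rough shape of what I imagine, sketched in Python. The `row` dict stands in for the URL's database row; the 15-minute bucket size, the field names, and the four-bucket window are all placeholders, not a worked-out design:

```python
import time

BUCKET_SECONDS = 15 * 60      # 15-minute buckets (placeholder granularity)
BUCKETS_PER_HOUR = 4          # a one-hour sliding window of buckets

def _roll(row, now):
    """Slide the window forward so the last bucket covers `now`."""
    aligned = now - (now % BUCKET_SECONDS)
    steps = (aligned - row["bucket_start"]) // BUCKET_SECONDS
    if steps > 0:
        steps = min(steps, BUCKETS_PER_HOUR)
        # Drop buckets now older than an hour, zero-fill the gap.
        row["counts"] = row["counts"][steps:] + [0] * steps
        row["bucket_start"] = aligned

def record_click(row, now=None):
    """Count one click: still a single UPDATE against the URL's row."""
    now = int(time.time()) if now is None else now
    _roll(row, now)
    row["counts"][-1] += 1

def clicks_per_hour(row, now=None):
    """Estimate clicks/hour as the sum of the last hour's buckets."""
    now = int(time.time()) if now is None else now
    _roll(row, now)
    return sum(row["counts"])

# The dict stands in for a few extra columns on the URL's row.
row = {"bucket_start": 0, "counts": [0] * BUCKETS_PER_HOUR}
record_click(row)
print(clicks_per_hour(row))  # 1
```

The property I'm trying to preserve is that each click remains a single UPDATE of a single row; the bucket counts and window start would live in the extra columns mentioned above.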
A naive approach would be to measure the time between the current click and the previous one and derive a rate from that, but this only produces a useful estimate if the link is clicked at a very consistent rate. In reality a link could receive a flurry of clicks in one minute and then nothing at all for the next twenty.
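To put numbers on that (the timestamps here are made up): two clicks arriving one second apart extrapolate to 3600 clicks per hour, even if they were the only two clicks all day.

```python
# Two made-up clicks one second apart, followed by silence.
previous_click, current_click = 1_000_000, 1_000_001  # epoch seconds
delta = current_click - previous_click                # 1 second
naive_estimate = 3600 / delta                         # 3600.0 clicks/hour
```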
The reason I don't want to log each click distinctly is simply so that the database isn't weighed down by thousands of additional INSERT statements per hour (and the corresponding DELETEs of data more than an hour old), and so that I don't have to stand up an additional storage system (Tokyo Tyrant, grepping Apache logs, etc.) just to record these clicks.