I'm running a database-backed website on shared hosting that occasionally gets swarmed after a mention on a link-sharing site.
The first couple of traffic surges put a heavy load on the database, so I've implemented file-based caching.
When a query runs, I serialize the result-set object and save it to a file, and the next time the same query comes up I pull the object out of the file instead of hitting the database. The cache directory has a sub-directory structure that keeps thousands of files from piling up in a single directory.
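For concreteness, here's a minimal sketch of the scheme. My site isn't actually written in Python, and the specifics here (`run_query`, pickle, SHA-256 keys, the `cache/` root) are just stand-ins for whatever the real stack provides, but the shape is the same:

```python
import hashlib
import os
import pickle

CACHE_ROOT = "cache"  # assumption: cache files live under ./cache

def _cache_path(sql, params=()):
    # Hash the query + parameters so the filename is safe and fixed-length.
    key = hashlib.sha256(repr((sql, params)).encode()).hexdigest()
    # The first two hex chars become a sub-directory, fanning files out
    # across 256 buckets instead of piling up in one directory.
    return os.path.join(CACHE_ROOT, key[:2], key + ".pkl")

def cache_get(sql, params=()):
    # Return the cached result set, or None on a miss / unreadable file.
    try:
        with open(_cache_path(sql, params), "rb") as f:
            return pickle.load(f)
    except (OSError, EOFError, pickle.PickleError):
        return None

def cache_set(sql, params, rows):
    path = _cache_path(sql, params)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    # Write to a temp file and rename into place, so a concurrent reader
    # never sees a half-written cache entry.
    tmp = f"{path}.{os.getpid()}.tmp"
    with open(tmp, "wb") as f:
        pickle.dump(rows, f)
    os.replace(tmp, path)

def cached_query(sql, params=()):
    rows = cache_get(sql, params)
    if rows is None:
        rows = run_query(sql, params)  # placeholder for the real DB call
        cache_set(sql, params, rows)
    return rows
```

(The write-temp-then-rename step is the one thing beyond what I described above; it's there so a reader never loads a half-written file mid-surge.)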
It's been working pretty well so far. But I'm worried that I'm overlooking something, and possibly asking for trouble if traffic ever climbs higher than anything I've seen before. Or maybe there's just an easier way to do this?
Please poke some holes in this for me? Thanks!