I'm running a database-backed web site on shared hosting that occasionally gets swarmed after a mention on a link sharing site.

Because of how much load the first couple of traffic surges put on the database, I have implemented file-based caching.

When a query runs, I just serialize the resultset object and save it to a file. I have a sub-directory structure in the cache directory that keeps thousands of files from ending up in the same directory. Next time I have to run the same query, I just pull the object out of the file instead.
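
Roughly, the scheme looks like this (a minimal Python sketch; the names and the 5-minute expiry are illustrative, not my actual code):

    import hashlib
    import os
    import pickle
    import time

    CACHE_ROOT = "cache"   # illustrative cache directory
    MAX_AGE = 300          # seconds before an entry is considered stale

    def cache_path(query):
        # Hash the query so the filename is safe and evenly distributed.
        digest = hashlib.sha1(query.encode("utf-8")).hexdigest()
        # The first two hex characters pick a sub-directory, which keeps
        # thousands of files from ending up in the same directory.
        return os.path.join(CACHE_ROOT, digest[:2], digest + ".pkl")

    def cached_query(query, run_query):
        path = cache_path(query)
        try:
            if time.time() - os.path.getmtime(path) < MAX_AGE:
                with open(path, "rb") as f:
                    return pickle.load(f)  # serve the serialized resultset
        except OSError:
            pass                           # miss: file absent or unreadable
        rows = run_query(query)            # fall through to the database
        os.makedirs(os.path.dirname(path), exist_ok=True)
        with open(path, "wb") as f:
            pickle.dump(rows, f)
        return rows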

It's been working pretty well so far. But I'm worried that I am overlooking something, and possibly asking for trouble if there is a higher level of traffic than I've previously enjoyed. Or maybe there's just an easier way to do this?

Please poke some holes in this for me. Thanks!

+2  A: 

Ideally, cache in memory to remove disk access. Have a look at something like memcached.
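
Something along these lines (a minimal sketch, assuming a reachable memcached on localhost:11211 and the pymemcache client; the names and TTL are illustrative):

    import hashlib
    import pickle

    from pymemcache.client.base import Client

    client = Client(("localhost", 11211))  # assumed memcached instance

    def cached_query(query, run_query, ttl=300):
        # memcached keys must be short and safe, so hash the query text.
        key = hashlib.sha1(query.encode("utf-8")).hexdigest()
        cached = client.get(key)
        if cached is not None:
            return pickle.loads(cached)    # served from memory, no disk I/O
        rows = run_query(query)
        client.set(key, pickle.dumps(rows), expire=ttl)
        return rows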

AdaTheDev
Sadly, I can't run memcached on this host. I realize this would be preferable.
i-g
The problem is that caching to files may reduce actual DB load, but it doesn't minimise disk access, which is itself a major factor in performance.
AdaTheDev
A: 

Since you're on shared hosting, you should do some throttling (google "Throttling your web server (Oct 00)" for ideas).

A related and interesting read (which also mentions Stonehenge::Throttle) is "Building a Large-Scale E-commerce Site with Apache and mod_perl": http://perl.apache.org/docs/tutorials/apps/scale_etoys/etoys.html
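
Stonehenge::Throttle itself is a mod_perl module, but the idea in outline is a per-client rate limit. A minimal sketch (the window and limit here are hypothetical, not tuned for any real site):

    import time
    from collections import defaultdict, deque

    WINDOW = 60   # seconds
    LIMIT = 30    # max requests per client IP per window (hypothetical)

    recent = defaultdict(deque)   # client IP -> timestamps of recent hits

    def allow(ip):
        now = time.time()
        hits = recent[ip]
        # Drop timestamps that have aged out of the window.
        while hits and now - hits[0] > WINDOW:
            hits.popleft()
        if len(hits) >= LIMIT:
            return False          # over the limit: serve a 503 instead
        hits.append(now)
        return True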

__
I'm not really concerned with bandwidth usage, since the site is all text. I just want it to stay responsive.
i-g