I'm pretty new to caching strategies and implementations. I'm working on a project that will be database-intensive, but whose data will also be updated and changed very frequently.
I've found enough information to know roughly how to build the caching function itself; what I'm unsure about is the overall strategy.
If I cache every query result and group the entries into logical buckets that I can clear on sensible triggers, I'll probably end up with tens of thousands (at least) of tiny files in my cache. Would it make more sense to cache only large query results?
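To make that concrete, here's roughly the kind of thing I have in mind, as a minimal Python sketch (the cache location, group names, and helper names are just placeholders I made up, not an existing library):

```python
import hashlib
import json
import os
import shutil

CACHE_ROOT = "/tmp/query_cache"   # hypothetical cache location

def _entry_path(group, key):
    # One subdirectory per logical group, so a whole group can be
    # invalidated at once; the key is hashed into a safe filename.
    digest = hashlib.sha1(key.encode("utf-8")).hexdigest()
    return os.path.join(CACHE_ROOT, group, digest + ".json")

def cache_set(group, key, result):
    path = _entry_path(group, key)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        json.dump(result, f)

def cache_get(group, key):
    try:
        with open(_entry_path(group, key)) as f:
            return json.load(f)
    except (FileNotFoundError, ValueError):
        return None   # cache miss (or corrupt entry): caller re-queries the DB

def invalidate_group(group):
    # Called from whatever trigger makes sense for that group,
    # e.g. after an UPDATE that touches the underlying table.
    shutil.rmtree(os.path.join(CACHE_ROOT, group), ignore_errors=True)
```

The idea is that each group corresponds to a table (or set of related tables), so the trigger that fires on an update can wipe exactly the entries that might be stale.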
I know this is a somewhat hardware-specific question, but generally speaking, at what volume of files does caching become pointless? That is, if you load up the file system with all of these tiny files, does accessing them eventually become slow enough that you would have been better off not caching the information in the first place?
Thanks, all. I'm interested in any opinions you have to offer.
EDIT: Based on the responses saying this is entirely application-specific, let me pose the question this way, which should be universal:
Assuming that I have an application that depends on one table with 1,000,000 items in it...
Would it be quicker to query one of those items directly from the database, or to retrieve it from my cache directory of 1,000,000 files, each containing the details of one item?
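For what it's worth, this is the kind of micro-benchmark I've been meaning to run myself, sketched in Python with SQLite standing in for the database (the paths, table name, and counts are illustrative, and populating a million files takes a while, so you'd probably scale N down first):

```python
import os
import random
import sqlite3
import time

N = 1_000_000            # one row/file per item; start smaller for a quick run
SAMPLE = 10_000          # number of random lookups to time
CACHE_DIR = "/tmp/item_cache"   # hypothetical cache directory

# One row per item in SQLite (standing in for "the database").
db = sqlite3.connect("/tmp/items.db")
db.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, details TEXT)")
db.executemany(
    "INSERT OR IGNORE INTO items VALUES (?, ?)",
    ((i, f"details for item {i}") for i in range(N)),
)
db.commit()

# One file per item (standing in for the cache directory).
os.makedirs(CACHE_DIR, exist_ok=True)
for i in range(N):
    with open(os.path.join(CACHE_DIR, f"{i}.txt"), "w") as f:
        f.write(f"details for item {i}")

ids = random.sample(range(N), SAMPLE)

start = time.perf_counter()
for i in ids:
    db.execute("SELECT details FROM items WHERE id = ?", (i,)).fetchone()
print("database lookups:", time.perf_counter() - start)

start = time.perf_counter()
for i in ids:
    with open(os.path.join(CACHE_DIR, f"{i}.txt")) as f:
        f.read()
print("file lookups:    ", time.perf_counter() - start)
```

Obviously the flat directory with a million entries is itself part of what I'm asking about; sharding the files into subdirectories would presumably change the file-side numbers.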
EDIT: Apparently 100,000 wasn't enough to get a valid answer, so let's make it 1,000,000. Anyone want to go for 1,000,000,000? Because I can do it...