Well, as for your question, there are three options:
Don't cache that particular piece of information. But from your description, it sounds like you're doing that already and it's not working out.
Use a write-through cache. Whenever you "add" or "remove" a post, also regenerate the cached data for that item as part of the same operation. This may be hard depending on how your caching system works, but it's an option.
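A write-through cache might look something like this minimal sketch. The `FakeDB` class, `add_post`, and the `post:<id>` key scheme are all illustrative assumptions, not anything from your setup; the point is just that the write path updates the cache at the same time as the database:

```python
# Hypothetical in-memory cache standing in for memcached/Redis.
cache = {}

class FakeDB:
    """Stand-in for a real database table of posts."""
    def __init__(self):
        self.rows = {}
        self.next_id = 1

    def insert(self, row):
        row_id = self.next_id
        self.next_id += 1
        self.rows[row_id] = row
        return row_id

db = FakeDB()

def add_post(post):
    post_id = db.insert(post)           # write to the database first...
    cache[f"post:{post_id}"] = post     # ...then write the cache in the same step
    return post_id

def get_post(post_id):
    # Reads hit the cache; it is always warm for anything written through it.
    return cache.get(f"post:{post_id}") or db.rows.get(post_id)
```

The upside is that reads never see stale data for items written this way; the downside is that every write path in your code has to know how to build the cached representation.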
Invalidate the cache on write. Whenever you "add" or "remove" a post, remove the cached data for those posts. That way, the cache will be repopulated on the next request for them. This may or may not be easy, since it may not be trivial to detect which cached items reference a given db row (it may require adding more information, or meta cache objects, to keep track of this data).
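The "meta cache objects" idea from the last option can be sketched roughly as follows. Here a dependency map records which cached keys embed a given post, so a write can invalidate everything that references it; all the names (`deps`, `cache_page`, `invalidate_post`) are made up for illustration:

```python
cache = {}
deps = {}   # post_id -> set of cache keys whose cached output embeds that post

def cache_page(key, post_ids, rendered):
    """Store a rendered page and record which posts it depends on."""
    cache[key] = rendered
    for pid in post_ids:
        deps.setdefault(pid, set()).add(key)

def invalidate_post(pid):
    """On add/remove of a post, drop every cached item that referenced it."""
    for key in deps.pop(pid, set()):
        cache.pop(key, None)
```

In a real system `deps` would itself live in the cache (or the database) rather than in process memory, but the bookkeeping is the same: the cost moves from the read path to the write path.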
Now, with that said, I find it kind of strange that one query is becoming a performance problem. What's more likely is that the one query in question needs to be optimized, or your MySQL server does. I regularly see in excess of 5k queries per second on some of my production servers with no load issues. That's because those queries are very efficient, and I have Query Caching turned on in MySQL (it's more effective than you would think)...