Basically, for a plugin on a dynamic site (the site can be fairly large), I am caching the results of a search, because the results come from an external search service. Each result set can be 400-1500 characters long.
Since the results come in as an array, I use json_encode (faster than serialize) to store them in the database. But at ~1.5 KB per entry, and with potentially 10,000 entries, that's ~15 MB, which seems a little large to me.
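To make the size math concrete, here is a minimal sketch in Python (the PHP equivalent of `json.dumps` is `json_encode`). The entry shape is invented purely for illustration; real search results will differ:

```python
import json

# Hypothetical cached search-result array, roughly the shape described above.
entry = {
    "query": "example search",
    "results": [
        {"title": f"Result {i}", "url": f"https://example.com/{i}"}
        for i in range(15)
    ],
}

encoded = json.dumps(entry)        # PHP equivalent: json_encode($entry)
per_entry = len(encoded.encode())  # bytes for one cached entry
total_mb = per_entry * 10_000 / 1_000_000  # rough total for 10,000 entries
print(per_entry, round(total_mb, 2))
```

For entries in the 400-1500 byte range, the 10,000-entry total lands anywhere from ~4 MB to ~15 MB, so the 15 MB figure is really the worst case.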
My questions are:
* Is this an acceptable (your opinion) size per entry?
* Will running GZip (or similar) on the data and storing it in a binary field in MySQL be more efficient, or will it cost too much CPU time in the end? Is there a standard approach for this?
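As a rough sketch of the compression tradeoff, the snippet below compresses an invented ~1.5 KB JSON payload with zlib (PHP's `gzcompress` uses the same algorithm) and times it. The payload and sizes here are illustrative assumptions, not measurements from the real plugin; the compressed bytes would go into a BLOB/VARBINARY column:

```python
import json
import time
import zlib

# Hypothetical ~1.5 KB JSON payload, like one cached search entry.
payload = json.dumps(
    {"results": [{"title": f"Result {i}", "url": f"https://example.com/page/{i}"}
                 for i in range(25)]}
).encode()

start = time.perf_counter()
compressed = zlib.compress(payload, 6)  # PHP equivalent: gzcompress($json, 6)
elapsed = time.perf_counter() - start

# Repetitive JSON text typically compresses well; CPU cost per entry
# at this size is usually far below the cost of the external search call.
print(len(payload), len(compressed), f"{elapsed * 1e6:.0f}us")

# Round-trip on read: PHP equivalent is json_decode(gzuncompress($blob), true)
restored = json.loads(zlib.decompress(compressed))
```

The usual reasoning is that compressing a kilobyte or two takes microseconds, so the CPU cost is negligible next to the network round-trip the cache is avoiding; the question is mostly whether the storage saving is worth the extra code.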
I'd prefer not to use memcached or the like, since this needs to stay portable (though would that actually be better?). This is mostly a theory question for me; I just want some input before I implement anything solid.