views: 243
answers: 4
I wrote a rather small skeleton for my web apps and thought that I would also add a small cache for it.

It is rather simple:

  • If the current page exists as a file in the cache and the file isn't too old, read it out and exit instead of rebuilding the page

  • If the current page isn't cached, or the cached file is outdated, rebuild the page and save it (a minimal sketch follows this list)
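
For reference, a minimal sketch of that flow (the cache path, the 300-second max age and buildPage() are made-up placeholders, not code from the skeleton):

    <?php
    // File-based page cache: serve the stored copy if it is fresh enough,
    // otherwise rebuild the page, store it and send it.
    $cacheFile = '/tmp/cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
    $maxAge    = 300; // seconds a cached page counts as fresh

    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
        readfile($cacheFile);   // cache hit: send the file and stop
        exit;
    }

    $html = buildPage();        // hypothetical: runs the query and the template
    file_put_contents($cacheFile, $html);
    echo $html;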

However, there's a problem:

  • My performance tests on a page that loads 40 relatively long posts via a MySQL query showed that, with the cache enabled, handling a single request actually took longer (1,000 runs each; see the timing sketch below)

  • How can that happen?

  • How can running a MySQL query, looping through the results once, passing them to the template and then looping through them a second time be faster than a filemtime() check and a file read?

  • Should I just remove the raw-PHP cache entirely and rely on an existing PHP cache such as memcached?
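
Roughly, the kind of timing loop I mean (renderFromCache() and renderFromMysql() are just stand-ins for the two code paths, not my actual code):

    <?php
    // Time 1000 iterations of each code path with microtime().
    $iterations = 1000;

    $start = microtime(true);
    for ($i = 0; $i < $iterations; $i++) {
        renderFromCache();      // filemtime() check + file read
    }
    $cacheTime = microtime(true) - $start;

    $start = microtime(true);
    for ($i = 0; $i < $iterations; $i++) {
        renderFromMysql();      // query, loop, template, loop again
    }
    $mysqlTime = microtime(true) - $start;

    printf("cache: %.4fs  mysql: %.4fs\n", $cacheTime, $mysqlTime);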

A: 

I think not.

The slowest part of the application is probably the database traffic.

If you include a cached HTML/PHP file, it should be faster.

FrankBr
You think not? I tested it, and a MySQL query actually is faster in this case; that's the problem. I even tested it with more than 200 database entries, each containing about 100 characters spread over a varchar and a text field.
adsads
If you could post some code, it would be easier to answer.
Sinan
If a DB request doesn't take longer than reading from the cache, then serving from the cache will not be faster, no matter what.
mattbasta
A: 

I don't think so; there seems to be some other problem in your implementation. Here are a couple of good resources on caching:

http://www.mnot.net/cache_docs/

http://blog.digitalstruct.com/2008/02/27/php-performance-series-caching-techniques/

Sarfraz
+2  A: 

Premature optimization is the root of all evil. If you don't need a cache, don't use a cache.

That being said, if you are content not to serve dynamic content on every request, you might want to look into using a caching proxy such as Varnish and cutting out PHP and the webserver entirely. There's quite a bit of overhead involved before you even reach your first line of PHP, and serving static files through PHP is a little dirty.
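
A minimal sketch of what that looks like from the PHP side, assuming a caching proxy such as Varnish sits in front of the webserver (the 300-second lifetime is an arbitrary example, and buildPage() is a hypothetical page builder):

    <?php
    // Send caching headers so a proxy in front of the webserver may store
    // the response; repeat requests then never reach PHP or MySQL at all.
    header('Cache-Control: public, max-age=300, s-maxage=300');
    header('Last-Modified: ' . gmdate('D, d M Y H:i:s') . ' GMT');

    echo buildPage();   // hypothetical page builder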

If you just want to cache elements, something like memcached or APC's cache is the way to go. APC has the advantage of being more readily available (you should have APC installed on your servers for the opcode cache if you care at all about performance), and memcached has the advantage of giving you a cache that's accessible by multiple webservers (and/or multiple caches).
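
For element caching, a sketch along these lines (the key name, the 60-second TTL and the two helper functions are placeholders, not a real API beyond APC's apc_fetch/apc_store):

    <?php
    // Cache only the expensive element (the query result) in APC.
    $posts = apc_fetch('front_page_posts', $hit);

    if (!$hit) {
        $posts = fetchPostsFromMysql();             // hypothetical DB helper
        apc_store('front_page_posts', $posts, 60);  // keep for 60 seconds
    }

    renderTemplate($posts);                         // hypothetical template call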

Daniel Papasian
A: 

It's possible:

If you are using APC and the MySQL query cache (on by default), then your PHP code is already compiled and stored as opcodes in APC, and if you hit the same query repeatedly, the MySQL query cache will cache the database results too. In that case most of the data comes from memory, so your file read could well be slower. The real benefit of your approach is saving MySQL connections, not necessarily performance.

Using a caching proxy like Squid would be the ideal solution in your case, so that the page is served directly from the cache until it expires. Another optimization would be to cache the MySQL result in memcache, which is better than hitting MySQL, since the query cache is small. Lastly, if you really want to cache the generated markup, you could use output buffering (ob_start) and store the output directly in memcache.
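
A sketch of that last idea, assuming the Memcache extension (host, port, key, the 300-second lifetime and buildPage() are illustrative assumptions):

    <?php
    // Render the page once, capture the markup with output buffering,
    // and keep it in memcache for subsequent requests.
    $memcache = new Memcache();
    $memcache->connect('localhost', 11211);

    $key  = 'page:' . md5($_SERVER['REQUEST_URI']);
    $html = $memcache->get($key);

    if ($html === false) {
        ob_start();                  // capture everything echoed below
        buildPage();                 // hypothetical code that echoes the page
        $html = ob_get_clean();      // stop buffering, grab the markup
        $memcache->set($key, $html, 0, 300);   // store for 5 minutes
    }

    echo $html;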

jayadev