tags:
views: 80
answers: 3

I have some small sets of data from the database (MySQL) that are seldom updated.
Basically 3 or 4 small two-dimensional arrays (50-200 items).
This is an ideal case for memcached, but I'm on a shared server and can't install anything.
I only have PHP and MySQL.

I'm thinking about storing the arrays in a file and regenerating the file via a cron job every 2-3 hours.

Any better idea or suggestion about this approach?
What's the best way to store those arrays?

+1  A: 

As said in the comments, it would be better to check first whether the root of the problem can be fixed. A round trip that long sounds like a network configuration problem.

Otherwise, if the DB simply is that slow, nothing speaks against a filesystem-based cache. You could turn each query into an md5() hash and use that as a file name. serialize() the result set into the file and fetch it from there. Use filemtime() to determine whether the cache file is older than x hours. If it is, re-run the query - or, to avoid locking problems on the cache files, regenerate it from a cron job.

Just note that this way, you would be dealing with whole result sets that you have to load into your script's memory all at once. You wouldn't have the advantage of being able to query a result set row by row. This can be done too in a cached way, but it's more complicated.
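A minimal sketch of that file cache, assuming a temp-dir location and a two-hour TTL (the `$runQuery` callable stands in for the real MySQL call):

```php
<?php
// Cache a query's full result set in a file keyed by md5() of the SQL.
function cached_query(string $sql, callable $runQuery, int $ttl = 7200): array
{
    $file = sys_get_temp_dir() . '/qcache_' . md5($sql) . '.ser';

    // Serve from cache if the file exists and is younger than $ttl seconds.
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return unserialize(file_get_contents($file));
    }

    // Otherwise run the real query and refresh the cache atomically
    // (write to a temp file, then rename) to avoid partial reads.
    $rows = $runQuery($sql);
    $tmp  = $file . '.' . uniqid('', true);
    file_put_contents($tmp, serialize($rows));
    rename($tmp, $file);

    return $rows;
}

// Usage: $runQuery would normally hit MySQL; a stub is used here.
$rows = cached_query('SELECT id, name FROM items', function ($sql) {
    return [['id' => 1, 'name' => 'foo'], ['id' => 2, 'name' => 'bar']];
});
```

The atomic rename sidesteps the locking problem mentioned above: readers only ever see a complete old file or a complete new one.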

Pekka
I think I'll use a hybrid of your, Carlos's, and Josh's answers :-)
The Disintegrator
+1  A: 

My English is not good, sorry.

I have sometimes read about alternatives to memcache. It's more complex, but I think you can access shared memory via http://www.php.net/manual/en/ref.sem.php.

A simple class example used for storing data is here: http://apuntesytrucosdeprogramacion.blogspot.com/2007/12/php-variables-en-memoria-compartida.html

It's written in Spanish, sorry, but the code is easy to understand (Eliminar = delete).

I have never tested this code, and I don't know whether it's viable on a shared server.
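A hedged sketch of the shared-memory idea, using PHP's sysvshm extension (shm_attach / shm_put_var / shm_get_var from the page linked above). The segment key and size are arbitrary assumptions, and on a shared host the extension may well be disabled, so the functions check for it first:

```php
<?php
// Store a value in a System V shared memory segment, keyed by name.
function shm_cache_set(string $name, $value): bool
{
    if (!function_exists('shm_attach')) {
        return false; // sysvshm not compiled in
    }
    $shm = shm_attach(0xCAFE, 65536, 0600); // key, size, permissions (assumed)
    $ok  = shm_put_var($shm, crc32($name), $value);
    shm_detach($shm);
    return $ok;
}

// Read a value back, or null if it is absent or shm is unavailable.
function shm_cache_get(string $name)
{
    if (!function_exists('shm_attach')) {
        return null;
    }
    $shm = shm_attach(0xCAFE, 65536, 0600);
    $key = crc32($name);
    $val = shm_has_var($shm, $key) ? shm_get_var($shm, $key) : null;
    shm_detach($shm);
    return $val;
}
```

Unlike the file cache, the data here vanishes on reboot, and shared hosts often forbid or limit SysV IPC, so treat this as the less portable option.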

Carlos
My English must not be very good either, judging by how many comments I had to post before people understood what I was talking about... I'll look at it later; right now I need some sleep. Thanks.
The Disintegrator
+1  A: 

If you're working with an overworked MySQL server then yes, cache that data into a file. Then you have two ways to update your cache: either via a cron job, unconditionally, every N minutes (I wouldn't update it less frequently than every hour), or every time the data changes. The best approach depends on your specific situation. In general, the cron job way is the simplest, but the on-change way pretty much guarantees that you won't ever use stale data.

As for the storage format, you could just serialize() the array and save the string to a file. With big arrays, unserialize() is faster than a big array(...) declaration.
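A short sketch of that storage format (file paths are assumptions), plus the var_export() + include alternative, which lets an opcode cache keep the parsed array in memory:

```php
<?php
$data = ['fruits' => ['apple', 'pear'], 'counts' => [3, 5]];

// Option 1: serialize() the array into a plain file...
$cacheFile = sys_get_temp_dir() . '/arrays.ser'; // assumed path
file_put_contents($cacheFile, serialize($data));

// ...and load it back later with unserialize().
$restored = unserialize(file_get_contents($cacheFile));

// Option 2: write it as PHP source and include it.
$phpFile = sys_get_temp_dir() . '/arrays.php'; // assumed path
file_put_contents($phpFile, '<?php return ' . var_export($data, true) . ';');
$restored2 = include $phpFile;
```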

Josh Davis
Thanks for the tips, I'll look into that
The Disintegrator