Yes, that's more than an accepted practice; it's a recommended one when dealing with remote resources.
There are general-purpose cache managers in PEAR and other libraries, but in your case a simple homemade solution would work just as well. Something like this:
function get_xml($url, $max_age)
{
    $file = '/path/to/cache/dir/' . md5($url);
    if (file_exists($file)
        && filemtime($file) >= time() - $max_age)
    {
        // The cache file exists and is fresh enough
        return simplexml_load_file($file);
    }
    // Cache miss or stale copy: fetch the remote resource and refresh the cache
    $xml = file_get_contents($url);
    if ($xml === false)
    {
        // Download failed: don't poison the cache, fall back to a stale copy if we have one
        return file_exists($file) ? simplexml_load_file($file) : false;
    }
    file_put_contents($file, $xml);
    return simplexml_load_string($xml);
}
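Just to show how it's meant to be called (the feed URL, the 10-minute max age and the RSS-style structure below are made-up example values):

// Serve the feed, hitting the network at most once every 10 minutes
$feed = get_xml('http://example.com/feed.xml', 600);
if ($feed !== false)
{
    // e.g. for an RSS feed, list the item titles
    foreach ($feed->channel->item as $item)
    {
        echo $item->title, "\n";
    }
}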
Come to think of it, you could use copy() to retrieve the resource. In most cases it wouldn't make any difference, but it's slightly gentler on PHP's memory manager if the external resource happens to be very big. But even then, if you're loading a huge XML document into memory you have bigger problems than the way you download it :)
function get_xml($url, $max_age)
{
    $file = '/path/to/cache/dir/' . md5($url);
    if (!file_exists($file)
        || filemtime($file) < time() - $max_age)
    {
        // The cache file doesn't exist or isn't fresh enough: refresh it
        copy($url, $file);
    }
    return simplexml_load_file($file);
}
Oh, and I almost forgot: there's a better, easier way to do this if you have access to some cron facility. Just set up a cron job that unconditionally downloads the remote resource every 5 or 10 minutes, then let your PHP script unconditionally read from the cache file and not bother with the remote resource at all. This way there's no "worst case" scenario in terms of latency; otherwise, every time your script refreshes the cache, it makes the user wait noticeably longer than when the data is served straight from the cache.
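To sketch what that might look like (the schedule, the paths and the URL here are only placeholders, adapt them to your setup): a crontab entry along the lines of

# Refresh the feed cache every 5 minutes
*/5 * * * * /usr/bin/php /path/to/refresh_feed.php

and a small refresh script such as

<?php
// refresh_feed.php -- run by cron, refreshes the cache unconditionally
$url  = 'http://example.com/feed.xml';            // example URL
$file = '/path/to/cache/dir/' . md5($url);
$xml  = file_get_contents($url);
if ($xml !== false)
{
    // Only overwrite the cache when the download actually succeeded
    file_put_contents($file, $xml);
}

Your web-facing script then boils down to a single simplexml_load_file() on the cache file, with no network access at all.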