Why would the following code request an RSS feed from the corresponding Twitter account more than 4 times per hour (because that's what Twitter says it's doing)? In handling this data, I'm using PHP's SimpleXML and referencing $local_file_twitter. The handling script makes no direct requests to the Twitter feed itself. The cache folder has 777 permissions, and the file is being written. Everything works, but Twitter is logging far more hits from my server's IP than ought to be generated by my caching mechanism. Is there anything I'm missing or overlooking?
$local_file_twitter = $_SERVER['DOCUMENT_ROOT'] . '/cache/<!-- Any Given File Name -->.rss';

if (is_file($local_file_twitter)) {
    //Find out how many seconds it has been since the file was last updated
    $time_lapse_twitter = (strtotime("now") - filemtime($local_file_twitter));
    //If it has been more than 15 minutes, update the local feed
    if ($time_lapse_twitter > 900) {
        //Grab the feed from Twitter
        $feed_grab_twitter = file_get_contents('http://twitter.com/statuses/user_timeline/<!-- Twitter ID Goes Here -->.rss');
        //Save Retrieved Feed as Cached File
        file_put_contents($local_file_twitter, $feed_grab_twitter);
    }
} else {
    //Grab the feed from Twitter
    $feed_grab_twitter = file_get_contents('http://twitter.com/statuses/user_timeline/<!-- Twitter ID Goes Here -->.rss');
    //Save Retrieved Feed as Cached File
    file_put_contents($local_file_twitter, $feed_grab_twitter);
}
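
For reference, this is roughly how the handling script reads the cached copy. It's a minimal sketch, assuming the handler simply passes the local path to simplexml_load_file(); the output loop and field names are illustrative, not taken from my actual handling code:

//Parse only the locally cached copy; no remote URL is ever passed to SimpleXML
$twitter_xml = simplexml_load_file($local_file_twitter);

if ($twitter_xml !== false) {
    //Each <item> in the cached RSS feed is a SimpleXMLElement
    foreach ($twitter_xml->channel->item as $item) {
        echo htmlspecialchars((string) $item->title) . "<br />\n";
    }
}

As far as I can tell, nothing in the handler touches twitter.com directly; only the caching block above ever requests the remote feed.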