Hi,
I have a website which connects to 14-17 XML streams and downloads them on every page load. That was fine for testing purposes and for traffic under 100 visits/day.
However, it has now got to the stage where pages load slower and slower, because each request has to wait for the external XML to download.
What is the best way to cache/work with these files? The data are constantly updated, so rather than downloading the XML streams on every visit, I would like to have some robot fetching them locally every 5 minutes, with visitors reading only from the local copies.
What methods would you use to make sure the XML files don't get locked, and that visitors never see empty or partial results while the new results are being downloaded?
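To make the question concrete, here is a minimal sketch (in Python, as an illustration only; the names `atomic_write` and `refresh_feed` are mine, not from any existing setup) of the kind of fetcher robot I have in mind: download each feed, write it to a temporary file, then rename over the local copy, so readers always see either the complete old file or the complete new file, never a half-written one.

```python
import os
import tempfile
import urllib.request


def atomic_write(dest_path, data):
    """Write bytes to dest_path via a temp file plus rename.

    os.replace is atomic on the same filesystem, so a reader opening
    dest_path never sees a partially written file.
    """
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(os.path.abspath(dest_path)))
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
        os.replace(tmp_path, dest_path)  # atomic swap over the old copy
    except BaseException:
        os.unlink(tmp_path)  # clean up the temp file on failure
        raise


def refresh_feed(url, dest_path, timeout=10):
    """Download one XML stream and atomically update its local copy."""
    data = urllib.request.urlopen(url, timeout=timeout).read()
    if data:  # on an empty response, keep serving the old copy
        atomic_write(dest_path, data)
```

A cron job could call `refresh_feed` for each of the 14-17 streams every 5 minutes; the pages would then only ever read the local files. Is this the right general approach, or is there a better pattern?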
Thanks.