Hi, I'm working on a web app in which each user has multiple non-local XML files that are downloaded and parsed with SimpleXML on each page (re)load. Each request takes a little under a second on average, but with more than five or six files (which is likely) the load time becomes quite noticeable. So my question is: what factors determine the speed of these requests, and are there any ways I can control them and speed things up?
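For reference, here's a simplified sketch of what each page load does (the function name and URL are placeholders, not my real code); splitting the download from the parse should show which of the two is actually slow:

```php
<?php
// Hypothetical sketch: time the download and the parse separately,
// so we can tell which step is the bottleneck.
function fetchAndParse(string $url): array
{
    $t0  = microtime(true);
    $raw = file_get_contents($url);         // network (or stream) read
    $t1  = microtime(true);
    $xml = simplexml_load_string($raw);     // the SimpleXML parse itself
    $t2  = microtime(true);

    return [
        'xml'      => $xml,
        'download' => $t1 - $t0,   // seconds spent fetching
        'parse'    => $t2 - $t1,   // seconds spent parsing
    ];
}

// Placeholder URL; in the real app this is one of the users' remote XML files.
$result = fetchAndParse('data://text/plain,<root><item>42</item></root>');
echo (string) $result['xml']->item, "\n";   // prints "42"
```

In my experience the parse is microseconds and nearly all of the time is the download, but the timings above would confirm that.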
As for purely improving efficiency: the HTTP headers that report last-modified times are inaccurate on these servers, and I don't think cron jobs would give me the 'live' results I'm looking for.
So do the deciding factors lie mainly with the servers I'm fetching from, or on my side? Is it the function itself? Obviously the size of the file affects the speed — is there any way to compress the data before bringing it over, and so speed up the downloads?
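On the compression idea, one thing I've considered (assuming the remote servers honour it and the zlib extension is loaded): send an `Accept-Encoding: gzip` header and decompress the body before handing it to SimpleXML. A sketch, with the HTTP call stubbed out and the gzip round trip simulated locally:

```php
<?php
// Sketch: ask the server for a gzipped response via a stream context,
// then gzdecode() the body before parsing. Assumes the server supports
// gzip encoding; the URL below is a placeholder.
$context = stream_context_create([
    'http' => ['header' => "Accept-Encoding: gzip\r\n"],
]);

// Real call would be something like:
// $body = file_get_contents('http://example.com/feed.xml', false, $context);

// Simulate the compressed transfer locally so the round trip is visible:
$body = gzencode('<root><item>compressed</item></root>');

$raw = gzdecode($body);                  // decompress the HTTP body
$xml = simplexml_load_string($raw);      // parse as usual
echo (string) $xml->item, "\n";          // prints "compressed"
```

Would that actually cut transfer time meaningfully for XML in the 10–100&nbsp;KB range, or is the per-request latency the real cost?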