views: 2132
answers: 3

Do any of the currently popular browsers have particular problems caching* XMLHttpRequest responses that I need to be aware of?

I'd like to be able to include XMLHttpRequest queries on every page as a method of dynamically loading content (i.e. JSON) or behaviour (like eval()ed JavaScript) relevant to the type of page, but I wanted to make sure that the resources it receives from the server could be cached, provided the server sent the right headers.

I was concerned to read this article, which mentions that browsers such as Firefox 1.1 do not cache any content obtained via XMLHttpRequest, and that such a browser always requests that the data be sent again in full (sending Cache-Control and no If-Modified-Since) regardless of the headers sent by the server.

Obviously that article is very old - I don't even remember a Firefox 1.1 - so what considerations do I need to make for current popular browsers, and is there any trick for when I specifically want responses to be cached?

*To clarify my question, by caching, I mean client-side caching, where the server issues freshness information (in the form of a Cache-Control: max-age directive or an Expires: header) and the browser stores a copy of the response in its cache along with an expiry date, so that future requests for the same resource issued from subsequent pages can be satisfied from the browser cache without the need for any contact with the server at all. All major browsers do this correctly for most content, but I've heard that Firefox cannot do this for XMLHttpRequest content. What I'm asking is if anyone knows of cases where any of the modern browsers do not cache responses according to the spec when using XMLHttpRequest.
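To give a concrete (purely illustrative) example of the kind of freshness information meant here, a cacheable response might carry headers like these - the values and date are made up:

```http
HTTP/1.1 200 OK
Content-Type: application/json
Cache-Control: public, max-age=3600
Expires: Thu, 01 Dec 2010 16:00:00 GMT
```

Here max-age=3600 tells the browser it may reuse its stored copy for an hour without contacting the server; Expires is the older, absolute-date equivalent.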

+4  A: 

Although some browsers have different defaults (by default, IE will cache the results of AJAX requests, but Firefox will not), all browsers that I'm aware of will obey the HTTP caching headers, such as Cache-Control. So just set the caching headers correctly for your application.

Here is an example:

    // ASP.NET MVC controller action
    public ActionResult SomeAction()
    {
        var model = [...];
        // Instruct the browser not to cache this response
        Response.AddHeader("Cache-Control", "no-cache");
        return Json(model);
    }

Now IE and Firefox will both behave the same; they will never cache the results of the action.

Craig Stuntz
I wouldn't be so sure. IE6's cache is too aggressive; I've been bitten by it several times. If you don't want queries to be cached, add a random postfix to the URL (an unused parameter is OK).
Javier
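A sketch of the workaround Javier describes, in plain JavaScript (the function name and the parameter name `_` are arbitrary choices, not anything the thread specifies):

```javascript
// Append a throwaway query parameter so that every request has a unique
// URL and can never be answered from the cache; the server simply
// ignores the extra parameter.
function bustCache(url) {
  var separator = url.indexOf('?') === -1 ? '?' : '&';
  return url + separator + '_=' + new Date().getTime();
}
```

Note that this is the opposite of what the question asks for: it defeats caching entirely rather than making caching work.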
Unfortunately this is the opposite of what I need - I'd like to make sure the response _is_ cached. That is, if the browser can satisfy the request with a previous response from its cache that hasn't expired, I'd like it to do so, avoiding any request to the origin server at all.
thomasrutter
Javier, that will work, but it delegates cache policy to the client, which is IMHO the wrong place for it.
Craig Stuntz
thomas, I understand your needs, but I can't set policy for you. Do read the link I included. Look at the public and max-age values.
Craig Stuntz
Sorry, I probably haven't made myself clear enough. I am fully aware of the HTTP specification, including all the ways for the server to specify freshness information. However, not all browsers obey all of it, and I wanted to know whether any had problems specifically with XMLHttpRequest.
thomasrutter
A: 

I have had a few experiences with earlier versions of IE (5.5 - 6) caching AJAX GETs (I was using Prototype JS Ajax.Request) and found that changing the request type to POST solved the problem.

I have also read in several places that, to be on the safe side or if there is any uncertainty, you should use POST instead.

You can use Firebug to determine if the content is being reloaded or not.
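A sketch of the same idea with a bare XMLHttpRequest (the helper name `postQuery` is mine; Prototype's Ajax.Request does the equivalent when given a POST method):

```javascript
// Issue the query as a POST rather than a GET, so that old IE versions
// will not serve a stale cached copy: POST responses are not cached.
function postQuery(url, body, onDone) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', url, true);
  xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) onDone(xhr.status, xhr.responseText);
  };
  xhr.send(body);
  return xhr;
}
```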

karim79
Using POST to control caching behavior is a bad design, I think. When cache policy is important (usually), caching behavior should be defined by the server, not by the client (browser).
Craig Stuntz
POST is only important if a) the request will perform an action on the server, or b) the request returns sensitive data that you want to protect from CSRF
roryf
We're talking about getting the latest of whatever it is from the server. Maybe an action is performed? And what problems could substituting a GET with a POST cause, other than breaking the HTTP specification? I wouldn't call breaking a spec to make life easier, and without side effects, bad design.
karim79
Karim, there is a *correct* way to solve this problem.
Craig Stuntz
Craig, it might be correct but if it's less reliable than the hack, then it's an inferior solution.
karim79
Why do you think the correct solution is unreliable, but the bad design will work forever?
Craig Stuntz
Nothing works forever, even cold November rain :)
karim79
+6  A: 

Mark Nottingham has an excellent set of functional tests that demonstrate browser XMLHttpRequest caching behaviour. Load up the page in the browsers you want to support and work out what techniques you can and cannot rely on to have your response cached.

Simon Lieschke
Thanks very much for that. It's looking quite promising, as Firefox did pass those 'freshness' tests (including all the ones with 'Expire').
thomasrutter