views: 1061

answers: 1

I work at a college that uses an intranet based student management solution not developed by us.

Recently, changes were made that forced us to set Internet Explorer to check for a new version of a web page on every visit. Otherwise certain pages would fail to run correctly: we would get stale content, which then caused transaction errors. Basically a pain, especially as it took a while to work out what was happening and to roll out a fix for all users with locked-down accounts.

Anyway, how will this affect caching of web pages? Will content now always be redownloaded, or will the cache still work for the most part?

+1  A: 

It sounds like you don't have control over adding an Expires header to the response sent to the client. If you did, you could explicitly control how long a page is cached on the client. However, the client can override the Expires header by changing a setting in the browser, which is what you did. If you configure the browser to always fetch a new version of the page, then the browser will not cache anything. You can adjust how much IE caches, but the better solution would be to have the server set the Expires header.
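For reference, here is a minimal sketch of what setting such headers looks like on the server side. This is purely illustrative (the asker has said they cannot change the server), and the function name is made up for the example:

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def caching_headers(max_age_seconds):
    """Build response headers telling the browser how long it may
    serve this resource from its local cache before revalidating."""
    expires = datetime.now(timezone.utc) + timedelta(seconds=max_age_seconds)
    return {
        # Legacy HTTP/1.0 header, honoured by older browsers like IE.
        "Expires": format_datetime(expires, usegmt=True),
        # HTTP/1.1 header; takes precedence over Expires when both are set.
        "Cache-Control": "max-age=%d" % max_age_seconds,
    }

# Example policy: let images be cached for a day, but force pages
# to be revalidated on every request.
image_headers = caching_headers(86400)
page_headers = {"Cache-Control": "no-cache"}
```

With headers like these in place, the browser's default cache setting would do the right thing per resource, instead of one global IE setting forcing everything to be refetched.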

Unfortunately we can't adjust headers; it is a locked-down system, at least in terms of the front end. They are blaming it on an Internet Explorer bug with XML files being cached instead of redownloaded; whether that's true I do not know.
PeteT
I am mainly interested in image caching; it seems a waste to turn this option on for one intranet page when it causes all internet images to be redownloaded from then on.
PeteT