I'm retrieving images from a web server directory like this:

        WebClient webClientImgDownloader = new WebClient();
        webClientImgDownloader.OpenReadCompleted += new OpenReadCompletedEventHandler(webClientImgDownloader_OpenReadCompleted);
        if(uriIndex < uris.Count())
            webClientImgDownloader.OpenReadAsync(new Uri(uris[uriIndex], UriKind.Absolute));

But I've noticed that if I remove the image from the server, Silverlight continues to retrieve it as if it were still there.

When I then type the image URL into Firefox, I see the image as well; but when I click Reload, it gives me the appropriate error that the image doesn't exist. When I then run my Silverlight application again, it also correctly reports that the image doesn't exist, as if the browser had cleared a cache flag somewhere.

How can I do a "refresh" via WebClient in code, so that if an image suddenly no longer exists on the server, Silverlight doesn't keep giving me a cached copy of it?

+5  A: 

This is a tricky one, as the caching is usually caused by the website's response headers not specifying no-cache. I've found that in the past the easiest way to deal with these caching issues is simply to append a randomised query string parameter, so that the web server (and every cache in between) treats each request as a fresh one.

If you're currently requesting www.domain.com/image.jpg, then try www.domain.com/image.jpg?rand=XXXX, where XXXX is a random value generated in your server side code.
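A minimal sketch of that idea applied to the code in the question (assuming the random value can just as well be generated in the Silverlight client, and that the URI in `uris[uriIndex]` has no existing query string):

```csharp
// Sketch: append a cache-busting query parameter before requesting the image,
// so every request looks unique to the server and any intermediate caches.
WebClient webClientImgDownloader = new WebClient();
webClientImgDownloader.OpenReadCompleted += webClientImgDownloader_OpenReadCompleted;

if (uriIndex < uris.Count())
{
    // Any unique value works here; Guid.NewGuid() or DateTime.Now.Ticks are common choices.
    string freshUri = string.Format("{0}?rand={1}", uris[uriIndex], Guid.NewGuid());
    webClientImgDownloader.OpenReadAsync(new Uri(freshUri, UriKind.Absolute));
}
```

If the URL may already contain a query string, switch the `?` separator to `&` accordingly.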

Brian Scott
The problem with a randomised query string is that it will always force a download of the resource, because from HTTP's point of view it's a different resource; there is no way to determine that the currently cached version is up to date and can be reused. This can detrimentally affect the performance of an application.
AnthonyWJones
+1 for the randomised query string idea. As long as performance isn't an issue, as noted, this would seem to force an accurate response every time. You could also make it a setting: while you are developing, include the randomised string, then when your site goes live and the data is stable, remove it for better performance.
Edward Tanguay
Edward, that's exactly what I tend to do. I generally use String.Format("{0}{1}", url, (HttpContext.Current.IsDebuggingEnabled) ? randomParam : String.Empty). I've found this helps me get around IIS caching of non-ISAPI-registered elements such as js files etc.
Brian Scott
@Brian: If it works, it works. Personally I find it an ugly hack worth avoiding if possible. However, you are right: if you don't have control of the image's source site, and you don't trust the site's own caching spec (or it hasn't provided one), then you may have to resort to this.
AnthonyWJones
+2  A: 

You need to decide what your caching policy is for the various content on your site.

If you must make sure that the latest state is presented whenever a request is made, ensure that the server configures the response headers appropriately. In this case, make sure you have the header Cache-Control: max-age=0 specified on the image (or, more likely, on the folder holding the set of images).

By setting max-age=0 you cause the browser to attempt to re-fetch the image, but it will inform the server about any version of the image it already has in its cache. This gives the server the opportunity to respond with status 404 because the image has been deleted; 304 because the image is still there and hasn't changed, so the cached version may be used; or 200 because the image has changed, in which case the response carries the new version.
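If the images happen to be hosted on IIS 7 or later (an assumption; the question doesn't say which server serves them), one way to get that header is a web.config placed in the folder holding the images. A sketch:

```xml
<!-- web.config in the image folder (IIS 7+): sends Cache-Control: max-age=0
     on static content, so browsers revalidate the image on every request -->
<configuration>
  <system.webServer>
    <staticContent>
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>
```

Other servers have equivalent knobs (e.g. header directives in Apache); the point is the same Cache-Control header, however it is produced.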

AnthonyWJones
This kind of assumes that he has control over the server he is requesting from. Given that he is using a WebClient to get the images, I'd imagine it's more likely he's pulling them down specifically because they are remote / outwith his control?
Brian Scott