I need to cache some xml from a webservice to a local xml file for use by my web application.

There are two web services, one which returns the data, and one which returns the date/time the data last changed.

The XML file size could range from 50 KB to 500 KB.

I currently have an executable, run from the Windows scheduler, that performs the following:

  • Check DateTime last changed from WebService
  • Check DateTime local xml file was last modified
  • Compare dates to determine whether to continue
  • Pull data from webservice into local string
  • Write data to a local file myxml.temp
  • Rename existing live file myxml.xml to myxml.delete
  • Rename myxml.temp file to myxml.xml
  • Delete myxml.delete file
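
The steps above can be sketched roughly like this; the paths are illustrative, and the two delegates stand in for the web-service calls (the real service methods aren't named in the question):

```csharp
using System;
using System.IO;

class CacheUpdater
{
    // getLastChanged / getData are hypothetical stand-ins for the two
    // web services: one returns the last-changed time, one the data.
    public static void Update(Func<DateTime> getLastChanged, Func<string> getData)
    {
        const string live = @"C:\data\myxml.xml";
        const string temp = @"C:\data\myxml.temp";
        const string del  = @"C:\data\myxml.delete";

        // Steps 1-3: compare the remote change time with the local file time
        if (getLastChanged() <= File.GetLastWriteTime(live))
            return;

        // Steps 4-5: pull the data and write it to a temp file first
        File.WriteAllText(temp, getData());

        // Steps 6-8: swap the temp file into place, then delete the old copy
        File.Move(live, del);
        File.Move(temp, live);
        File.Delete(del);
    }
}
```

The window where readers can hit an access-denied error is the gap between the two `File.Move` calls, plus any moment a reader still holds the old file open.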

I perform the initial write using FileInfo.CreateText() and a TextWriter.

I perform the renames and deletes using FileInfo.MoveTo and FileInfo.Delete.
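
One built-in worth knowing about here: `File.Replace` (and the equivalent `FileInfo.Replace`) performs the rename-to-backup and rename-into-place steps in a single call, which narrows the window in which no live file exists. A minimal sketch, with illustrative paths:

```csharp
using System.IO;

string temp   = @"C:\data\myxml.temp";
string live   = @"C:\data\myxml.xml";
string backup = @"C:\data\myxml.delete";

File.WriteAllText(temp, xmlFromService);  // write the new data first
File.Replace(temp, live, backup);         // backup + swap in one call
File.Delete(backup);                      // discard the old copy
```

Note that `File.Replace` requires the destination file to already exist, so the very first run still needs a plain write; and it cannot help if a reader holds the live file open with a lock that blocks renames.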

This is fairly reliable since the renames are quick; however, we still get a number of access-denied errors because the files are still in use.

Does anybody have any advice on a more reliable / resilient way of doing this? Is there anything built in that I may have overlooked for managing a local file like this?

+2  A: 

One option might be to download each update to a new file, with the timestamp in the file name:

c:\downloads\MyXml20090319-1035.xml

Then, in the application that's using the data, you have it pick the latest file from that directory.

You can clean up the older data files at any point, and if they fail to delete now, just retry on the next iteration.

GeekyMonkey