I want to create a "temporary cache lookup" to speed up a file lookup on my webserver.
What I have now is a folder full of images. When a user requests an image, I use File.Exists(...) to check whether it exists locally; if it doesn't, I download it from another server, and either way I redirect the user to it.
The problem is that a lot of simultaneous requests cause File.Exists() to hang. I would like to keep a quick-and-dirty HashSet of filenames known to be in the local folder, so that if a user requests a file and it exists in the HashSet, I just redirect to it without calling File.Exists(); if it doesn't exist in the HashSet, I do the File.Exists() check and then add it.
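Roughly the pattern I have in mind is below. This is only a sketch; the folder path and the Resolve / DownloadFromOtherServer names are placeholders for what I actually have:

    using System.Collections.Generic;
    using System.IO;

    public static class ImageCache
    {
        // Filenames known to already be in the local image folder.
        private static readonly HashSet<string> KnownFiles = new HashSet<string>();

        // Placeholder path for the local image folder.
        private const string LocalFolder = @"C:\Images";

        public static string Resolve(string fileName)
        {
            // Fast path: filename already seen, so skip File.Exists() entirely.
            if (KnownFiles.Contains(fileName))
                return Path.Combine(LocalFolder, fileName);

            // Slow path: hit the file system, download if missing, then remember it.
            if (!File.Exists(Path.Combine(LocalFolder, fileName)))
                DownloadFromOtherServer(fileName); // placeholder for the existing download step

            KnownFiles.Add(fileName);
            return Path.Combine(LocalFolder, fileName);
        }

        private static void DownloadFromOtherServer(string fileName)
        {
            // existing "download it from another server" logic would go here
        }
    }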
I know that the HashSet would get blown away if the server ever gets restarted, but I'm not worried about that, because it would quickly "rebuild" itself with the most requested images using the above scenario.
The main question is: since this object would be accessed by multiple users, with various requests all adding items to the set, would that cause a problem? Up to now, the only things I've used as static globals on web servers are DB connection strings and an email address to send alerts to.
Edit, with regard to race conditions:
Yes, I was thinking of race conditions. But is a race condition even possible on a single HashSet? Since it only allows unique values, wouldn't the second attempt to add a value simply fail? At which point I would just ignore the error and continue on.
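In other words, something like this is what I was picturing for the add (again just a sketch, using the KnownFiles set from above; whether a duplicate add really "fails" is exactly the part I'm unsure about):

    private static void Remember(string fileName)
    {
        try
        {
            // My assumption: a second attempt to add the same filename just fails,
            // so any error here can be ignored.
            KnownFiles.Add(fileName);
        }
        catch
        {
            // ignore and continue on
        }
    }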