I have a service that crawls our website to collect a lot of data. It runs on 5 different threads, fetching the data and putting it in the database, and takes about 3 days (at a guess, as it hasn't yet managed to run the whole way through!). If WebClient.DownloadData fails, the error is logged in the database and otherwise ignored, but at about 3am yesterday morning the service stopped after roughly 2000 failed web requests. They return a remote server 521 exception or an operation timeout. I also have a catch for unexpected exceptions, but execution never reaches it! I have been trying to fix this error for weeks now, but it takes such a long time to get the error to occur! Does anyone have any idea why this keeps happening?
A:
You're going to need to give us more details, I think, and some code fragments.
From your description, you have two problems: one, the web requests start failing with 521 errors, and two, your service stops.
Given that the service stops after a large number of web request failures, I'd look at resource exhaustion of some sort. Are you running out of memory? File handles? Threads? Something else? What does your application do when the web request fails?
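One common source of that kind of exhaustion is not disposing each WebClient, so sockets and handles pile up, and letting an exception escape the worker loop. Below is a minimal sketch of a more defensive per-request pattern; the method names (`DownloadPage`, `CrawlLoop`, `LogError`) and the shape of your loop are assumptions, not code from the question:

```csharp
using System;
using System.Net;

public static class Crawler
{
    // Hypothetical download helper: dispose the WebClient after every
    // request so failed requests cannot leak sockets or handles.
    public static byte[] DownloadPage(string url)
    {
        using (var client = new WebClient())
        {
            return client.DownloadData(url);
        }
    }

    // Hypothetical worker loop: catch WebException per URL so a 521 or
    // timeout on one page does not take down the whole worker thread.
    public static void CrawlLoop(string[] urls)
    {
        foreach (var url in urls)
        {
            try
            {
                byte[] data = DownloadPage(url);
                // ... store data in the database ...
            }
            catch (WebException ex)
            {
                // Log and continue, mirroring the "log it and ignore it"
                // behaviour described in the question.
                LogError(url, ex);
            }
        }
    }

    // Stand-in for the database error logging the question mentions.
    public static void LogError(string url, Exception ex)
    {
        Console.Error.WriteLine("{0}: {1}", url, ex.Message);
    }
}
```

The key point is that both the disposal and the catch happen inside the loop body, so 2000 consecutive failures cost you 2000 log rows rather than 2000 leaked handles or a dead thread.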
Paul
2009-04-27 10:34:24
Thanks for accepting this, but does that mean I pointed you in the right direction? What was the issue?
Paul
2009-05-01 09:09:21
A:
Kinda sounds like a memory leak to me, or perhaps the call stack is just getting too deep and the spider simply runs out of resources. I ran into a problem like this while building very large multidimensional arrays to correlate search words using SVD.
William Edmondson
2009-04-27 12:58:26