I am batch uploading products to a database.

I am downloading the images from URLs on the site to be used for the products.

The code I've written works fine for the first 25 iterations (always that number, for some reason), but then throws a System.Net.WebException: "The operation has timed out".

        if(!File.Exists(localFilename))
        {
            using (WebClient Client = new WebClient())
            {
                Client.DownloadFile(remoteFilename, localFilename);
            }
        }

I checked the remote URL it was requesting, and it is a valid image URL that returns an image.

Also, when I step through it with the debugger, I don't get the timeout error.

HELP! ;)

+3  A: 

If I were in your shoes, here are a few possibilities I'd investigate:

  • if you're running this code from multiple threads, you may be bumping up against the System.Net.ServicePointManager.DefaultConnectionLimit property. Try increasing it to 50-100 when you start up your app (see the one-liner after this list). Note that I don't think this is your problem, but trying it is easier than the other stuff below. :-)
  • another possibility is that you're swamping the server. This is usually hard to do with a single-threaded client, but it's possible since other clients may be hitting the server as well. Because the problem always happens at exactly #25, though, this seems unlikely; you'd expect to see more variation.
  • you may be running into a problem with keep-alive HTTP connections backing up between your client and the server. This also seems unlikely.
  • the hard cutoff of 25 makes me think this may be a proxy or firewall limit, either on your end or the server's, where more than 25 connections from one client IP to one server (or proxy) get throttled.
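
If the first bullet applies, the change is a one-liner at startup. A minimal sketch, assuming you run it before any requests are issued (50 is just an illustrative value from the range above, not a recommendation):

    // Raise the per-host connection limit at app startup, before any HTTP calls.
    // 50 is an arbitrary starting point; tune it for your workload.
    System.Net.ServicePointManager.DefaultConnectionLimit = 50;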

My money is on the last one, given that it always breaks at a nice round number of requests and that stepping through in the debugger (aka slower!) doesn't trigger the problem.

To test all this, I'd start with the easy thing: stick in a delay (Thread.Sleep) before each HTTP call, and see if the problem goes away. If it does, reduce the delay until the problem comes back. If it doesn't, increase the delay up to a large number (e.g. 10 seconds) until the problem goes away. If it doesn't go away with a 10 second delay, that's truly a mystery and I'd need more info to diagnose.
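
A quick way to try the delay test, reusing the download loop from the question (the 2000 ms value is just a starting point to tune up or down):

    if (!File.Exists(localFilename))
    {
        System.Threading.Thread.Sleep(2000); // crude throttle before each HTTP call
        using (WebClient Client = new WebClient())
        {
            Client.DownloadFile(remoteFilename, localFilename);
        }
    }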

If it does go away with a delay, then you need to figure out why-- and whether the limit is permanent (e.g. server's firewall which you can't change) or something you can change. To get more info, you'll want to time the requests (e.g. check DateTime.Now before and after each call) to see if you see a pattern. If the timings are all consistent and suddenly get huge, that suggests a network/firewall/proxy throttling. If the timings gradually increase, that suggests a server you're gradually overloading and lengthening its request queue.
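
A rough way to capture those timings, sticking with DateTime.Now as described (writing to the console is just for illustration; log wherever is convenient):

    DateTime start = DateTime.Now;
    using (WebClient Client = new WebClient())
    {
        Client.DownloadFile(remoteFilename, localFilename);
    }
    double ms = (DateTime.Now - start).TotalMilliseconds;
    Console.WriteLine("{0} took {1:F0} ms", remoteFilename, ms);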

In addition to timing the requests, I'd set the timeout of your WebClient calls to be longer, so you can figure out whether the timeout is infinite or just a bit longer than the default. To do this, you'll need an alternative to the WebClient class, since it doesn't expose a timeout. This thread on MSDN Forums has a reasonable alternative code sample.
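
One common shape for such an alternative (a sketch, not necessarily the code from that thread) is a WebClient subclass that overrides GetWebRequest and sets a longer timeout:

    // Sketch: WebClient with a configurable timeout.
    // Assumes "using System;" and "using System.Net;".
    public class TimeoutWebClient : WebClient
    {
        public int TimeoutMilliseconds { get; set; }

        protected override WebRequest GetWebRequest(Uri address)
        {
            WebRequest request = base.GetWebRequest(address);
            request.Timeout = TimeoutMilliseconds; // e.g. 120000 for two minutes
            return request;
        }
    }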

An alternative to adding timing in your code is to use Fiddler:

  • download Fiddler and start it up.
  • set your WebClient code's Proxy property to point to the Fiddler proxy (localhost:8888), as in the snippet after this list.
  • run your app and watch the traffic in Fiddler.
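
Pointing the WebClient at Fiddler might look like this (localhost:8888 is Fiddler's default listening port):

    using (WebClient Client = new WebClient())
    {
        Client.Proxy = new System.Net.WebProxy("localhost", 8888); // route traffic through Fiddler
        Client.DownloadFile(remoteFilename, localFilename);
    }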
Justin Grant
Agreed, it definitely sounds like a connection-limit issue. That might also explain why stepping through in the debugger "fixes" it, because some connections get freed up while the dev is sitting in the debugger.
EricLaw -MSFT-
I'd also recommend the "CurrPorts" utility (http://www.nirsoft.net/utils/cports.html), which displays currently open connections in a nice GUI (or you can use Windows' built-in "netstat" command).
Milan Gardian
+1  A: 

Hello,

it seems that WebClient is not closing the Response object it uses when it's done, which would cause, in your case, many responses to be open at the same time; with a limit of 25 connections on the remote server, you get the 'Timeout' exception. When you debug, the responses opened earlier get closed due to their inner timeout, etc. (I inspected WebClient with Reflector and couldn't find an instruction that closes the response.)

I propose that you use HttpWebRequest & HttpWebResponse so that you can clean up objects after each download:

HttpWebRequest request;
HttpWebResponse response = null;

try
{

   FileStream fs;
   Stream s;
   byte[] read;
   int count;

   read = new byte[256];

   request = (HttpWebRequest)WebRequest.Create(remoteFilename);
   request.Timeout = 30000;
   request.AllowWriteStreamBuffering = false;

   response = (HttpWebResponse)request.GetResponse();
   s = response.GetResponseStream();  

   fs = new FileStream(localFilename, FileMode.Create);   
   // read the response stream in chunks and write each chunk to the file
   while ((count = s.Read(read, 0, read.Length)) > 0)
   {
      fs.Write(read, 0, count);
   }

   fs.Close();
   s.Close();
}
catch (System.Net.WebException)
{
    //....
}
finally
{
   //Close Response
   if (response != null)
      response.Close();
}
najmeddine