I've written a program that polls data over HTTP; it has to fetch the same URL every 0.25 - 1 second.
However, I've found that many of my requests - a few every minute - time out. It's strange, because when I reload the same address in a web browser, even very frequently, I always get the contents quickly.
This is the code I use:
HttpWebRequest myRequest = (HttpWebRequest)WebRequest.Create(url);
myRequest.Method = "GET";
myRequest.Timeout = 1500;
myRequest.KeepAlive = false;
myRequest.ContentType = "text/xml";
WebResponse myResponse;
try
{
    myResponse = myRequest.GetResponse();
}
catch (Exception e)
{
    return e.Message;
}
StreamReader sr = new StreamReader(myResponse.GetResponseStream(), System.Text.Encoding.UTF8);
string result = sr.ReadToEnd();
sr.Close();
myResponse.Close();
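For context, this is roughly how the fetch is driven. Below is a minimal, self-contained sketch of the polling loop: the Fetch wrapper, the port number, and the local HttpListener stand-in server are my own assumptions for illustration, not the real target server (which I can't share); the request code inside Fetch is the same as above, just wrapped in using blocks so the response is always closed.

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;
using System.Threading;

class PollingSketch
{
    // Hypothetical wrapper around the request code from the question.
    public static string Fetch(string url)
    {
        HttpWebRequest myRequest = (HttpWebRequest)WebRequest.Create(url);
        myRequest.Method = "GET";
        myRequest.Timeout = 1500;
        myRequest.KeepAlive = false;
        try
        {
            // using blocks ensure the response and stream are closed
            // even if ReadToEnd throws.
            using (WebResponse myResponse = myRequest.GetResponse())
            using (StreamReader sr = new StreamReader(
                myResponse.GetResponseStream(), Encoding.UTF8))
            {
                return sr.ReadToEnd();
            }
        }
        catch (Exception e)
        {
            return e.Message;
        }
    }

    static void Main()
    {
        // Local stand-in server so the sketch is self-contained.
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8123/");
        listener.Start();
        ThreadPool.QueueUserWorkItem(_ =>
        {
            while (listener.IsListening)
            {
                try
                {
                    HttpListenerContext ctx = listener.GetContext();
                    byte[] body = Encoding.UTF8.GetBytes("<data/>");
                    ctx.Response.OutputStream.Write(body, 0, body.Length);
                    ctx.Response.Close();
                }
                catch { break; }
            }
        });

        // Poll a few times at a 250 ms interval, as described above.
        for (int i = 0; i < 4; i++)
        {
            Console.WriteLine(Fetch("http://localhost:8123/"));
            Thread.Sleep(250);
        }
        listener.Stop();
    }
}
```

In the real program the loop runs indefinitely, driven by a timer rather than Thread.Sleep.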
And the error I get is:
The operation has timed out
I've tried with KeepAlive set to both true and false.
So the question is: is there any way to reduce the number of timeouts? Every single request is crucial. I'm fairly sure it's related to the fact that I query the same address so often, but surely there's a way to prevent it...?
Thanks a lot!