Hello everyone,

I am using VSTS 2008 + C# + .NET 3.5 to develop a console application that sends requests to another server (IIS 7.0 on Windows Server 2008). Here is my code. As you can see, I use a while loop to read the response from the server chunk by chunk. My question is: does the timeout set by request.Timeout = Timeout * 1000 govern (1) the timeout for opening the connection to the server, (2) the timeout for each individual read operation, or (3) the total time allowed for the whole while loop?

    static void PerformanceWorker()
    {
        Stream dataStream = null;
        HttpWebRequest request = null;
        HttpWebResponse response = null;
        StreamReader reader = null;
        try
        {
            request = (HttpWebRequest)WebRequest.Create(TargetURL);
            request.Timeout = Timeout * 1000;
            request.Proxy = null;
            response = (HttpWebResponse)request.GetResponse();
            dataStream = response.GetResponseStream();
            reader = new StreamReader(dataStream);

            // read up to 10,000 characters at a time
            char[] c = new char[1000 * 10];

            while (reader.Read(c, 0, c.Length) > 0)
            {
                globalCounter++;
            }
        }
        catch (Exception ex)
        {
            lock (counterLock)
            {
                globalFailCounter++;
                Console.WriteLine("Fail Counter: " + globalFailCounter + "\n" + ex.Message + "\n" + ex.StackTrace);
            }
        }
        finally
        {
            if (null != reader)
            {
                reader.Close();
            }
            if (null != dataStream)
            {
                dataStream.Close();
            }
            if (null != response)
            {
                response.Close();
            }
        }
    }

thanks in advance, George

+3  A: 
It is (1): the timeout for opening the connection to the server.
Gary
Then how do I set a timeout for each read operation, and for the whole while loop that reads the response?
George2
The "ReadWriteTimeout" property handles #2. You have to track #3 by yourself.
David
Do you mean ReadWriteTimeout applies to each individual read/write operation in my loop?
George2
I have a related question here; I'd appreciate it if you could take a look: http://stackoverflow.com/questions/1598748/unable-to-connect-to-remote-server-fail-in-httpwebrequest
George2
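
To make the distinction above concrete, here is a minimal sketch. The URL, the specific timeout values, and the Stopwatch-based overall budget are placeholders for illustration, not from the original post: Timeout covers GetResponse(), ReadWriteTimeout covers each individual read on the response stream, and the total time of the loop has to be tracked manually.

    // Sketch only: URL, timeout values, and the Stopwatch-based budget are
    // placeholder assumptions, not taken from the original post.
    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Net;

    class TimeoutSketch
    {
        static void Main()
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/data");
            request.Timeout = 30 * 1000;          // #1: timeout for GetResponse()/GetRequestStream()
            request.ReadWriteTimeout = 10 * 1000; // #2: timeout for each read on the response stream

            Stopwatch stopwatch = Stopwatch.StartNew();
            long overallBudgetMs = 60 * 1000;     // #3: total loop budget, enforced manually

            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                char[] buffer = new char[1000 * 10];
                while (reader.Read(buffer, 0, buffer.Length) > 0)
                {
                    // Enforce the overall time budget by hand, since no built-in
                    // property covers the whole read loop.
                    if (stopwatch.ElapsedMilliseconds > overallBudgetMs)
                    {
                        throw new TimeoutException("Overall read budget exceeded.");
                    }
                }
            }
        }
    }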
+1  A: 

From MSDN:

Timeout is the number of milliseconds that a subsequent synchronous request made with the GetResponse method waits for a response, and the GetRequestStream method waits for a stream. If the resource is not returned within the time-out period, the request throws a WebException with the Status property set to WebExceptionStatus.Timeout.
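
In other words, it is GetResponse() itself that throws when the time-out expires. A minimal sketch of what that looks like in practice (the URL and the timeout value are placeholder assumptions, not from the original answer):

    using System;
    using System.Net;

    class GetResponseTimeoutSketch
    {
        static void Main()
        {
            // Placeholder URL and timeout value for illustration only.
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/slow");
            request.Timeout = 5 * 1000; // applies to GetResponse(), not to later stream reads

            try
            {
                using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                {
                    Console.WriteLine("Got response: " + response.StatusCode);
                }
            }
            catch (WebException ex)
            {
                if (ex.Status == WebExceptionStatus.Timeout)
                {
                    // The response did not start arriving within Timeout milliseconds.
                    Console.WriteLine("GetResponse() timed out.");
                }
                else
                {
                    throw;
                }
            }
        }
    }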

I doubt you can set a timeout for the read operation easily without low-level tricks. All the data you read through the response object comes from the network card's buffer, which is filled at the rate of your available network bandwidth. You would hit a timeout at some point during reading if the buffer is empty and no new data is coming from the sending end point.

P.S. This is more of a comment on @Gary's answer; maybe someone could move it there.

Audrius
Do you mean HttpWebRequest.Timeout is only used for opening the connection?
George2
I have a related question here; I'd appreciate it if you could take a look: http://stackoverflow.com/questions/1598748/unable-to-connect-to-remote-server-fail-in-httpwebrequest
George2
Well, Timeout is used for opening the connection and for any subsequent synchronous request, so calling GetResponse() can time out. I suspect that after GetResponse() completes you already have all the required data in the network (card) buffer. When you perform a Read() operation, you are not actually retrieving bits from the network but from that buffer, so a timeout at this stage is not important. Please note that I am not 100% sure it is that simple; if your response data is bigger than the network buffer, at some stage your read call will probably block until the buffer is filled with data again.
Audrius