How should I check a (TCP) socket to find out whether it is connected?
I have read about the Socket.Connected property on MSDN, but it says it only reflects the state as of the most recent I/O operation. That doesn't help me, because I want to check the connection before attempting to read from the socket. The remarks section also notes that:
If you need to determine the current state of the connection, make a nonblocking, zero-byte Send call. If the call returns successfully or throws a WSAEWOULDBLOCK error code (10035), then the socket is still connected; otherwise, the socket is no longer connected.
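If I understand those remarks correctly, the check would look roughly like the sketch below. This is my own minimal version, not the documentation's code; it assumes an existing, previously connected Socket, and that comparing SocketException.SocketErrorCode against SocketError.WouldBlock is the right way to test for WSAEWOULDBLOCK:

```csharp
using System.Net.Sockets;

static bool IsStillConnected(Socket socket)
{
    bool wasBlocking = socket.Blocking;
    try
    {
        socket.Blocking = false;
        // Zero-byte send: the buffer is 1 byte (as in the MSDN example,
        // see footnote (1)), but the size argument is 0.
        socket.Send(new byte[1], 0, SocketFlags.None);
        return true; // the Send succeeded, so the socket is still connected
    }
    catch (SocketException e)
    {
        // SocketError.WouldBlock corresponds to WSAEWOULDBLOCK (10035),
        // which also means the socket is still connected.
        return e.SocketErrorCode == SocketError.WouldBlock;
    }
    finally
    {
        socket.Blocking = wasBlocking; // restore the original blocking mode
    }
}
```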
The example on the same page shows how to do this.(1) However, a post by Ian Griffiths says that I should read from the socket rather than send through it.
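I have seen read-based checks along the following lines. This is my guess at what such a probe looks like, not necessarily the code from his post; it relies on Socket.Poll and Socket.Available:

```csharp
using System.Net.Sockets;

static bool AppearsDisconnected(Socket socket)
{
    // If select() reports the socket as readable but there is nothing
    // to read, the remote endpoint has closed (or reset) the connection.
    return socket.Poll(0, SelectMode.SelectRead) && socket.Available == 0;
}
```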
Another post by Pete Duniho says:
... after you've called Shutdown(), call Receive() until it returns 0 (assuming the remote endpoint isn't actually going to send you anything, that will happen as soon as the remote endpoint has received all of your data). Unless you do that, you have no assurance that the remote endpoint has actually received all of the data you sent, even using a lingering socket.
I don't really understand his statement about calling Receive() to make sure that the remote endpoint has actually received all the data I sent. (Do sockets block receiving until the send buffer is empty?) My attempt at following his advice is sketched below.
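Here is how I currently interpret his suggestion (a sketch only, assuming socket is a connected Socket that I have finished sending on, and that the peer closes its side once it has read everything):

```csharp
using System.Net.Sockets;

static void CloseGracefully(Socket socket)
{
    socket.Shutdown(SocketShutdown.Send); // tell the peer we are done sending
    var buffer = new byte[1024];
    // Drain whatever the remote endpoint still sends; Receive() returning 0
    // means the peer has closed its side, which (per the quoted post) only
    // happens after it has received all of our data.
    while (socket.Receive(buffer) > 0)
    {
    }
    socket.Close();
}
```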
I am confused by the different methods proposed. Could you please explain them?
(1) Why does the example for the Socket.Connected property allocate a 1-byte array when it calls Send with a length of 0?