I have a Java application with three threads, each of which opens a socket and connects to a server on a different port. I call setSoTimeout() on each of these sockets after the connection to the server is established. After that the threads block waiting on read(). Only one of the threads times out after 20 seconds (the timeout I set); the other two ignore the timeout. Is it possible that the TCP layer handles only one timeout at a time? Is there any other explanation?
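Roughly, each thread does the following (a simplified sketch, not my actual code; the host and ports are placeholders):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class TimeoutDemo {
    public static void main(String[] args) {
        int[] ports = {9001, 9002, 9003};                 // placeholder ports
        for (final int port : ports) {
            new Thread(new Runnable() {
                public void run() {
                    readWithTimeout("server.example.com", port);  // placeholder host
                }
            }).start();
        }
    }

    static void readWithTimeout(String host, int port) {
        Socket socket = null;
        try {
            socket = new Socket(host, port);              // connect first
            socket.setSoTimeout(20000);                   // then set the 20-second read timeout
            InputStream in = socket.getInputStream();
            byte[] buf = new byte[1024];
            int n = in.read(buf);                         // each thread blocks here
            System.out.println("port " + port + ": read " + n + " bytes");
        } catch (SocketTimeoutException e) {
            System.out.println("port " + port + ": read timed out");  // only one thread ever reports this
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try { if (socket != null) socket.close(); } catch (IOException ignored) {}
        }
    }
}
```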
A:
The documentation says:
The option must be enabled prior to entering the blocking operation to have effect.
Maybe you should set it before the connection to the server is established, or at least before calling read() on the socket.
But it's hard to say without the code...
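Something along these lines, for example (the host, port, and timeout values are just placeholders):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class SetTimeoutFirst {
    public static void main(String[] args) throws IOException {
        Socket socket = new Socket();
        socket.setSoTimeout(20000);      // enable the read timeout before anything blocks
        socket.connect(new InetSocketAddress("example.com", 9001), 5000); // placeholder host/port
        InputStream in = socket.getInputStream();
        try {
            int b = in.read();           // should throw SocketTimeoutException after 20 s with no data
            System.out.println("first byte: " + b);
        } catch (SocketTimeoutException e) {
            System.out.println("read timed out as expected");
        } finally {
            socket.close();
        }
    }
}
```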
Carlos Heuberger
2009-08-20 13:46:48
+1
A:
I've had several problems in the past dealing with SO_TIMEOUT on Windows. I believe setting it is "supposed" to configure the underlying OS socket implementation, which can be OS-dependent and may conflict with registry settings and the like.
My advice is not to rely on SO_TIMEOUT to force an exception on a timeout. Use either non-blocking I/O, or check that bytes are available() before you read().
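For example, a rough sketch of the available() approach (the poll interval and deadline handling are just illustrative):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.Socket;

public class PollForData {
    // Returns the number of bytes read into buf, or -1 if nothing arrived before the deadline.
    static int readWithDeadline(Socket socket, byte[] buf, long timeoutMillis)
            throws IOException, InterruptedException {
        InputStream in = socket.getInputStream();
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (in.available() > 0) {
                return in.read(buf);     // won't block: data is already buffered
            }
            Thread.sleep(50);            // poll interval, tune as needed
        }
        return -1;                       // our own timeout, no exception thrown
    }
}
```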
Nick
2009-08-20 20:43:10