If a TCP server and client are connected, I'd like to determine when the client is no longer connected. I thought I could do this simply by attempting to send a message to the client: once send() returns -1, I can tear down the socket. This implementation works fine on Windows, but the moment I try it on Linux with BSD sockets, the call to send() on the server side causes my server app to crash if the client is no longer connected. It doesn't even return -1; the program just terminates.
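For reference, here is a minimal sketch of the check I'm describing (not my actual code, just the shape of it; `client_fd` is assumed to be a connected socket returned by accept()):

```c
#include <stdio.h>
#include <unistd.h>
#include <sys/socket.h>

static void check_client(int client_fd)
{
    const char probe[] = "ping";

    /* On Windows this returns -1 once the client is gone, and I then
     * tear down the socket. On Linux the whole server process dies at
     * this call instead of send() returning -1. */
    ssize_t n = send(client_fd, probe, sizeof(probe), 0);
    if (n == -1) {
        perror("send");
        close(client_fd);   /* tear down the dead connection */
    }
}
```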
Please explain why this is happening. Thanks in advance!