I have a C# app in which a server and several clients, running on different machines, communicate over sockets.
Most of the time, the server detects a disconnect correctly when it receives 0 bytes from the sock.Receive(...) call. But when there is a hardware failure (say, a network cable is unplugged), there's a problem: the server thread handling that client continues to block on sock.Receive(...) because it has no way of knowing the connection is gone. I was going to add a heartbeat message to detect this, but I wanted to test it in dev first.
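For reference, here is a simplified version of the server-side read loop; HandleClient, ProcessMessage, and the buffer size are placeholders, not my actual code:

```csharp
using System.Net.Sockets;

// Simplified server-side read loop (names are illustrative).
void HandleClient(Socket sock)
{
    var buffer = new byte[4096];
    while (true)
    {
        // Blocks until data arrives. A graceful close returns 0 bytes,
        // but a pulled cable leaves this call blocked indefinitely.
        int read = sock.Receive(buffer);
        if (read == 0)
            break; // clean disconnect: client closed the connection

        ProcessMessage(buffer, read); // placeholder for message handling
    }
}
```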
But I'm not sure how to test this case without an actual hardware failure. Even when I just kill the client process, the socket still disconnects gracefully (that is, the server reads 0 bytes), presumably because the OS closes the socket on the dead process's behalf. It's only when I physically unplug the client machine from the network that I see the problem.
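For context, this is roughly the heartbeat detection I plan to add on the server side once I can reproduce the failure; the client would send a small heartbeat message every few seconds, and the timeout value and handler names below are placeholders:

```csharp
// Treat a receive timeout as a dead link. The 10-second timeout and
// the handler names are placeholders, not a final design.
sock.ReceiveTimeout = 10000; // ms; set longer than the heartbeat interval

try
{
    int read = sock.Receive(buffer);
    if (read == 0)
        HandleGracefulDisconnect(); // placeholder: normal close
}
catch (SocketException ex) when (ex.SocketErrorCode == SocketError.TimedOut)
{
    // Nothing arrived within the window, not even a heartbeat:
    // assume the connection is dead and clean up.
    HandleDeadConnection(); // placeholder
}
```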
Is there any way that I can simulate this issue in dev?