I have a Java non-blocking server that registers all of its socket channels with a single selector. I then establish 500 connections to the server and send data regularly. Every piece of data the server receives is echoed back to the client.
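For reference, the server loop is essentially the standard single-selector echo pattern. The sketch below is a simplification rather than my actual code: the port number and buffer size are placeholders, and the real version has more error handling.

```java
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;
import java.util.Iterator;

public class EchoServer {
    public static void main(String[] args) throws Exception {
        Selector selector = Selector.open();
        ServerSocketChannel server = ServerSocketChannel.open();
        server.socket().bind(new InetSocketAddress(9000)); // placeholder port
        server.configureBlocking(false);
        server.register(selector, SelectionKey.OP_ACCEPT);

        ByteBuffer buf = ByteBuffer.allocate(4096);
        while (true) {
            selector.select();
            Iterator<SelectionKey> it = selector.selectedKeys().iterator();
            while (it.hasNext()) {
                SelectionKey key = it.next();
                it.remove();
                if (key.isAcceptable()) {
                    // register each new connection with the same selector
                    SocketChannel client = server.accept();
                    client.configureBlocking(false);
                    client.register(selector, SelectionKey.OP_READ);
                } else if (key.isReadable()) {
                    SocketChannel ch = (SocketChannel) key.channel();
                    buf.clear();
                    int n = ch.read(buf); // the Connection timed out IOException surfaces here
                    if (n > 0) {
                        buf.flip();
                        while (buf.hasRemaining()) {
                            ch.write(buf); // echo back; a real server would register OP_WRITE instead of spinning
                        }
                    } else if (n < 0) {
                        key.cancel();
                        ch.close();
                    }
                }
            }
        }
    }
}
```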
The problem is that the test works wonderfully for a couple of hours, and then the sockets the server is managing start throwing a Connection timed out IOException on read, one after another, until all of them have failed.
I've looked into whether the client thread is being starved (and therefore not sending data), but the server thread yields to the client thread, which iterates through all the sockets and writes out data. Traffic seems to flow steadily, but after a while it all dies out. Any ideas what could be causing this behavior?
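For context, the client thread looks roughly like the sketch below. This is a hypothetical reconstruction rather than my exact code: the host, port, and payload are placeholders, and the yield placement is simplified.

```java
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.SocketChannel;
import java.util.ArrayList;
import java.util.List;

// Rough reconstruction of the client thread: 500 blocking connections,
// each written to and drained in a round-robin pass.
public class Clients implements Runnable {
    public void run() {
        try {
            List<SocketChannel> channels = new ArrayList<SocketChannel>();
            for (int i = 0; i < 500; i++) {
                channels.add(SocketChannel.open(
                        new InetSocketAddress("localhost", 9000)));
            }
            byte[] payload = "ping".getBytes("US-ASCII"); // placeholder payload
            ByteBuffer echo = ByteBuffer.allocate(payload.length);
            while (true) {
                for (SocketChannel ch : channels) {
                    ByteBuffer out = ByteBuffer.wrap(payload);
                    while (out.hasRemaining()) {
                        ch.write(out);           // send a payload to the server
                    }
                    echo.clear();
                    while (echo.hasRemaining()) {
                        if (ch.read(echo) < 0) { // blocking read of the echo
                            return;              // server closed the connection
                        }
                    }
                }
                Thread.yield();                  // let the other thread run between passes
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```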
I'm running on Linux with the latest release of Java 6. My application launches two threads: one for the server and one for all the clients. Thanks in advance!
Extra: The issue is with Linux, not my code. When I run the exact same setup on a Windows box (on the same hardware), it never times out, but on Linux the timeouts start after several hours. It must be some kind of TCP setting in Linux that's causing it. Thanks for the suggestion.
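In case it helps anyone else, one thing I plan to try is enabling SO_KEEPALIVE on each accepted channel. This is just a guess, assuming some idle-connection or retransmission timeout at the TCP level is killing the sockets; the accept helper below is hypothetical, not part of my current code.

```java
import java.io.IOException;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;

// Hypothetical change to the accept path: turn on OS-level keepalive
// probing for every connection the selector manages.
public class AcceptWithKeepAlive {
    static void accept(ServerSocketChannel server, Selector selector)
            throws IOException {
        SocketChannel client = server.accept();
        client.configureBlocking(false);
        // Ask the kernel to probe idle connections instead of letting them
        // silently die; on Linux the probe timing is governed by the
        // net.ipv4.tcp_keepalive_* sysctls.
        client.socket().setKeepAlive(true);
        client.register(selector, SelectionKey.OP_READ);
    }
}
```

I'm also going to compare the kernel-side settings between the two machines (net.ipv4.tcp_keepalive_time, net.ipv4.tcp_keepalive_intvl, net.ipv4.tcp_keepalive_probes, and net.ipv4.tcp_retries2 on the Linux box), since, as far as I understand, a read failing with Connection timed out means the kernel gave up retransmitting on that connection.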