I have a very simple program, written in five minutes, that opens a server socket, accepts requests in a loop, and prints the bytes sent to it to the screen.
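Here is a minimal sketch of the kind of server I mean, in Python (simplified; the port number and the `max_conns` knob are placeholders I'm using for illustration, my real program just loops forever):

```python
import socket

def serve(host="0.0.0.0", port=9000, max_conns=None):
    """Accept connections in a loop and print the bytes each client sends.

    max_conns is a hypothetical knob so the loop can be stopped in a test;
    the real program loops forever (max_conns=None).
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(128)
        handled = 0
        while max_conns is None or handled < max_conns:
            conn, _addr = srv.accept()
            with conn:
                data = conn.recv(1024)
                print(data)  # print the bytes sent to the screen
            handled += 1

if __name__ == "__main__":
    serve()
```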
I then tried to benchmark how many connections I can hammer it with, to find out how many concurrent users this program could support.
On another machine (where the network between them is not saturated) I wrote a simple client that, in a loop, connects to the server machine and sends the bytes "hello world".
With a loop count of 1000-3000, the client finishes with all requests sent. Once the loop goes beyond 5000, it starts to get timeouts after finishing the first X requests. Why is this? I have made sure to close my socket in each iteration of the loop.
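The client loop looks roughly like this (a Python sketch; the server address shown is a placeholder):

```python
import socket

def hammer(n, server=("192.0.2.10", 9000), timeout=5.0):
    """Open a fresh connection per iteration, send b"hello world", close it.

    server is a placeholder address; n is the loop count (1000, 3000, 5000, ...).
    Past ~5000 iterations this starts timing out on connect.
    """
    for _ in range(n):
        with socket.create_connection(server, timeout=timeout) as s:
            s.sendall(b"hello world")
        # the with-block guarantees the socket is closed every iteration
```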
Can you only create so many connections within a certain period of time?
Is this limit only applicable between the same pair of machines, and do I not need to worry about it in production, where the 5000+ requests would all be coming from different machines?