views: 8 | answers: 1

Hi,

What are the performance considerations that one should take into account when designing a server application that listens on one port? I know that it is possible for many thousands of clients to connect to a server on a single port, but is performance negatively affected by having the server application accept all incoming requests on a single port?

Is my understanding correct that this model (the server listening on one port, accepting incoming connections, and then responding back over the connection that was created when the client connected) is the way databases, web servers, etc. work?

Regards, Brian

A: 

It does not matter whether the server listens on one port or multiple ports. The server still has to establish a full socket connection for each client it accepts. The OS still has to route inbound packets to the correct socket endpoints, and sockets are uniquely identified by the combination of the IP/port pairs of both endpoints, so there is no performance penalty when the server endpoints all use the same port.
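To illustrate the point about endpoint identification, here is a minimal sketch (my own, in Python; the original answer names no code) showing that two clients connected to the same listening port are still distinct sockets, because the remote IP/port halves of their connection tuples differ:

```python
import socket

# Listening socket bound to a single port on loopback.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0 = let the OS pick a free port
server.listen(5)
port = server.getsockname()[1]

# Two clients connect to the *same* server port.
c1 = socket.create_connection(("127.0.0.1", port))
c2 = socket.create_connection(("127.0.0.1", port))
s1, peer1 = server.accept()
s2, peer2 = server.accept()

# Both accepted sockets share the server's local port, but the remote
# (client) endpoints differ, so the connection tuples are unique.
same_local_port = s1.getsockname()[1] == s2.getsockname()[1]
distinct_peers = peer1 != peer2
print(same_local_port, distinct_peers)  # True True

for s in (s1, s2, c1, c2, server):
    s.close()
```

The OS uses exactly this tuple uniqueness to demultiplex inbound packets to the right accepted socket.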

Any performance issues are going to be in the way the server's code handles those socket connections. If it listens on one port and accepts clients with a simple accept() loop in a single thread, then the rate at which it can accept clients is limited by that loop. Typically, servers spawn a worker thread for each accepted client, which carries overhead of its own if thread pooling is not used. If the server needs to handle a lot of clients simultaneously, then it should use overlapped I/O or I/O Completion Ports to handle the connections more efficiently.
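As a rough sketch of the pattern described (again my own illustration, in Python rather than the Winsock APIs the answer names), a single-threaded accept() loop can hand each accepted client off to a pooled worker thread so the loop itself never blocks on client I/O:

```python
import socket
import threading
from concurrent.futures import ThreadPoolExecutor

def handle_client(conn, addr):
    # Worker: echo one message back to the client, then close.
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

def serve(listener, pool, max_clients):
    # The single-threaded accept() loop: its speed bounds the rate at
    # which clients are accepted, so it does nothing but dispatch work.
    for _ in range(max_clients):
        conn, addr = listener.accept()
        pool.submit(handle_client, conn, addr)

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))   # OS-assigned free port
listener.listen(5)
port = listener.getsockname()[1]

with ThreadPoolExecutor(max_workers=4) as pool:
    t = threading.Thread(target=serve, args=(listener, pool, 1))
    t.start()
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(b"hello")
        reply = c.recv(1024)
    t.join()
listener.close()
print(reply)  # b'hello'
```

The pool bounds thread-creation overhead; overlapped I/O or IOCP (on Windows) replaces the per-client threads entirely with completion-driven dispatch, which scales better at high client counts.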

Remy Lebeau - TeamB