I have a client/server connection over a TCP socket, with the server writing to the client as fast as it can.
Looking at the network activity, I see that the production client receives data at around 2.5 Mb/s.
A new lightweight client that I wrote just to read the data and benchmark the rate receives about 5.0 Mb/s (which is probably close to the maximum rate the server can transmit).
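For reference, the benchmark client is essentially just a blocking read loop, something like this sketch (the host, port, and buffer size here are placeholders, not my actual values):

```python
import socket
import time

HOST = "server.example.com"  # placeholder server address
PORT = 9000                  # placeholder port

# Read from the socket as fast as possible and report throughput.
with socket.create_connection((HOST, PORT)) as sock:
    total = 0
    start = time.monotonic()
    while True:
        chunk = sock.recv(65536)
        if not chunk:  # server closed the connection
            break
        total += len(chunk)
        elapsed = time.monotonic() - start
        if elapsed >= 1.0:
            # Report in megabits per second, matching the rates above.
            print(f"{total * 8 / elapsed / 1e6:.2f} Mb/s")
            total = 0
            start = time.monotonic()
```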
I was wondering what governs the rates here, since the client sends no data to the server to tell it about any rate limits.