Let's say I have a simple TCP server that generates an array inside a thread, serializes it, and sends it to a client over a TCP connection. The client then deserializes it and performs some continuous calculation on it. It's a fairly straightforward process, but I'd like to know whether it has any performance trade-offs. For example, can the client consume arrays as fast as the thread produces them? If the thread produces 100 arrays per second, will the client receive 100 arrays per second as well? Can the objects be serialized and deserialized in real time? I'd appreciate it if anyone could explain this to me.
Of course, for this question please ignore any unreliability in the network itself and assume TCP performance is not the bottleneck.
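To make the question concrete, here is a rough micro-benchmark of the serialize/deserialize step in isolation. This is just a sketch assuming Python with `pickle` and plain lists; my actual server may use a different language or format, but the idea is to measure whether 100 arrays can round-trip within one second:

```python
import pickle
import time

# Simulate one second's worth of producer output:
# 100 arrays of 1000 integers each (assumed sizes, not from my real server).
arrays = [list(range(1000)) for _ in range(100)]

start = time.perf_counter()
blobs = [pickle.dumps(a) for a in arrays]      # serialize, as the server would
restored = [pickle.loads(b) for b in blobs]    # deserialize, as the client would
elapsed = time.perf_counter() - start

print(f"100 arrays round-tripped in {elapsed * 1000:.2f} ms")
assert restored == arrays  # sanity check: data survives the round trip
```

If the measured time is well under one second, serialization itself would not be the limiting factor at my target rate.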
Thanks!