It's really not a big issue - as long as you come up with a good protocol. The bits on the wire don't care what language was used to write the program that interprets them.
By protocol I mean a clear, shared understanding of who should do what and when, what to say, and what to expect back.
For example, an echo server and its client might expect that
- the client speaks first
- there's a well-understood delimiter that marks the end of the client's input
- the client waits for the server to respond before speaking again
- there's a well-understood delimiter that marks the end of the server's output.
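And so on. As a concrete sketch of that agreement (Python here, with a newline standing in as the hypothetical end-of-input delimiter, and a made-up address and port):

```python
import socket

HOST, PORT = "127.0.0.1", 9000  # hypothetical endpoint for the example

def serve_once():
    """Echo server: read one newline-terminated line, echo it back."""
    with socket.create_server((HOST, PORT)) as srv:
        conn, _ = srv.accept()
        with conn, conn.makefile("rwb") as f:
            line = f.readline()  # client speaks first; '\n' marks the end of its input
            f.write(line)        # echo it back; the same '\n' marks the end of our output
            f.flush()

def client_say(message: str) -> str:
    """Echo client: speak first, end with a newline, then wait for the reply."""
    with socket.create_connection((HOST, PORT)) as sock:
        with sock.makefile("rwb") as f:
            f.write(message.encode() + b"\n")
            f.flush()
            return f.readline().decode().rstrip("\n")
```

Nothing here depends on both ends being written in Python; a client in any language that sends a newline-terminated line and then reads one back will interoperate with this server.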
A different client that sent the message length before each message, instead of using an end-of-input delimiter, wouldn't work with the original server, because the protocols are incompatible.
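Length-prefixed framing is perfectly reasonable on its own; a sketch, assuming a 4-byte big-endian length header:

```python
import struct

def send_msg(f, payload: bytes) -> None:
    # Length-prefixed framing: 4-byte big-endian length, then the payload itself.
    f.write(struct.pack("!I", len(payload)) + payload)
    f.flush()

def recv_msg(f) -> bytes:
    (length,) = struct.unpack("!I", f.read(4))  # read the header first
    return f.read(length)                       # then exactly that many bytes
```

Pointed at the echo server above, though, this client fails: the server reads the raw length bytes as if they were text and keeps waiting for a newline that may never arrive.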
Protocols can be flexible, but you have to account for that flexibility in the protocol itself: if something can vary, both sides need an agreed way to signal or negotiate which variant is in use.