I have a server application Foo that listens on a specific port and a client application Bar which connects to Foo (both are .NET apps).
Everything works fine. So far, so good.
But what happens to Bar when the connection slows down or when Foo takes a long time to respond? I need to test this.
My question is: how can I simulate such a slowdown?
Generally that's not a big problem (there are some free tools out there), but Foo and Bar are both running on production machines (yes, they are developed on production machines. I know that's very bad, but believe me, it's not my decision). So I can't just use a tool that throttles the entire bandwidth of the network adapters.
Is there a tool that can limit the bandwidth or delay the connection on a specific port only? Is it possible to achieve this in .NET/C#, so that I can write proper unit/integration tests?
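
What I have in mind is something like a small TCP relay written in C#: in the test, Bar connects to a local port instead of directly to Foo, and the relay forwards all traffic to Foo but inserts an artificial delay on each chunk. Here is a rough sketch of the idea (the ports 9000/8000 and the 500 ms delay are just placeholders, not my real configuration):

    using System;
    using System.Net;
    using System.Net.Sockets;
    using System.Threading.Tasks;

    // Rough sketch: a local TCP relay that forwards traffic to Foo
    // and adds an artificial delay to every chunk it passes on.
    class DelayingProxy
    {
        static async Task Main()
        {
            // Bar connects here instead of to Foo directly (placeholder port).
            var listener = new TcpListener(IPAddress.Loopback, 9000);
            listener.Start();

            while (true)
            {
                var client = await listener.AcceptTcpClientAsync();
                _ = HandleClientAsync(client);
            }
        }

        static async Task HandleClientAsync(TcpClient client)
        {
            using (client)
            using (var server = new TcpClient())
            {
                // Foo's real host/port (placeholders).
                await server.ConnectAsync("localhost", 8000);

                var delay = TimeSpan.FromMilliseconds(500); // simulated latency per chunk

                // Relay in both directions, delaying each chunk.
                await Task.WhenAll(
                    RelayAsync(client.GetStream(), server.GetStream(), delay),
                    RelayAsync(server.GetStream(), client.GetStream(), delay));
            }
        }

        static async Task RelayAsync(NetworkStream from, NetworkStream to, TimeSpan delay)
        {
            var buffer = new byte[8192];
            int read;
            while ((read = await from.ReadAsync(buffer, 0, buffer.Length)) > 0)
            {
                await Task.Delay(delay);            // inject the slowdown
                await to.WriteAsync(buffer, 0, read);
            }
        }
    }

In an integration test I could then point Bar's connection settings at the relay's port instead of Foo's, so only this one connection is slowed down and nothing else on the machine is affected. Is this a reasonable approach, or is there an existing tool/library that already does this per port?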