I'm writing an application whose throughput (the number of bits per second it sends over the wire) I can set to whatever rate I wish. I would like to set it as high as possible, as long as other traffic on the network is not heavily impacted.
The problem is that I don't have a good metric to measure that impact. I thought of the following, but none of them really feels "complete" on its own (a rough sketch of what I mean follows the list):
- Increase in average delay time for a packet
- Increase in packet loss
- Increase in jitter
- Increase in the average time it takes for TCP transactions to complete (e.g., downloading files over HTTP)
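
To make the first three concrete, here is a rough sketch (in Python) of the kind of comparison I have in mind: collect RTT samples with something like ping during a quiet baseline run and again with the application sending, then compare. The function names, the baseline-vs-loaded framing, and the sample numbers are all just my own illustration, not any standard:

```python
import statistics

def jitter(rtts):
    # Mean absolute difference between consecutive RTT samples
    # (a simple approximation of RFC 3550-style interarrival jitter).
    return statistics.mean(abs(b - a) for a, b in zip(rtts, rtts[1:]))

def impact(baseline_rtts, loaded_rtts, baseline_lost, loaded_lost, probes_sent):
    # Compare a quiet baseline run against a run with the application sending.
    return {
        "delay_increase_ms": statistics.mean(loaded_rtts) - statistics.mean(baseline_rtts),
        "jitter_increase_ms": jitter(loaded_rtts) - jitter(baseline_rtts),
        "loss_increase_pct": 100.0 * (loaded_lost - baseline_lost) / probes_sent,
    }

# Hypothetical RTT samples (in ms) collected before and after turning the
# application on; the numbers here are made up purely for illustration.
print(impact([20.1, 20.4, 19.9, 20.2],
             [25.3, 27.1, 26.0, 28.4],
             baseline_lost=0, loaded_lost=2, probes_sent=100))
```

But this still leaves me without a single number that captures "impact", and without a principled threshold for how much increase is acceptable.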
Is there any standard metric? Do you have any other ideas on how to measure an application's impact on the network?
btw - I have complete control over the network, and can take whatever measurements I want in order to compute that metric.
Thanks,
Rouli