Hi Guys,
I am solving some examples from Data Networks by Gallager and I didn't quite understand this particular question. The question goes as follows:
Suppose the expected frame length on a link is 1000 bits and the standard deviation is 500 bits. Find the expected time and the standard deviation of the time required to transfer a million frames on a 9600 bps link. The follow-up question is: The rate at which frames can be transmitted is generally defined as the reciprocal of the expected transmission time. Using the result you found in the previous problem, discuss whether this definition is reasonable.
My attempt is as follows:
Time to send one frame: 1000/9600 ≈ 0.1042 seconds. Hence, the time to send a million frames is about 10^9/9600 ≈ 104,167 seconds.
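Here is a quick Python sketch of that arithmetic, just to show how I got my numbers (the variable names are my own, not anything from the book):

```python
# Sanity check of my attempt (expected-time part only).
link_rate_bps = 9600        # link speed in bits per second
mean_frame_bits = 1000      # expected frame length from the problem
num_frames = 1_000_000      # a million frames

# Expected time per frame = expected bits / link rate (seconds).
mean_time_per_frame = mean_frame_bits / link_rate_bps

# Expectation is linear, so the expected total time is N times the per-frame mean.
expected_total_time = num_frames * mean_time_per_frame

print(f"Expected time per frame: {mean_time_per_frame:.4f} s")
print(f"Expected time for {num_frames} frames: {expected_total_time:,.0f} s")
```

I stopped there, because I don't see how to go from a standard deviation of 500 bits per frame to a standard deviation of the total transfer time.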
I did not understand the second part, where it asks for the standard deviation, or the follow-up question. I also don't understand what a standard deviation of 500 bits means. Does it refer to error loss, which here would be 50%?
This is not a homework problem. I have a midterm in a few days, and I'm solving these to improve my grip on the subject.
Any help/hints will be appreciated.
Thanks, Chander