Hi Guys,

I am solving some examples from Data Networks by Gallager, and I didn't quite understand this particular question. The question goes as follows:

Suppose the expected frame length on a link is 1000 bits and the standard deviation is 500 bits. Find the expected time and standard deviation of the time required to transfer a million frames on a 9600 bps link. The follow-up question is: the rate at which frames can be transmitted is generally defined as the reciprocal of the expected transmission time. Using the result you found in the previous problem, discuss whether this definition is reasonable.

My attempt is as follows:

Time to send one frame: 1000 / 9600 ≈ 0.104 seconds. Hence, time to send a million frames ≈ 104,000 seconds.

I did not understand the second part, where it asks for the standard deviation, nor the follow-up question. I also didn't understand what it means to say the standard deviation is 500 bits; does that mean error loss, which here would be 50%?

This is not a homework problem. I have a midterm in a few days, and I'm solving these to improve my grip on the subject.

Any help/hints will be appreciated

Thanks, Chander

+1  A: 

Assuming the frame lengths are independent and normally distributed, the total length is a sum of normally distributed variables. In this case both the expectation and the variance are easy to compute: the means add, and the variances add, so the standard deviation grows only like the square root of the number of frames.

1 frame          ~ N(1000, 500^2)          (mean 1000 bits, std 500 bits)
1 million frames ~ N(1E9, 1E6 * 500^2)     (mean 1E9 bits, std 5E5 bits)

To get the times, just divide both the mean and the standard deviation by 9600 bits per second.

E[time]   ~= 104167 seconds
std[time] ~= 52 seconds
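
For concreteness, here is a minimal Python sketch of that arithmetic (the variable names are mine, not from the book). It assumes the million frame lengths are independent, so the variances add and the standard deviation scales with sqrt(n):

import math

# Assumed parameters, taken from the problem statement
mean_bits = 1000.0       # expected frame length (bits)
std_bits  = 500.0        # standard deviation of frame length (bits)
n_frames  = 1_000_000    # number of frames to transfer
link_bps  = 9600.0       # link rate (bits per second)

# Means add; for independent frames, variances add,
# so the standard deviation grows like sqrt(n).
mean_total_bits = n_frames * mean_bits
std_total_bits  = math.sqrt(n_frames) * std_bits

# Transmission time = total bits / link rate
mean_time = mean_total_bits / link_bps
std_time  = std_total_bits / link_bps

print(f"E[time]   ~= {mean_time:.0f} s")   # ~104167 s
print(f"std[time] ~= {std_time:.2f} s")    # ~52.08 s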

One detail is that it is a bit silly for them to use a normal distribution, since there is some non-zero probability that a frame has a negative number of bits, and consequently some non-zero chance you will send all of your (negative) bits in negative time... Since they don't specify a distribution, though, I don't see what else they could have meant. (Note that the mean and variance calculation above doesn't actually require normality, only independence of the frame lengths.)
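
To put a rough number on that caveat, here is a quick check (plain Python, names are mine) of how much probability mass a N(1000, 500^2) model puts below zero bits:

from math import erf, sqrt

mean_bits = 1000.0
std_bits  = 500.0

# P(X < 0) for X ~ N(1000, 500^2): Phi((0 - 1000) / 500) = Phi(-2)
z = (0.0 - mean_bits) / std_bits
p_negative = 0.5 * (1.0 + erf(z / sqrt(2.0)))
print(f"P(frame length < 0) ~= {p_negative:.4f}")   # ~0.0228, about 2.3%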

anonymous_21321