I answered that it would depend on a variety of factors, including network latency, packet creation and collation, etc. But the interviewer wanted a specific number, to which I answered something like 10 seconds :) (i.e. > 8 seconds). Is there an exact formula for this?

A: 

If it was simply '1GB of data' then 8 seconds is correct, as you're probably talking purely about the transport layer.

But saying a 1GB file implies you're higher up the network stack, and, as you say, other factors come into play as the transmission goes down through the stack on the sending end and back up on the receiving end.

So I think you answered correctly!

Paul Dixon
+2  A: 

Yes in theory, no in practice. (You were completely correct about the real world: it depends on a squillion factors, which slow down the link, add latency, introduce pauses, and so on.)

In theory, though, a "1Gbps" connection runs at one gigabit per second, while a "1GB" file is one gigabyte in size. A byte is 8 bits, so it would take 8 seconds to transfer. In an interview, I imagine that was the answer they were after.
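As a quick sanity check, here is a minimal Python sketch of that calculation; the 1 GB size and 1 Gbps link speed are simply the figures from the question, taken at face value:

    # Theoretical transfer time: bits to move divided by link speed in bits per second.
    file_size_bytes = 1_000_000_000   # 1 GB (decimal), from the question
    link_speed_bps = 1_000_000_000    # 1 Gbps link

    bits_to_send = file_size_bytes * 8
    seconds = bits_to_send / link_speed_bps
    print(seconds)  # 8.0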

Edited to fix the one/eight mix-up.

Andrew M
+1  A: 

1 GibiByte or 1 GigaByte?

1 GiB = 2^30 bytes = 1 073 741 824 bytes ≈ 8.59 seconds at 1 Gbps
1 GB = 10^9 bytes = 1 000 000 000 bytes = 8 seconds at 1 Gbps
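A minimal sketch comparing the two interpretations, assuming the decimal definition of Gbps (10^9 bits per second):

    # Compare decimal GB with binary GiB at a nominal 1 Gbps.
    LINK_BPS = 1_000_000_000

    for label, size_bytes in (("1 GB", 10**9), ("1 GiB", 2**30)):
        seconds = size_bytes * 8 / LINK_BPS
        print(f"{label}: {seconds:.2f} s")  # 1 GB: 8.00 s, 1 GiB: 8.59 s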

Sam
Neither; see http://en.wikipedia.org/wiki/Gigabit_per_second
bronzebeard
A: 

If the file is on a RAM drive, then slightly more than 8 seconds. Without jumbo frames, you're looking at Ethernet overhead of something like 5%. If the file is on a hard drive, then the lower bound of what's likely with typical hardware is about 20 seconds, limited by the read speed of the drive.
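A rough sketch of those two bounds; the 5% framing overhead is the figure above, while the ~50 MB/s sequential read speed is only an assumed, illustrative number for a typical hard drive:

    # Two rough bounds: wire-limited (with framing overhead) vs disk-limited.
    FILE_BYTES = 1_000_000_000
    LINK_BPS = 1_000_000_000
    ETHERNET_OVERHEAD = 0.05          # ~5% framing/protocol overhead, no jumbo frames
    DISK_READ_BPS = 50_000_000        # assumed ~50 MB/s sequential read (illustrative)

    wire_limited = FILE_BYTES * 8 / (LINK_BPS * (1 - ETHERNET_OVERHEAD))
    disk_limited = FILE_BYTES / DISK_READ_BPS
    print(f"wire-limited: {wire_limited:.1f} s")   # ~8.4 s
    print(f"disk-limited: {disk_limited:.1f} s")   # ~20.0 s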