I am writing a networking application, and it has some unexpected lags. I need to calculate some figures, but I can't find this information: how many bits can be transferred over an Ethernet connection in each tick?
I know that the resulting transfer rate is 100 Mbps or 1 Gbps, but I suppose Ethernet must use hardware ticks to synchronize both ends, so it moves data in ticks.
So the question is: how many ticks per second, or how many bits per tick, does Ethernet use?
The actual connection is 100 Mbps full-duplex.
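For context, here is roughly the kind of figure I am trying to compute. The tick rate in this sketch is a placeholder assumption (that value is exactly what I am asking about); only the 100 Mbps link rate is known.

```python
# Sketch of the figures I want to calculate.
# ASSUMED_TICKS_PER_SECOND is a placeholder -- the real value is my question.

LINK_RATE_BPS = 100_000_000            # 100 Mbps full-duplex link
ASSUMED_TICKS_PER_SECOND = 25_000_000  # placeholder assumption, not a known value

def bits_per_tick(link_rate_bps: float, ticks_per_second: float) -> float:
    """Data bits moved per hardware tick, given some tick rate."""
    return link_rate_bps / ticks_per_second

def serialization_delay_us(frame_bytes: int, link_rate_bps: float) -> float:
    """Time in microseconds to put one frame of the given size on the wire."""
    return frame_bytes * 8 / link_rate_bps * 1e6

if __name__ == "__main__":
    print("bits per tick (assumed tick rate):",
          bits_per_tick(LINK_RATE_BPS, ASSUMED_TICKS_PER_SECOND))
    print("serialization delay of a 1500-byte frame (us):",
          serialization_delay_us(1500, LINK_RATE_BPS))
```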