There are several factors:

1) Clock rate of the line
2) Buffering delay in any intermediary devices such as ATM/FR switches
3) Speed-of-light (propagation) delay

Take a simple case and say there are no layer 2 devices in the path, only
digital cross-connects.  I have read (somewhere) that the speed of signal
propagation in copper is a little faster than the speed of light in fiber
over short distances, so use the speed of light in fiber (roughly 0.7 x
186,000 miles per second) as the baseline. (Note that the reference given by
another poster says the speed of electromagnetic signals in copper is 0.66
of the speed of light, which would make it slightly slower than the speed of
light in fiber; either way it's pretty close to a wash.)
Given these assumptions you get:

one-way delay for a single bit = insertion delay for 1 bit + propagation
(speed of light) delay + removal delay for 1 bit

insertion delay for 1 bit = removal delay for 1 bit = 1 / clock rate

propagation (speed of light) delay = number of miles / (0.7 * 186,000 miles
per second)



As an example, for a clock rate of 128 kbps and a distance of 1000 miles:

insertion and removal delay for 1 bit = 2 * (1/128,000) = .000015625 sec =
.015625 ms

propagation delay = 1000 / (0.7 * 186,000) = .00768 sec = 7.68 ms

7.68 ms + .015625 ms = 7.7 ms (roughly)
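
If you want to play with the numbers, here is a quick Python sketch of the
same arithmetic (the constants and function name are my own illustrative
choices, not anything standard):

    # One-way delay for a single bit: insertion/removal at each end
    # plus propagation at roughly 0.7c in fiber. Values are illustrative.
    SPEED_OF_LIGHT_MI_S = 186000   # miles per second in a vacuum
    FIBER_FACTOR = 0.7             # fraction of c for signals in fiber

    def one_way_delay_ms(clock_rate_bps, distance_miles):
        serialization = 2.0 / clock_rate_bps    # insert + remove 1 bit
        propagation = distance_miles / (FIBER_FACTOR * SPEED_OF_LIGHT_MI_S)
        return (serialization + propagation) * 1000.0   # seconds -> ms

    # The example above: 128 kbps line, 1000 miles -> prints 7.7 (ms)
    print(round(one_way_delay_ms(128000, 1000), 2))

Swap in your own clock rate and distance and you will see that on slow lines
the serialization term dominates, while on long fast lines it is almost all
propagation delay.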

Again, this assumes no buffering delay of any kind in the path.  It also
assumes there is no congestion at either end of the link.  Bottom line:
these are rough numbers, but I think you get the idea.

HTH,
Kent


-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of
Matthew Tayler
Sent: Thursday, April 11, 2002 9:01 AM
To: [EMAIL PROTECTED]
Subject: How fast do bits travel ? [7:41192]


Ok, I have spent ages trying to find an answer to this question, and have
probably only added to my confusion. You know how it is: you spend ages
looking at something and become snow-blind or get tunnel vision or whatever,
but I cannot see the answer to the following:

How far does a bit travel in, say, 1 second? Or, put another way, how long
does a bit take to travel a certain distance?

I understand, or think I do, that if the line is, say, 128 kbps then I can,
in theory at least, expect 128,000 (approx) bits to start down that line
every second.

But how long do they take to reach the other end, assuming a point-to-point
link and both ends being the same speed, obviously?

There has to be a nice simple formula for this somewhere, you know the sort
of thing: x = line speed, y = distance, z = time, etc.

Any ideas or pointers would be appreciated.

Thanks



