In article , [EMAIL PROTECTED] says...
> Ok, I have spent ages trying to find an answer to this question, and
> probably only added to my confusion. You know how it is: you spend ages
> looking at something and become snow blind or get tunnel vision or
> whatever, but I cannot see the answer to the following:
> 
> How far does a bit travel in, say, 1 second? Or, put another way, how long
> does a bit take to travel a certain distance?
> 
> I understand, or think I do, that if the line is say 128 kbps then I can,
> in theory at least, expect 128,000 (approx) bits to start down that line
> every second.
> 
> But how long do they take to reach the other end, assuming a point-to-point
> link and both ends being the same speed, obviously.
> 
> There has to be a nice simple formula for this somewhere, you know the sort
> of thing: x = line speed, y = distance, z = time, etc.
> 
> Any ideas or pointers would be appreciated
> 
> Thanks
It depends on the medium used. A rough figure for the propagation speed of 
an electromagnetic signal in copper wire is about 66% of the speed of light 
in a vacuum, i.e. 186,000 x 2/3 = 124,000 miles/sec (roughly 200,000 km/sec).

Measurements on Cat5 come very close to that number. Coax is different, 
more on the order of 55% of the speed of light in a vacuum.
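
To turn that speed into the "x = line speed, y = distance, z = time" formula 
the original poster asked for: the one-way delay is roughly the distance 
divided by the propagation speed, plus the time needed to clock the bits onto 
the wire at the line rate. A minimal Python sketch, using made-up example 
values (a 1000 km link, a 128 kbps line, a 1500-byte frame), looks like this:

    SPEED_OF_LIGHT_KM_S = 300000.0   # speed of light in a vacuum, km/s
    VELOCITY_FACTOR = 0.66           # ~66% of c for copper twisted pair (see above)

    def propagation_delay_s(distance_km):
        """Time for one bit to travel the length of the wire."""
        return distance_km / (SPEED_OF_LIGHT_KM_S * VELOCITY_FACTOR)

    def serialization_delay_s(bits, line_rate_bps):
        """Time to clock that many bits onto the wire at the line rate."""
        return bits / float(line_rate_bps)

    print(propagation_delay_s(1000))                # ~0.005 s (5 ms one way)
    print(serialization_delay_s(1500 * 8, 128000))  # ~0.094 s (94 ms for a 1500-byte frame)

Note that the two delays are independent: the 128 kbps figure only governs 
how fast bits enter the wire; once a bit is on the wire it travels at the 
propagation speed of the medium regardless of the line rate.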

If you are very interested in this, check out 
xxx.lanl.gov/abs/physics/0201053
-- 
Wes Knight
MCT, MCSE, CNE, CCNP, ASE, etc.



