The time the bits spend in flight is usually negligible compared with the time the intermediate nodes (routers, switches, ...) and the end nodes take to receive and process the signal, so for most computational purposes it doesn't matter. An electrical signal in a vacuum travels at the speed of light; in a copper medium it propagates somewhat slower, typically around 0.6-0.7 of c, because of the cable's dielectric and other electromagnetic characteristics.
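A quick back-of-the-envelope sketch in Python, assuming a velocity factor of 0.67 for copper (a rough, typical figure, not a measured value), showing the two delays that matter: propagation (distance over signal speed) and serialization (bits over line rate):

```python
# Rough sketch: propagation vs. serialization delay on a point-to-point link.
# The 0.67 velocity factor for copper is an assumption (typical ballpark).

C = 299_792_458.0  # speed of light in vacuum, m/s

def propagation_delay(distance_m, velocity_factor=0.67):
    """Seconds for the signal front to cover the wire."""
    return distance_m / (velocity_factor * C)

def serialization_delay(bits, line_rate_bps):
    """Seconds to clock the bits onto the wire at the line rate."""
    return bits / line_rate_bps

# Example: 1000 km link at 128 kbps, one 1500-byte frame
prop = propagation_delay(1_000_000)            # ~0.005 s (about 5 ms)
ser = serialization_delay(1500 * 8, 128_000)   # 0.09375 s (about 94 ms)
```

Note that on a slow link like 128 kbps, the serialization delay dominates the propagation delay by an order of magnitude, which is the point above: the wire speed itself is rarely the bottleneck.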
sam sneed

"Matthew Tayler" wrote in message news:[EMAIL PROTECTED]...
> Ok, I have spent ages trying to find an answer to this question, and probably
> only added to my confusion. You know how it is: you spend ages looking at
> something and become snow-blind or get tunnel vision or whatever, but I
> cannot see the answer to the following:
>
> How far does a bit travel in, say, 1 second? Or, put another way, how long does a
> bit take to travel a certain distance?
>
> I understand, or think I do, that if the line is say 128 kbps then I can, in
> theory at least, expect approximately 128,000 bits to start down that line every
> second.
>
> But how long do they take to reach the other end, assuming a point-to-point
> link and both ends being the same speed, obviously?
>
> There has to be a nice simple formula for this somewhere, you know the sort
> of thing: x = line speed, y = distance, z = time, etc.
>
> Any ideas or pointers would be appreciated.
>
> Thanks

Message Posted at: http://www.groupstudy.com/form/read.php?f=7&i=41203&t=41192

