There are stories about how these impedances (50 and 75 ohms) were selected - everything from high science to the particular copper tubing sizes that happened to be available to the experimenters.
Someone said that real coaxial line designs have an optimum impedance somewhere around 50 to 75 ohms that produces a minimum-loss cable, given real conductor materials and so on. I can't vouch for that, but it may explain why we usually don't see 1000-ohm coax cable.

A typical ham half-wave dipole really will be about 50 ohms impedance, because it is fairly close to the earth. But put it up more than 1/4 wavelength high, and it might be 100 ohms.

Bacon, WA3WDR
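The minimum-loss claim can be checked with a quick numerical sketch (my addition, not from the original post): for an air-dielectric line, skin-effect conductor loss scales as (1 + D/d)/ln(D/d), where D and d are the shield and center-conductor diameters. Minimizing that ratio and converting the resulting D/d back to impedance lands very close to 77 ohms:

```python
import math

def z0_coax(ratio, eps_r=1.0):
    # Characteristic impedance of coax from the diameter ratio D/d.
    return (59.96 / math.sqrt(eps_r)) * math.log(ratio)

def relative_conductor_loss(ratio):
    # Skin-effect conductor loss, up to a constant factor: (1 + D/d) / ln(D/d).
    return (1 + ratio) / math.log(ratio)

# Scan diameter ratios to find the minimum-loss point (air dielectric).
loss, ratio = min((relative_conductor_loss(x / 100), x / 100)
                  for x in range(150, 1000))
print(ratio, round(z0_coax(ratio), 1))  # minimum near D/d = 3.6, about 77 ohms
```

With a solid polyethylene dielectric (eps_r around 2.3) the same optimum ratio gives roughly 50 ohms, which may be part of why both values stuck.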

