It's always safe to apply formulae from transmission line theory to what is, undeniably, a transmission line. If it turns out to be a very short one, and therefore the effects of an impedance mismatch are very small, then that'll drop out of the equations.
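For reference, the relevant formula from transmission line theory is the voltage reflection coefficient at a termination. A minimal sketch: the 75 ohm figure is the standard S/PDIF coax impedance, and the 50 ohm cable is an assumed example of a mismatched interconnect, not a figure from this thread.

```python
# Voltage reflection coefficient from basic transmission line theory.
# 75 ohm is the S/PDIF spec impedance; the 50 ohm cable below is an
# assumed example of a mismatch, not a measured value.

def reflection_coefficient(z_load, z0):
    """Fraction of the incident voltage reflected when a line of
    characteristic impedance z0 meets a termination z_load."""
    return (z_load - z0) / (z_load + z0)

print(reflection_coefficient(75.0, 75.0))  # matched: 0.0, no reflection
print(reflection_coefficient(75.0, 50.0))  # 50 ohm cable into 75 ohm load: 0.2
```

A perfectly matched line reflects nothing, which is why the mismatch (not the line itself) is what matters.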
A 1 metre interconnect is NOT short, though. As per my previous post, something which even digital design engineers sometimes fail to grasp properly is that it's not the data rate that matters, but the edge speed - dv/dt. The rise/fall time of the SB3's digital output is around 10ns, and the flight time of an edge along a 1m coax cable is about 4ns.

Now consider exactly why an impedance mismatch might affect received jitter, bearing in mind that jitter is uncertainty about the timing of an edge at the receiver. Suppose a rising edge is driven by the SB3 into a poorly terminated transmission line. It propagates along the cable to the receiver, reflects off the impedance mismatch and heads back toward the transmitter. After 4ns it reflects off the transmitter and starts heading back toward the receiver, so 8ns after the main edge has hit the receiver, a second, unwanted edge arrives and interferes with it.

If you're lucky, the edge at the receiver has already crossed the 1<>0 threshold, and the distorted wave shape is of no consequence. But if your cable is the wrong length, this reflected signal appears right when the receiver is trying to determine the timing of the 1<>0 transition. Bad news.

Note that this effect depends only on the cable length, the extent of the impedance mismatch, and the dv/dt of the edge - and NOT on the data rate.

-- 
AndyC_772
------------------------------------------------------------------------
AndyC_772's Profile: http://forums.slimdevices.com/member.php?userid=10472
View this thread: http://forums.slimdevices.com/showthread.php?t=34406
_______________________________________________
audiophiles mailing list
[email protected]
http://lists.slimdevices.com/lists/listinfo/audiophiles
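The timing argument above can be sketched numerically. The 10ns rise time and the roughly 4ns flight time per metre come from the post; the velocity factor of 0.8 is an assumed value for a foam-dielectric coax chosen to match that 4ns figure, not a measured property of any particular cable.

```python
# Sketch of the reflection-timing argument: the twice-reflected copy of an
# edge arrives one extra cable round trip after the main edge. The 10 ns
# rise time is from the post; velocity_factor=0.8 is an assumption.

C = 299_792_458.0  # speed of light in vacuum, m/s

def flight_time_ns(length_m, velocity_factor=0.8):
    """One-way propagation delay along the cable, in nanoseconds."""
    return length_m / (C * velocity_factor) * 1e9

def reflection_arrival_ns(length_m, velocity_factor=0.8):
    """Delay after the main edge reaches the receiver at which the
    twice-reflected edge arrives: one full round trip of the cable."""
    return 2 * flight_time_ns(length_m, velocity_factor)

rise_time_ns = 10.0  # approximate SB3 output edge, per the post

for length in (0.5, 1.0, 2.0):
    t = reflection_arrival_ns(length)
    when = "during" if t < rise_time_ns else "after"
    print(f"{length} m cable: reflection arrives {t:.1f} ns after the main "
          f"edge, {when} the {rise_time_ns:.0f} ns rise")
```

With these assumed numbers, a 1m cable puts the unwanted copy about 8ns behind the main edge, inside the 10ns rise window, whereas a longer cable pushes it past the transition entirely - which is exactly why the cable length, not the data rate, sets whether the reflection lands on the threshold crossing.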
