AndyC_772;197449 Wrote:
> It's always safe to apply formulae from transmission line theory to what
> is, undeniably, a transmission line. If it turns out to be a very short
> one, and therefore the effects of an impedance mismatch are very small,
> then that'll drop out of the equations.
Changing the connector to one with a different resistance will change more than just the amount of reflection. For example, its impedance (not considering reflections, just plain old circuit impedance) will filter the signal and affect the jitter spectrum. In some situations (probably including speaker cables and analogue interconnects) that could be much more important than any transmission line effects.

> A 1 metre interconnect is NOT short, though. As per my previous post,
> something which even digital design engineers sometimes fail to
> properly grasp is that it's not the data rate that matters, but the
> edge speed - dv/dt. The rise/fall time of the SB3's digital output is
> around 10ns, and the flight time of an edge along a 1m coax cable is
> about 4ns.
>
> Now consider exactly why an impedance mismatch might affect received
> jitter, bearing in mind that jitter is uncertainty about the timing of
> an edge at the receiver:
>
> Suppose a rising edge is driven by the SB3 into a poorly terminated
> transmission line. It propagates along the cable to the receiver,
> reflects off the impedance mismatch and heads back to the transmitter.
> After 4ns it reflects off the transmitter and starts heading back
> toward the receiver, so 8ns after the main edge has hit the receiver, a
> second, unwanted edge comes along and interferes with it.
>
> If you're lucky, the edge at the receiver has already crossed the 1<>0
> threshold, and the distorted wave shape is of no consequence. But if
> your cable is the wrong length, this reflected signal appears right
> when the receiver is trying to determine the timing of the 1<>0
> transition. Bad news.
>
> Note that this effect depends only on the cable length, the extent of
> the impedance mismatch, and the dv/dt of the edge, and NOT on the data
> rate.

What you're saying here sounds like the standard treatment. The rise time, or the max dV/dt, is related to the max frequency present in the signal.
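The numbers in the quoted post are easy to check. A minimal Python sketch, assuming a 10 ns rise time, a 1 m coax with a velocity factor of 0.8 (the post's ~4 ns/m figure implies a foam-dielectric cable; solid-PE coax is nearer 0.66), and the usual knee-frequency rule of thumb f_knee ~ 0.35 / t_rise. These are illustrative assumptions, not measured values:

```python
# Rough numbers for the reflection-timing argument above.
# Assumptions: 10 ns rise time (as quoted for the SB3), 1 m cable,
# velocity factor 0.8, knee frequency f_knee ~ 0.35 / t_rise.

C = 299_792_458.0   # speed of light in vacuum, m/s

t_rise = 10e-9      # edge rise time, s
length = 1.0        # cable length, m
vf = 0.8            # velocity factor (fraction of c)

f_knee = 0.35 / t_rise        # highest frequency with significant energy
wavelength = vf * C / f_knee  # in-cable wavelength at that frequency
t_flight = length / (vf * C)  # one-way flight time along the cable
t_echo = 2 * t_flight         # reflected edge lags the main edge by this much

print(f"knee frequency : {f_knee / 1e6:.0f} MHz")
print(f"wavelength     : {wavelength:.1f} m "
      f"(cable is {length / wavelength:.2f} wavelengths)")
print(f"echo delay     : {t_echo * 1e9:.1f} ns "
      f"vs {t_rise * 1e9:.0f} ns rise time")
```

The echo delay comes out around 8 ns, which lands inside the 10 ns rise time, consistent with the post's point that the reflected edge can arrive while the receiver is still slewing through the decision threshold.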
If the cable is long compared to the wavelength associated with that max frequency, transmission line effects matter. The point is that 5MHz is the frequency of what is ideally a square wave, which contains arbitrarily large frequencies, and so for high harmonics even a meter cable is well into the transmission line regime. Therefore, if reflections are significant, one may expect the cable to introduce jitter at the few-ns level - which of course matters only if the jitter at the SB output was less than that to begin with (which it is).

What kind of spectrum will this sort of jitter have?

--
opaqueice
------------------------------------------------------------------------
opaqueice's Profile: http://forums.slimdevices.com/member.php?userid=4234
View this thread: http://forums.slimdevices.com/showthread.php?t=34406

_______________________________________________
audiophiles mailing list
[email protected]
http://lists.slimdevices.com/lists/listinfo/audiophiles
