> Comparing coaxial and Toslink, there is actually a measurable
> difference between what you get at the other end of the line. The
> archives of rec.audio.pro have discussions of this phenomenon, though
> it's been a number of years since I've kept up with this subject.
>
> The issue is "transport jitter", where the timing between the bits
> varies by some amount (in the range of 5 to 500 picoseconds, if memory
> serves). The bits received are in fact the same as those that were
> transmitted, so if you are transporting the bits with the intention of
> storing them (i.e., on a CD or an MD) it doesn't make a difference what
> you use. However, when you feed these timing variations into a D/A
> converter, it can affect the output waveforms.
On a substandard D/A converter, yeah.
Any decent DSP or D/A converter will have a PLL that locks onto the
incoming clock recovered from the S/PDIF stream and re-clocks the data
onto that (accurate) clock before conversion.
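To put rough numbers on why jitter at the converter matters at all: the amplitude error caused by a timing error is about the signal's slew rate times that error, which for a full-scale sine works out to 2*pi*f*dt relative to peak. A back-of-the-envelope sketch of my own, using the 5-500 ps range quoted above:

```python
import math

def jitter_error_db(f_hz, jitter_s):
    # Worst-case error relative to a full-scale sine: the slew rate peaks
    # at 2*pi*f*A, so a timing error dt produces an amplitude error of
    # roughly 2*pi*f*dt as a fraction of peak, expressed here in dB.
    return 20 * math.log10(2 * math.pi * f_hz * jitter_s)

# Evaluate the jitter range quoted above at a 20 kHz tone
for dt in (5e-12, 500e-12):
    print(f"{dt * 1e12:.0f} ps -> error floor around {jitter_error_db(20e3, dt):.0f} dBFS")
# 5 ps lands near -124 dBFS (harmless); 500 ps lands near -84 dBFS,
# which is within earshot of a 16-bit converter's noise floor.
```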
> Inside single-box setups, clocking tends to be fairly jitter
> resistant. It can be a bigger issue if you put together systems with
> separate transport and D/A sections (e.g., a home theatre surround
> decoder fed by DVD). It's also an issue for pro audio setups where
> they have to transport audio around the studio or remote recording
> location for monitoring. There may have been advances in the past few
> years that reduce such effects; I'm not sure.
The problem in the pro arena is not jitter, but simply clock accuracy. You
can't (easily) digitally mix a source sampled at 44105 Hz with one at 43995
Hz - one of the streams will run out of samples before the other.
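To make the mismatch concrete, here is how fast those two rates drift apart (a trivial sketch; 44105 and 43995 are just the example figures from above):

```python
# Two sources nominally at 44.1 kHz whose crystals disagree
# (rates taken from the example above).
rate_a = 44105  # samples per second
rate_b = 43995  # samples per second

drift_per_minute = (rate_a - rate_b) * 60   # samples one stream gains
slip_seconds = drift_per_minute / 44100     # expressed as seconds of audio
print(drift_per_minute, slip_seconds)       # 6600 samples, ~0.15 s per minute
```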
There are two solutions to this. Sample-rate convert everything to a good
master clock - that's bad and messy, as it adds processing that's otherwise
unneeded. Or get "pro" gear that will slave to an external clock, so that
all the boxes run at the same frequency.
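The first option can be sketched as follows. Note that a real sample-rate converter uses polyphase or windowed-sinc filtering; `resample_linear` is a deliberately crude linear-interpolation stand-in of my own, just to show the idea of mapping one clock's sample grid onto another's:

```python
def resample_linear(samples, src_rate, dst_rate):
    # Crude sample-rate conversion by linear interpolation. Output sample n
    # corresponds to source position n * (src_rate / dst_rate); we
    # interpolate between the two nearest source samples.
    ratio = src_rate / dst_rate
    out = []
    n = 0
    while True:
        pos = n * ratio
        i = int(pos)
        if i + 1 >= len(samples):
            break
        frac = pos - i
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        n += 1
    return out

# Map a short ramp captured at 44105 Hz onto a 44100 Hz master clock
src = [float(i) for i in range(10)]
print(resample_linear(src, 44105, 44100)[:3])
```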
> Anyhow, Toslink tends to exhibit greater amounts of jitter than coax or
> other fiber optic media. If you don't hear a difference, don't let it
> bother you.
Old "audiophile" D/A's were heavily affected by jitter; that's where this
all started. They were designed when digital was new, by people who knew
analog. If you look up the specs on D/A chips from Analog Devices,
Burr-Brown, etc., I would be surprised if they did not publish a maximum
allowable jitter on the input.
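Those datasheet limits can be sanity-checked from first principles: keeping the jitter-induced error under roughly one LSB of an N-bit converter at signal frequency f needs clock jitter below about 1/(2*pi*f*2^N). A rough sketch of my own (exact factor-of-two conventions vary between datasheets):

```python
import math

def max_jitter_s(bits, f_hz):
    # A full-scale sine at f slews at up to 2*pi*f*A per second, so keeping
    # the resulting amplitude error below about one LSB of full scale needs
    # roughly dt < 1 / (2*pi*f*2^bits).
    return 1.0 / (2 * math.pi * f_hz * 2 ** bits)

print(f"16-bit @ 20 kHz: {max_jitter_s(16, 20e3) * 1e12:.0f} ps")
# roughly 120 ps - squarely inside the 5-500 ps transport-jitter range
# quoted above, which is why it audibly mattered on those early designs.
```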