Quotes from:
http://www.digido.com/portal/pmodule_id=11/pmdmode=fullscreen/pageadder_page_id=28/
First paragraph on that page:
"October 2002
I hesitate to remove this older article from our website, as it is still informative, but I highly recommend that those interested in the latest word on this subject please read the chapter on jitter in my new book. Some questions that this previous article has raised have been clarified in our letters section, and of course are covered much better in the book. -BK"
I have that book ("Mastering Audio - the art and the science" by Bob Katz) and in it he does indeed do a better job of explaining jitter, although it's not brilliant.
A better explanation is given here [1].
[1] http://www.jitter.de/english/engc_navfr.html
"Jitter is time-base error. It is caused by varying time delays in the circuit paths from component to component in the signal path." and "The only effect of timebase distortion is in the listening; as far as it can be proved, it has no effect on the dubbing of tapes or any digital to digital transfer (as long as the jitter is low enough to permit the data to be read."
This definition is, erm, incomplete, to say the least.
Jitter is a timing problem; the digital data is not altered. In theory, if we can manage to output the correct data at exactly the right time, there is no jitter.
Correct. However, that is precisely the problem - it is not easy to output the correct data at exactly the right time.
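To make the data-versus-timing distinction concrete, here is a minimal Python sketch (with made-up numbers, nothing measured from a Squeezebox): the sample values are identical in both cases; only the instants at which a hypothetical DAC converts them differ, and that timing error alone produces an error in the analogue output.

    import numpy as np

    fs = 44_100                    # sample rate, Hz
    f = 1_000                      # test tone, Hz
    n = np.arange(4096)

    ideal_t = n / fs                          # perfect clock edges
    rng = np.random.default_rng(0)
    jitter = rng.normal(0.0, 1e-9, n.size)    # assumed 1 ns RMS timing error
    actual_t = ideal_t + jitter               # jittered clock edges

    # Same data, different conversion instants. The error is the
    # difference between converting on time and converting early/late.
    on_time = np.sin(2 * np.pi * f * ideal_t)
    jittered = np.sin(2 * np.pi * f * actual_t)
    err_rms = np.sqrt(np.mean((on_time - jittered) ** 2))
    print(f"RMS error from 1 ns RMS clock jitter: {err_rms:.2e}")

The error scales with both the jitter magnitude and the signal frequency, which is why converter jitter is usually quoted in picoseconds.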
Jitter has nothing to do with the buffer.
"Playback from a DAT recorder usually sounds better than the recording, because there is less jitter. Remember, a DAT machine on playback puts out numbers from an internal RAM buffer memory, locked to its internal crystal clock. A DAT machine that is recording (from its digital input) is locked to the source via its (relatively jittery) Phase Locked Loop. As the figure above illustrates, the numbers still get recorded correctly on tape, although their timebase was jittery while going in. Nevertheless, on playback, that time base error becomes irrelevant, for the numbers are reclocked by the DAT machine!" and
"I repeat: jitter does not affect D-D dubs, it only affects the D to A converter in the listening chain"
Buffers can eliminate any jitter you might have picked up earlier in the signal path, but you still have to read from the buffer at exactly the right intervals. If your power supply is not clean, or the clock is inaccurate for some other reason, you cannot do that, and you get jitter in the output. If you use the Squeezebox's DAC, the Squeezebox's clock is relevant; if you have a good separate DAC, only that matters as far as jitter is concerned.
No, buffers have no effect on jitter. Also, jitter introduced at the source stage affects subsequent digital stages, e.g. an external DAC. One of the reasons expensive CD transports can sound so much better than cheaper units is that they reduce jitter by using high-quality power supplies.
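For what it's worth, the reclocking idea in the Katz quote above can be sketched in a few lines of Python (a toy model, not Squeezebox firmware). A buffer on its own does nothing about jitter; it merely decouples arrival timing from output timing, so the output is exactly as clean, or as dirty, as the clock driving the reads.

    from collections import deque

    fifo = deque()

    def on_sample_arrival(sample):
        # Called whenever a sample turns up -- early, late, in bursts.
        # Arrival timing is irrelevant as long as the FIFO never empties.
        fifo.append(sample)

    def on_read_clock_tick():
        # Called by the local crystal clock at what it believes are exact
        # sample intervals. Output jitter is the jitter of *this* clock,
        # however irregularly the samples arrived.
        if not fifo:
            raise RuntimeError("buffer underrun: a dropout, not jitter")
        return fifo.popleft()

The sketch matches the DAT playback description: the tape (or network) timing drops out of the picture, and only the playback clock matters.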
>jitter is independent of source format.
There have been multiple messages on this list stating that output jitter from the 'box was a lot higher with .flac files (transferred from the server as PCM) than with .mp3 (decoded inside the 'box to PCM). Since I don't think there are two clock chips, the clock chip cannot account for the difference. That leaves the processing inside the Squeezebox (mp3 decoding vs. pass-through) and the bitrate of the data passed to it.
Yes, to reduce the jitter of .mp3s further you'd need to provide cleaner power or a better clock chip. But I was looking for ways to bring the jitter level of uncompressed audio down to .mp3 levels.
According to my tests the 'box has 256 kB of buffer (a 128 kbit/s mp3 plays for a tick over 15 seconds if I pull the network connector mid-play). The same buffer cannot even hold one and a half seconds of uncompressed audio. A CPU spike on the server, a network usage spike, or a bit of load can easily cause interruptions of this magnitude. Network speed is never constant, even under the best conditions.
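The arithmetic behind those figures checks out (a back-of-the-envelope sketch, assuming a full 256 kB buffer and 16-bit/44.1 kHz stereo PCM):

    buffer_bytes = 256 * 1024         # 256 kB, per the pull-the-plug test

    mp3_bytes_per_s = 128_000 / 8     # 128 kbit/s mp3
    pcm_bytes_per_s = 44_100 * 2 * 2  # 44.1 kHz, stereo, 16-bit samples

    print(buffer_bytes / mp3_bytes_per_s)  # ~16.4 s of mp3 (15 s observed)
    print(buffer_bytes / pcm_bytes_per_s)  # ~1.49 s of uncompressed PCM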
You are confusing "jitter" with playback interruptions. Perhaps this diagram will help:
slimserver---buffer---decoding---DAC---analogue out
                         |
                         +-------Digital out

Jitter is introduced at the decoding stage and is caused by the timing errors in the bitstream.
Even after having Googled around a bit, I stand by my assumption that the difference in output jitter is due to the different bitrates, and that a bigger buffer would help close the gap.
Constructive and specific criticism welcome :)
You don't understand what jitter is.
Read the link given above [1].
R. -- http://robinbowes.com
