JohnSwenson;656501 Wrote: 
> Ground plane noise can cause problems even with an optical
> connection.
> 
> It has to do with the transmitter, whether electrical (coax) or
> optical. The input to the transmitter has a "threshold": a voltage at
> which it sees the input as changing from a one to a zero. Noise on
> the ground pin of the transmitter causes that threshold to move up
> and down, changing the point at which it senses the signal as
> switching between one and zero. Since signals do not change
> instantaneously from a low to a high value, changing the threshold
> also changes the time at which the transmitter sees the change taking
> place. This happens with both electrical and optical.
> 
> The ground plane noise can also be coupled to the output as noise on
> the signal itself, and this noise can cause the receiver to
> misinterpret when the change happens as well. Different receiver
> circuits vary significantly in how they handle noise on the signal.
> There is a famous example of a circuit that tried to be very immune
> to input noise, but the way it was implemented badly degraded the RF
> characteristics, causing reflections on the line which were much
> worse than the noise it was trying to fix.
> 
> This can also happen with optical, although because the common
> implementation is so poor to begin with, noise on the signal is
> rarely noticed.
> 
> John S.
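The threshold mechanism John describes can be sketched numerically: for
an edge with a finite rise time, a shift in the comparator threshold
moves the detected crossing by delta_t = delta_V / slew_rate. The rise
time, logic swing, and noise amplitudes below are illustrative
assumptions, not measurements of any real transmitter:

```python
# Sketch: ground noise on the input threshold becomes timing jitter.
# For an edge with finite rise time, the threshold crossing moves by
#   delta_t = delta_v_threshold / slew_rate.

RISE_TIME_NS = 5.0   # assumed 10%-90% rise time of the edge, ns
SWING_V = 3.3        # assumed logic swing, volts

# Approximate slew rate through the middle of the edge (V/ns);
# the 10%-90% rise covers 80% of the full swing.
slew_rate = 0.8 * SWING_V / RISE_TIME_NS

for noise_mv in (10, 50, 100):   # hypothetical threshold-shift amplitudes
    delta_t_ps = (noise_mv / 1000.0) / slew_rate * 1000.0
    print(f"{noise_mv} mV threshold shift -> {delta_t_ps:.0f} ps edge movement")
```

With these assumed numbers, tens of millivolts of ground noise already
move the sensed edge by tens of picoseconds, which is exactly the kind
of data-independent timing modulation being discussed.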
I find this all very interesting. But my reading around the subject
suggested to me that jitter has to be really gross before the bits
cannot be read. This is because there is a preamble to each section of
an S/PDIF signal. You don't just read the 1s and 0s forming the sample
values; you read packets of data with redundancy and error correction
built in. It is relatively easy, as I understand it, to identify the
preambles even if they are irregularly spaced due to jitter. You can
therefore tell where each packet of data starts and stops, and the
redundancy and error correction allow you to extract the sample values
even where there is quite a lot of jitter (as far as I recall, well
into the nanoseconds). (Do tell me if I have got that wrong.)
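The jitter tolerance of the bit recovery can be illustrated with a toy
simulation. S/PDIF data is biphase-mark coded: every bit cell starts
with a transition, and a '1' adds a mid-cell transition, so the decoder
only has to tell 1-UI gaps from 2-UI gaps. At 44.1 kHz stereo the unit
interval is about 177 ns, leaving roughly 88 ns of margin before an
interval is misclassified. This is a simplified sketch, not a full
decoder (real S/PDIF preambles deliberately violate the coding with
3-UI intervals, which is how they are found):

```python
import random

# Sketch: biphase-mark data recovery survives edge jitter far smaller
# than half a unit interval. At 44.1 kHz stereo S/PDIF the UI is
# 1 / (44100 * 128) s, roughly 177 ns.

UI_NS = 1e9 / (44100 * 128)

def encode_bmc(bits):
    """Return transition-to-transition intervals (in UI): a '0' is one
    2-UI interval, a '1' is two 1-UI intervals."""
    intervals = []
    for b in bits:
        intervals.extend([1, 1] if b else [2])
    return intervals

def decode_bmc(intervals_ns):
    """Classify each interval against the 1.5-UI boundary and rebuild
    the bit stream (two short intervals = '1', one long = '0')."""
    bits, pending_short = [], False
    for t in intervals_ns:
        if t < 1.5 * UI_NS:
            if pending_short:
                bits.append(1)
                pending_short = False
            else:
                pending_short = True
        else:
            bits.append(0)
    return bits

random.seed(0)
bits = [random.randint(0, 1) for _ in range(1000)]
jitter_ns = 20.0   # 20 ns peak interval jitter, well under the ~88 ns margin
intervals = [n * UI_NS + random.uniform(-jitter_ns, jitter_ns)
             for n in encode_bmc(bits)]
print(decode_bmc(intervals) == bits)   # True: every bit recovered
```

So nanosecond-scale jitter is indeed nowhere near enough to corrupt the
recovered data, consistent with the claim above.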
Assuming that you have accurately read the bits, what is the problem?
We know the analog band-limited signal has to be reconstructed by a
DAC operating at a sample rate of 44.1 kHz (ignoring upsampling etc.).
Aside from matching the data flows, where does the problem arise?
Is it suggested that the effort of reading and re-clocking the signal
somehow adversely affects the operation of the DAC chip, presumably
because of power surges or something?
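One part of the usual answer can be put in numbers: even with every bit
read perfectly, jitter on the conversion clock itself matters, because
sampling (or reconstructing) a sine at the wrong instant produces an
amplitude error of roughly A * 2*pi*f * delta_t. A back-of-envelope
sketch with illustrative figures, not measurements of any real DAC:

```python
import math

# Sketch: worst-case amplitude error from a timing error delta_t when
# reproducing a full-scale sine at frequency f is A * 2*pi*f * delta_t.

def worst_case_error_dbfs(freq_hz, jitter_s, amplitude=1.0):
    """Peak error, in dB relative to full scale, caused by a sampling
    instant that is off by jitter_s seconds."""
    err = amplitude * 2 * math.pi * freq_hz * jitter_s
    return 20 * math.log10(err)

for jitter_ns in (10.0, 1.0, 0.1):
    db = worst_case_error_dbfs(20_000, jitter_ns * 1e-9)
    print(f"{jitter_ns:5.1f} ns jitter, 20 kHz tone -> error ~{db:.0f} dBFS")
    # prints ~-58, -78, -98 dBFS respectively
```

This is why the same nanoseconds of jitter that are harmless for
reading the bits are the figure people argue about at the DAC's
conversion clock: data recovery and conversion timing are separate
problems.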
I really would find it helpful to grasp this point, because whenever I
read "audiophile" explanations, they seem to be based upon the
assumption either that the receiver cannot accurately read the data in
the S/PDIF stream to reconstruct the 1s and 0s forming the sample
values, or that the DAC is obliged to be clocked to equal a hidden,
secret sample rate which has to be extracted from the S/PDIF signal.


-- 
adamdea
------------------------------------------------------------------------
adamdea's Profile: http://forums.slimdevices.com/member.php?userid=37603
View this thread: http://forums.slimdevices.com/showthread.php?t=84742

_______________________________________________
audiophiles mailing list
[email protected]
http://lists.slimdevices.com/mailman/listinfo/audiophiles
