dsdreamer wrote:
> Skinny;382055 Wrote: 
>> I'm with you on that. Bits will be bits.
>> With you on that as well.
>>
> I would have once said that bits are bits (and so they are) but without
> an accurate clock bits can't be reproduced into the waveform they
> represent. 
> 
<...cut...>

OMG, it's digi..., No - wait..., I see your point...

Let me see if I get this right:

The delivery of each digital sample to the analog portion of the amp 
inevitably has jitter (it is not perfectly periodic). Compromising the 
digital signal (yes yes, even if it incorporates schemes for error 
correcting its own data) may increase this jitter.
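
To put a rough number on that, here's a little Python sketch (the 1 kHz 
test tone and the ~1 ns RMS jitter are numbers I made up for 
illustration): timing error on each clock edge turns directly into 
amplitude error in the waveform the bits represent.

import numpy as np

# Reconstruct a test tone from perfectly periodic clock edges vs. edges
# with random timing error (jitter), then compare the two waveforms.
fs = 44100.0                                  # sample rate, Hz
f0 = 1000.0                                   # test tone, Hz (assumed)
n = np.arange(4096)

ideal_t = n / fs                              # perfect clock
jitter = np.random.normal(0.0, 1e-9, n.size)  # ~1 ns RMS timing error (assumed)
jittered_t = ideal_t + jitter

ideal = np.sin(2 * np.pi * f0 * ideal_t)
jittered = np.sin(2 * np.pi * f0 * jittered_t)

err = jittered - ideal
print("RMS error from ~1 ns of jitter: %.2e" % np.sqrt(np.mean(err ** 2)))

The error scales with both the jitter and the signal frequency (the 
slope of sin(2*pi*f*t) is proportional to f), which is why high 
frequencies suffer first.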

The sheer act of digitizing music will add some artefacts to the sound 
(to the 3rd-party readers: "What?  You didn't know that?").  But these 
are predictable, mostly inaudible, and routinely dealt with (i.e. 
mostly removed).
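
For the 3rd-party readers, here's what "routinely dealt with" looks 
like, in a deliberately exaggerated sketch (8-bit quantization and TPDF 
dither are my choices for illustration, nothing your 16-bit gear 
actually does this coarsely): dither trades signal-correlated 
distortion for benign broadband noise.

import numpy as np

# Quantize a half-scale tone to 8 bits, with and without TPDF dither,
# then compare the worst distortion product above the fundamental.
fs, f0, bits, N = 44100, 1001.0, 8, 44100
q = 2.0 / (2 ** bits)                        # quantizer step for a [-1, 1] range
t = np.arange(N) / fs
x = 0.5 * np.sin(2 * np.pi * f0 * t)

plain = np.round(x / q) * q                  # undithered quantization
tpdf = (np.random.rand(N) - np.random.rand(N)) * q
dithered = np.round((x + tpdf) / q) * q      # TPDF-dithered quantization

freqs = np.fft.rfftfreq(N, 1.0 / fs)
band = freqs > 1.5 * f0                      # everything above the fundamental

def worst_db(y):
    s = np.abs(np.fft.rfft(y * np.hanning(N)))
    return 20 * np.log10(s[band].max() / s.max())

print("worst artefact, undithered: %6.1f dBc" % worst_db(plain))
print("worst artefact, dithered:   %6.1f dBc" % worst_db(dithered))

The undithered version leaves discrete distortion tones parked at 
harmonics of the signal; the dithered one spreads that energy into a 
featureless noise floor, which is far less objectionable to the ear.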

Now, you are saying that by compromising the S/PDIF signal, one might 
increase this jitter and, in turn, "spread out" these unwanted 
artefacts, and by doing so circumvent how they are normally dealt with.
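
That "spreading out" is exactly what periodic jitter does to a 
spectrum.  Another sketch with made-up numbers (a 10 kHz tone, 2 ns of 
jitter wobbling at 1 kHz): the single clean spectral line picks up 
sidebands at f0 +/- fj.

import numpy as np

# Sample a tone with a clock whose edges wobble sinusoidally, then look
# for the jitter-induced sidebands on either side of the tone.
fs, f0, fj, N = 44100.0, 10000.0, 1000.0, 44100
J = 2e-9                                     # 2 ns peak jitter (assumed)
t = np.arange(N) / fs
x = np.sin(2 * np.pi * f0 * (t + J * np.sin(2 * np.pi * fj * t)))

spec = np.abs(np.fft.rfft(x * np.hanning(N)))
db = 20 * np.log10(spec / spec.max() + 1e-20)
freqs = np.fft.rfftfreq(N, 1.0 / fs)

for f in (f0, f0 - fj, f0 + fj):             # the tone and its two sidebands
    k = np.argmin(np.abs(freqs - f))
    print("%5.0f Hz: %7.1f dBc" % (f, db[k]))

With these numbers the sidebands land around -84 dBc, which loops right 
back to your "how good is good enough?" question below.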

> The only real controversy is: how good is good enough? 

Agreed!

Look, an S/PDIF signal is not an OC-192 (roughly the equivalent of 
30,000 phone calls), and I seriously doubt the D/A converter in your amp 
needs a stratum-3 clock to fight jitter problems.
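
Back of the envelope, using the standard rates as I remember them 
(44.1 kHz 16-bit stereo over S/PDIF is one 64-bit frame per sample 
pair, before the biphase-mark line coding; OC-192 runs at 9.95328 
Gbit/s):

# Rough bit-rate comparison: S/PDIF carrying 44.1 kHz / 16-bit stereo
# vs. an OC-192 telecom link.
spdif_bps = 44100 * 64                       # one 64-bit frame per stereo sample
oc192_bps = 9953280000                       # OC-192 line rate, ~9.95 Gbit/s
print("S/PDIF: %.4f Mbit/s" % (spdif_bps / 1e6))
print("OC-192: %.2f Gbit/s (~%d times the data rate)"
      % (oc192_bps / 1e9, oc192_bps // spdif_bps))

Roughly three and a half thousand times the data rate, and the telecom 
people still keep their jitter budgets under control.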

What I'm trying to say is: I believe just about any normal coax cable 
under normal conditions is good enough.  But I can see optical cables 
having "less possible impact" with respect to jitter.

Now that I've voted for optical over coax, I seriously doubt there will 
be any measurable difference between a $5 and a $30 two-meter optical cable.

FYI:
- I don't own a tube amplifier.
+ I do own direct press records.
- I haven't played them in years.
+ ...
- ..


