Patrick Dixon;288459 Wrote:
> ... and a highly accurate timing reference.

Yes, that is true, and that is why S/PDIF is architecturally brain-dead. I don't like it for that reason.
However, some DAC manufacturers, e.g. Benchmark, have published measurements showing that output distortion does not increase with increased input jitter (up to some absurdly high maximum of input jitter). To me this is strong evidence of isolation from the S/PDIF-borne clock; in other words, immunity from input jitter in real-world situations. So, in some cases - one might call these cases 'decent DACs' - the main S/PDIF problem has been circumvented.

Now, one may question the truth of these measurements. I note we are on a manufacturers' forum here, and I am not inclined to believe published measurements are plain lies. And Benchmark is not the only manufacturer making these claims. Secondly, one can imagine how, in principle, it is possible to isolate the S/PDIF clock using buffering techniques (albeit not without challenges): buffer the incoming samples and clock them out with a local oscillator, so conversion timing no longer follows the incoming S/PDIF clock.

That doesn't make me like S/PDIF any more, though. I think it's still dumb. :)
Darren
--
darrenyeats
SB3 / Inguz -> Krell KAV-300i (pre bypass) -> PMC AB-1
Dell laptop -> JVC UX-C30 mini system
------------------------------------------------------------------------
darrenyeats's Profile: http://forums.slimdevices.com/member.php?userid=10799
View this thread: http://forums.slimdevices.com/showthread.php?t=45561
_______________________________________________
audiophiles mailing list
[email protected]
http://lists.slimdevices.com/lists/listinfo/audiophiles
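For anyone curious, the buffering idea mentioned in the post can be illustrated with a toy simulation: samples arrive with jittered timing (the S/PDIF side), sit in a FIFO, and are played out on a fixed local clock after a short pre-fill delay. All names and numbers here are hypothetical; this is only a sketch of the isolation principle, not any vendor's actual design.

```python
def simulate_reclocking(n_samples=1000, nominal_period=1.0, jitter=0.4):
    """Toy model of a buffered ("reclocked") DAC input stage.

    Input samples arrive on a jittery link and are written into a FIFO;
    the output side reads them out on a fixed local clock after a small
    buffering delay, so output timing is independent of input jitter.
    """
    # Jittered arrival times: deterministically alternate early/late
    # around the nominal rate (a bounded, worst-case-style jitter model).
    arrivals = []
    t = 0.0
    for i in range(n_samples):
        t += nominal_period + (jitter if i % 2 else -jitter)
        arrivals.append(t)

    # Output clock: fixed period from a local crystal, started after a
    # pre-fill delay so the FIFO has samples in hand and never runs empty.
    prefill = 4 * nominal_period
    outputs = [arrivals[0] + prefill + i * nominal_period
               for i in range(n_samples)]

    # Output jitter = worst deviation of output intervals from nominal.
    intervals = [b - a for a, b in zip(outputs, outputs[1:])]
    max_output_jitter = max(abs(iv - nominal_period) for iv in intervals)

    # Underrun check: every sample must have arrived before it is played.
    underruns = sum(1 for a, o in zip(arrivals, outputs) if o < a)
    return max_output_jitter, underruns
```

Running this with different `jitter` values shows the point: output intervals stay uniform (set by the local clock) no matter how jittery the arrivals are, as long as the pre-fill buffer absorbs the timing swings. The real-world challenges the post alludes to are exactly the ones this toy skips: keeping the FIFO from over- or under-running when the source clock drifts long-term, which is why practical designs need rate estimation or asynchronous sample-rate conversion.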
