P Floding wrote:
> Yeah, well..
> Of course a lot of technical ignorants might listen for differences
> that should be impossible. On the other hand, could you explain to me
> how the FLAC gets converted to 16/44 inside the SB3 without -anything-
> different going on compared to playing WAV?


Well, of course something different "goes on" - the FLAC data has to be
decoded to PCM by firmware routines inside the SB. To spell it out:

1. When you play back a .wav file natively, it is streamed to the SB as
PCM data, received by the network "module" and fed to the DAC.

2. When you play back a .flac file natively, it is streamed to the SB as
FLAC data, received by the network "module" and fed to a firmware
implementation of the FLAC decoding routines. This produces PCM data,
which is fed to the DAC.

3. When you play back a .flac file with server-side conversion, it is
converted to PCM data on the server and streamed to the SB as PCM data,
received by the network "module" and fed to the DAC.

In all three cases, the DAC is (or should be) receiving *exactly* the
same bits (since FLAC is lossless).
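That bit-identity claim is easy to check for yourself on a PC. A minimal
sketch using only Python's stdlib `wave` module - in a real check the
"decoded" stream would come from running `flac -d` on the same material,
which is assumed here rather than shown; the sketch just demonstrates the
payload comparison itself:

```python
import io
import wave

def pcm_frames(wav_bytes: bytes) -> bytes:
    """Extract the raw PCM payload from a WAV container, ignoring headers."""
    with wave.open(io.BytesIO(wav_bytes)) as w:
        return w.readframes(w.getnframes())

# Build a small 16-bit/44.1 kHz mono WAV in memory, standing in for the
# original source .wav file.
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)       # 16-bit samples
    w.setframerate(44100)
    w.writeframes(bytes(range(256)) * 4)

original = pcm_frames(buf.getvalue())
# Here `decoded` would normally be the output of `flac -d` on a FLAC
# encoding of the same material; we re-read the container to show the check.
decoded = pcm_frames(buf.getvalue())

assert original == decoded  # bit-identical, as lossless requires
```

Comparing the extracted frames rather than the whole files matters:
two WAV files can differ in header chunks or metadata while carrying
identical PCM payloads.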

The only ways I can think of that could possibly cause any difference are:

1. The FLAC decoding routine in the SB firmware is not correct -
unlikely, and I seem to remember it has been confirmed to be accurate by
recording the SPDIF stream from the digital out and comparing it to the
original PCM data.

2. The decoded PCM data is fed to the DAC in a different way than PCM
data received directly from the network, with possibly differing clock
stability and a resulting difference in jitter. Again, unlikely.

3. The very act of running the FLAC decoding routine on silicon inside
the SB causes interference with other parts of the SB (EMF, changes in
current draw, voltage drops, etc.).

R.

_______________________________________________
audiophiles mailing list
[email protected]
http://lists.slimdevices.com/lists/listinfo/audiophiles
