Claiming that 16 bits provides 96 dB of dynamic range ignores the fact
that the distortion rises as the number of bits decreases.  For an
analog system, it is OK to define the dynamic range as the ratio of the
loudest signal that can be produced to the quietest (or the noise
floor).  But I have a problem when people apply that same definition to a
sampled system.  Sure, 16 bits implies 96 dB: 20*log10(2^16) = 96 dB.  But
that means you are willing to accept absurd amounts of distortion (100%)
as a reasonable "signal".  The distortion (due to quantization error)
increases as the number of bits decreases.  With 16 bits, the
distortion is very small (0.0015%), but each bit that is removed
doubles the distortion.  So a signal recorded 6 dB below peak level
will have double the distortion.  How much distortion are you
willing to accept?  If you draw the line at 1% distortion (pretty high
in my opinion), then that requires 7 bits.  That means the dynamic
range of a 16-bit sampled system is only 54 dB: 20*log10(2^(16-7)) = 54 dB
if you limit the allowed distortion to less than 1%.  About as good as an
old cassette deck.  Not my definition of "Hi-Fi".  16 bits was a
compromise based on the available technology of the day.  Many people
still claim it is good enough.  Many others don't agree.
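
If you want to play with the numbers yourself, here is a quick sketch of
the arithmetic above in Python (it uses the same 20*log10(2^N)
dynamic-range figure and the ~100%/2^N distortion approximation; the
function names are just mine):

    import math

    def dynamic_range_db(bits):
        # The classic figure of merit: 20*log10(2^bits).
        return 20 * math.log10(2 ** bits)

    def distortion_percent(bits):
        # Quantization error relative to full scale, roughly 100% / 2^bits,
        # so each bit removed doubles the distortion.
        return 100.0 / 2 ** bits

    for bits in (16, 9, 7):
        print(f"{bits:2d} bits: {dynamic_range_db(bits):5.1f} dB, "
              f"~{distortion_percent(bits):.4f}% distortion")

That prints roughly 96.3 dB / 0.0015% for 16 bits, 54.2 dB / 0.195% for
9 bits, and 42.1 dB / 0.78% for 7 bits, which is where the 54 dB figure
above comes from.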

Terry

