Golden Earring wrote: 
> Morning Arny!
> 
> I wondered if you had any direct comments on the research paper that I
> referenced (which may or may not prove to be good science depending on
> whether the claimed results are confirmed independently). The test
> results were intriguing if valid.
> 

On the plus side Bill Waslo is one of the good guys and knows audio
technology reasonably well.

On the minus side, I see the need, but I am also highly critical of the
analysis of difference signals.

If you are dealing with irrational audiophiles, then tools like
DiffMaker make some sense, because at the very lowest levels
difference-taking can be used to enhance the sensitivity of *REAL* audio
analysis hardware and software, which is typically FFT-based. *The
problem with differences is that they are inherently
non-discriminatory.* 
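For what it's worth, the difference-taking idea can be sketched in a few
lines of Python. This is a toy illustration, not DiffMaker itself: the
1 kHz reference tone and the injected -100 dB third-harmonic artifact
are made up for the example.

```python
import math

def rms(x):
    """Root-mean-square level of a sample list."""
    return math.sqrt(sum(s * s for s in x) / len(x))

fs = 48000.0  # sample rate, Hz
N = 4800      # exactly 100 cycles of 1 kHz, so RMS sums work out cleanly

# Reference signal: 1 kHz full-scale sine.
ref = [math.sin(2 * math.pi * 1000 * n / fs) for n in range(N)]

# Hypothetical device output: the reference plus a -100 dB artifact
# at the third harmonic (3 kHz).
artifact_amp = 10 ** (-100 / 20)
dut = [r + artifact_amp * math.sin(2 * math.pi * 3000 * n / fs)
       for n, r in zip(range(N), ref)]

# Subtracting the reference leaves only the artifact, which can then be
# measured directly even though it is 100 dB below the signal.
diff = [d - r for d, r in zip(dut, ref)]
level_db = 20 * math.log10(rms(diff) / rms(ref))
print(round(level_db, 1))  # → -100.0
```

The catch, as above, is that the residual contains *everything* that
differs between the two signals, with no way to tell distortion from
noise from timing error: the difference is non-discriminatory.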

The real-world situation is that for signals whose errors are reliably
perceptible by humans, everything is clear and readily measurable.
Remember, the threshold of audibility for all audible artifacts is on
the order of 60 or 70 dB (or less) down, if you get your gains set
right. At those levels one can measure things to the point of exhaustion
with low-cost, readily available gear and software. Any artifact that is,
say, 100 dB or less down is not that hard or expensive to analyze to
death.

A lot of people don't know this, but almost all modern audio test gear
first digitizes the signal, often with an off-the-shelf
analog-to-digital converter. Almost all modern pro audio signal
processing gear with analog inputs works in a similar fashion,
converting the signal to the digital domain and then using processing
in the digital domain to obtain the desired audible effects. Currently,
pro audio gear costing $200 and up can be front-ended by an ADC that has
only about -113 dB artifacts of its own. For testing purposes, you don't
need anything better.
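As a rough cross-check on that -113 dB figure (my arithmetic, not a spec
sheet, and it loosely treats the artifact floor as an SNR), the standard
rule of thumb for an ideal N-bit converter puts it at about 18.5
effective bits:

```python
# Rule of thumb for an ideal N-bit converter: SNR ≈ 6.02*N + 1.76 dB.
# Inverting it gives the effective number of bits (ENOB) for a given SNR.
def enob(snr_db):
    return (snr_db - 1.76) / 6.02

bits = enob(113)       # treating the -113 dB artifact floor as an SNR
print(round(bits, 1))  # → 18.5
```

That is comfortably beyond anything the playback chain, let alone the
ear, can resolve.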

So the question becomes: do we need any better measurements than these
to do technical tests with? In almost every case in the rational world
the answer is *no*. Differencing then becomes a solution looking for a
problem, because all of the realistic problems are already solved by
other, highly conventional means that give more precise and more
detailed answers.

With differencing and other add-ons, measurements in the -140 to -160 dB
range are possible. In this range even ordinary copper wire and
resistors have measurable distortion. But pinch yourself: we are
talking twice the normal threshold of hearing *on a logarithmic
scale*. Fun, but mostly to impress people who don't know any better.
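To make the "twice on a logarithmic scale" point concrete (simple
arithmetic, nothing more): dB values add while linear ratios multiply,
so -140 dB is the *square* of a -70 dB amplitude ratio.

```python
threshold = 10 ** (-70 / 20)    # -70 dB as a linear amplitude ratio
deep = 10 ** (-140 / 20)        # -140 dB as a linear amplitude ratio

# -140 dB = -70 dB + -70 dB on the log scale, so the linear ratios multiply:
assert abs(deep - threshold ** 2) < 1e-18

print(threshold)  # ≈ 3.16e-4, around the hearing threshold cited above
print(deep)       # 1e-7, the square of that ratio
```

In other words, those -140 dB artifacts are as far below the audibility
threshold as the threshold itself is below full scale.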

Correct me if I'm wrong here, but I believe Shannon's sampling theorem
(proved from Nyquist's earlier conjecture, which only concerned Morse
code, a digital source) depends, when the sampling frequency is not
completely regular, on the noise components mixed with the signal being
"i.i.d." (independent & identically distributed). If that is *-not-* the
case, there may be theoretical problems in the interpolation process
necessary for the reconstruction of the original analogue signal, IOW
-the precise job that the DAC is attempting-. I think the Cheung-Marks
theorem covers an extreme aspect of this: it shows that the addition of
an arbitrarily small amount of *-non--*i.i.d. noise, such as that
arising from quantisation errors, may make the reconstruction process
"ill-posed", which is maths jargon for saying it no longer has a unique
solution. In particular, the suggestions Shannon made when attempting to
generalise his results to irregular sampling intervals are not correct.
Even small instabilities might have some effect, following this line of
thought.
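Setting the heavy theory aside, even a first-order estimate shows how
tight the timing has to be. This is a back-of-envelope bound of mine,
not the Cheung-Marks result: the worst-case slew of a full-scale sine at
frequency f is 2*pi*f per second, so a timing error dt produces an
amplitude error of at most 2*pi*f*dt.

```python
import math

def max_jitter_for_artifact(f_hz, artifact_db):
    # A full-scale sine at f_hz slews at up to 2*pi*f_hz per second,
    # so a timing error dt yields an amplitude error of up to
    # 2*pi*f_hz*dt. Solve for the dt that keeps that error at the
    # given dB level below full scale.
    return 10 ** (artifact_db / 20) / (2 * math.pi * f_hz)

dt = max_jitter_for_artifact(20_000, -100)
print(dt)  # ≈ 8e-11 s, i.e. roughly 80 picoseconds at 20 kHz
```

So keeping jitter artifacts 100 dB down at the top of the audio band
needs clock stability in the tens-of-picoseconds range, which is
demanding but well within reach of modern converter clocks.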

This is all rather heavy stuff, & I'm guessing that any audible
differences would tend to arise in the quiet passages of source material
with a high dynamic range, where the gap between the signal amplitude &
the noise floor is reduced. Just perhaps some people may be attuned to
"digital jitter" in this wider sense, which includes clock-drift
inaccuracies, in an analogous way to the fact that you found early CDs
(which did have some problems of their own in terms of the engineering
quality of the recordings) preferable to analogue, whereas at that stage
I definitely preferred my (mature-technology) analogue set-up.

It's just a thought. Obviously we don't capture the full concert-hall
dynamic range of an orchestra even with our digital recordings, & if we
did, either the quiet parts would be smothered by ambient noise or the
loud parts would make our ears bleed in the context of domestic
listening. Ultimately it's a question of producing a subjectively
satisfying illusion of the underlying musical performance, which may yet
leave -some- "wriggle-room" for individual preferences.

Still staying open minded atm this side of the pond.

Dave :)


------------------------------------------------------------------------
arnyk's Profile: http://forums.slimdevices.com/member.php?userid=64365
View this thread: http://forums.slimdevices.com/showthread.php?t=106519

_______________________________________________
audiophiles mailing list
[email protected]
http://lists.slimdevices.com/mailman/listinfo/audiophiles
