On 14/07/2015, Ethan Duni <ethan.d...@gmail.com> wrote:
> They're both poor estimators, since they both give non-zero results for the
> entropy rate of a pulse wave (of whatever duty cycle).

So far my best entropy estimator algorithm, which uses sophisticated
correlation analysis, gave an entropy rate of 1 for white noise and
between 0.0002 and 0.02 for periodic square waves of varying
frequencies. Based on that, a practical entropy estimation algorithm
runs into several limits when trying to get closer to zero for a fully
periodic signal (a simpler sketch in code follows this list):

- Windowing artifacts: unless the analysis window length is an exact
multiple of the period length, the truncated edges of the analysis
window will introduce artifacts that make the entropy estimate nonzero
(similar to DFT windowing artifacts).

- Quantization artifacts: unless the period is an integer number of
samples, each cycle will be slightly different due to various
quantization artifacts. Unless your algorithm models that, the entropy
estimate will be nonzero.

- Computational limit: I can push the estimate for periodic square
waves down to around 0.000001-0.001 (within 0.1% error), but then the
algorithm becomes really slow. An infinitely long analysis window
would require infinite computation.

- Uncertainty: how do you know that the output of some black-box
process is truly deterministic? Answer: you can't. You can only
measure 'observed' probabilities, so the 'measured' entropy never
reaches zero in a finite amount of time.
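To make this concrete, here is a minimal sketch of a much simpler
estimator than my correlation-based one: a plug-in estimate of
H(next sample | previous samples), in bits per sample, on coarsely
quantized data. The parameters (levels, order) and the behavior noted
in the comments are illustrative assumptions, not results from my
algorithm. Note how the non-integer period of the test square wave
triggers the quantization artifact described above:

import numpy as np
from collections import Counter

def entropy_rate_estimate(x, levels=8, order=2):
    """Plug-in estimate of H(x[n] | x[n-order..n-1]) in bits."""
    lo, hi = x.min(), x.max()
    # Quantize to a small alphabet so the histograms are populated.
    q = np.clip(((x - lo) / (hi - lo + 1e-12) * levels).astype(int),
                0, levels - 1)
    joint, ctx = Counter(), Counter()
    for i in range(order, len(q)):
        c = tuple(q[i - order:i])
        joint[c, q[i]] += 1
        ctx[c] += 1
    n = sum(joint.values())
    # H(next | context) = H(context, next) - H(context)
    h_joint = -sum(v / n * np.log2(v / n) for v in joint.values())
    h_ctx = -sum(v / n * np.log2(v / n) for v in ctx.values())
    return h_joint - h_ctx

rng = np.random.default_rng(0)
print(entropy_rate_estimate(rng.standard_normal(100_000)))  # large
t = np.arange(100_000)
square = np.sign(np.sin(2 * np.pi * t / 37.3))  # non-integer period
print(entropy_rate_estimate(square))            # small but nonzero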

Of course you can "cheat" and simply hard-code a threshold into your
algorithm, below which it assumes the signal is fully deterministic
and outputs zero. Where you put that threshold of 'close enough' is
entirely arbitrary. (Most real-world signals are never fully periodic,
so that may not be of much use.)
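In code, that cheat is just a clamp on the estimator sketched above;
the cutoff value here is made up:

THRESHOLD_BITS = 1e-3  # arbitrary 'close enough' cutoff

def thresholded_estimate(x):
    h = entropy_rate_estimate(x)  # plug-in sketch from above
    return 0.0 if h < THRESHOLD_BITS else h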

I wouldn't call an entropy estimator that gives a result within 2%
error a "poor" estimator just because it doesn't give zero for
(seemingly) deterministic signals. (It's only "seemingly"
deterministic; you can never fully know unless you make infinitely
many observations.) A simpler estimator will just have a different
error distribution with larger errors.

After several experiments, I conclude: yes, you can build fairly
high-precision practical entropy estimators for arbitrary signals.
Where you put the limit of "good enough" depends on your application.
(Do you really need an infinitely long sinc kernel to reconstruct your
digital samples? In practice there's a filter that is "good enough"
and computable.)
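As an aside, a "good enough" computable reconstruction filter could be
as simple as a truncated, Hann-windowed sinc; half_width below is an
arbitrary illustrative choice, not a recommendation:

import numpy as np

def windowed_sinc_interp(x, t, half_width=16):
    """Bandlimited estimate of the 1-D sample array x at fractional
    index t, using a Hann-tapered sinc of about 2*half_width taps."""
    n0 = int(np.floor(t))
    n = np.arange(max(0, n0 - half_width + 1),
                  min(len(x), n0 + half_width + 1))
    k = t - n                       # fractional distance to each tap
    w = 0.5 * (1.0 + np.cos(np.pi * k / half_width))  # Hann taper
    return float(np.sum(x[n] * np.sinc(k) * w))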

"You insist that there is something a machine cannot do. If you tell
me precisely what it is a machine cannot do, then I can always make a
machine which will do just that." -John von Neumann

-P
--
dupswapdrop -- the music-dsp mailing list and website:
subscription info, FAQ, source code archive, list archive, book reviews, dsp 
links
http://music.columbia.edu/cmc/music-dsp
http://music.columbia.edu/mailman/listinfo/music-dsp
