On 15/10/2014, r...@audioimagination.com <r...@audioimagination.com> wrote:
> sorry, Peter, but we be unimpressed.

I gave you a practical, working *algorithm* that does *something*.

In my opinion, it (roughly) approximates "expected entropy", and I
have found various practical, real-world uses for this algorithm in
categorizing arbitrary data.
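For the record, a frequency-based entropy estimate of the general kind
I mean can be sketched in a few lines of Python. To be clear, this is
a generic illustration I'm writing here, *not* the exact algorithm
from my earlier post, and the block size of 8 is an arbitrary choice:

```python
from math import log2

def empirical_bit_entropy(bits, block=8):
    """Estimate entropy per bit from the empirical distribution of
    non-overlapping `block`-bit words.

    NOTE: a generic frequency-based sketch, not the exact algorithm
    posted earlier in this thread.
    """
    counts = {}
    n = 0
    for i in range(0, len(bits) - block + 1, block):
        word = tuple(bits[i:i + block])
        counts[word] = counts.get(word, 0) + 1
        n += 1
    # Shannon entropy of the observed block distribution, normalized
    # to bits per bit (1.0 = maximally random at this block size).
    return -sum((c / n) * log2(c / n) for c in counts.values()) / block
```

A constant or strictly periodic sequence scores 0 here, while i.i.d.
random bits score close to 1 — which is the sense in which such an
estimate measures "randomness".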

Apparently, several other people also agree with me that the measure
of "entropy" is the measure of "randomness", again, let me quote this
very relevant paper:

Marek Lesniewicz (2014). "Expected Entropy as a Measure and Criterion
of Randomness of Binary Sequences." Przeglad Elektrotechniczny,
Vol. 90, pp. 42–46.

So it seems I'm not the only one on this planet who thinks _exactly_
this way. Therefore, either your argument is invalid, or all the
other people who wrote those scientific entropy-estimation papers are
"crackpots" too.

If you think my algorithm doesn't approximate "entropy" because your
religion says otherwise... then my question to you is:

"What does it approximate then?"
--
dupswapdrop -- the music-dsp mailing list and website:
subscription info, FAQ, source code archive, list archive, book reviews, dsp links
http://music.columbia.edu/cmc/music-dsp
http://music.columbia.edu/mailman/listinfo/music-dsp
