Re: [music-dsp] entropy

2014-11-09 Thread colonel_hack
On Mon, 13 Oct 2014, Peter S wrote: So I found that the number of transitions correlates with randomness (statistically). Ok. By eyeball: not very random 0101001001110110111001100100011001101101 very random :-)
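
The transition count behind the "by eyeball" judgment is easy to compute. A minimal sketch of the idea (not Peter's actual code; the test string is the one quoted above):

    # Count adjacent-bit transitions in a binary string.
    def count_transitions(bits):
        return sum(1 for a, b in zip(bits, bits[1:]) if a != b)

    s = "0101001001110110111001100100011001101101"
    print(count_transitions(s), "transitions in", len(s), "bits")
    # An ideal random string of length N averages (N-1)/2 transitions;
    # counts far from that suggest structure rather than randomness.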

Re: [music-dsp] entropy

2014-10-21 Thread Richard Wentk
It would probably sound like a noise gate, and perhaps not terribly exciting. On the next level up, frequency domain and/or variable filter adaptive noise cancellation is a specialised but well-understood subset of DSP lore. (Try any cell phone, VOIP service, or video chat system for a demo.)

Re: [music-dsp] entropy

2014-10-21 Thread Risto Holopainen
On 20 October 2014, Max Little max.a.lit...@gmail.com wrote: Many times in the past I've found very unexpected applications to audio DSP of mathematical concepts which initially seemed entirely unrelated to DSP. Particularly, this happens in nonlinear DSP, which is a very broad area. So I don't,

Re: [music-dsp] entropy

2014-10-20 Thread Max Little
Then just do Shannon's definition over a space with equidistributed probability. Define it as so, and you have the precise same fundamental framework. Seriously, there is no difference. I might find myself in the situation where I am given a nonuniform distribution. Then Shannon and Hartley

Re: [music-dsp] entropy

2014-10-20 Thread Ethan Duni
I might find myself in the situation where I am given a nonuniform distribution. Then Shannon and Hartley formulas would give a different answer. And the Hartley formula would be inapplicable, since it assumes a uniform distribution. So you'd have to use the Shannon framework, both to calculate

Re: [music-dsp] entropy

2014-10-20 Thread Max Little
Shannon entropy generalizes Hartley entropy. Renyi entropy generalizes Shannon entropy. By transitivity, then, Renyi also generalizes Hartley. Well, Renyi directly generalizes Hartley. You can apply Hartley to any distribution, it doesn't have to be uniform. That's if we agree on the formula
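
For readers keeping score, the formulas under discussion are the standard ones from the literature (not spelled out in the thread itself). The Renyi entropy of order alpha is

    H_\alpha(X) = \frac{1}{1-\alpha} \log_2 \sum_{i=1}^{n} p_i^\alpha

with \alpha = 0 counting the support and giving Hartley's H_0 = \log_2 |X|, while the limit \alpha \to 1 recovers Shannon's H_1 = -\sum_i p_i \log_2 p_i, which is the sense in which Renyi generalizes both.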

Re: [music-dsp] entropy

2014-10-20 Thread Max Little
You can apply Hartley to any distribution, it doesn't have to be uniform. You don't apply Hartley to a distribution. You apply it to a *random variable*, Since when? You apply the Hartley formula to the distribution, as with all entropy-like formulas, such as Shannon's formula. These are

Re: [music-dsp] entropy

2014-10-20 Thread Andy Farnell
On Mon, Oct 20, 2014 at 10:00:13AM -0700, Ethan Duni wrote: Meanwhile, I'll point out that it's been a long time since anybody on this thread has even attempted to say anything even tangentially related to music dsp. The first thing that came to my mind after seeing Peter's image processing

Re: [music-dsp] entropy

2014-10-20 Thread Max Little
Many times in the past I've found very unexpected applications to audio DSP of mathematical concepts which initially seemed entirely unrelated to DSP. Particularly, this happens in nonlinear DSP, which is a very broad area. So I don't, as a rule, discount any mathematics prejudicially, because you

Re: [music-dsp] entropy

2014-10-19 Thread Sampo Syreeni
On 2014-10-14, Max Little wrote: Still, I might find myself finding a use for Hartley's 'entropy', maybe someday. I don't discount any maths really, I don't have any prejudices. Then just do Shannon's definition over a space with equidistributed probability. Define it as so, and you have

[music-dsp] entropy estimation

2014-10-18 Thread Peter S
Hi Everyone, Instead of long and boring discussions: let me show you something interesting. Entropy estimation in 2D sampled signals Original image http://scp.web.elte.hu/entropy/think.jpg Estimated entropy distribution http://scp.web.elte.hu/entropy/entropy.gif Doesn't this give you some
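
The linked images aren't reproduced in the archive. As a rough sketch of what a 2D entropy-estimation pass might look like under Peter's transition-counting approach (an assumption; the preview doesn't show his actual method), one could threshold the image to bits and count neighbour transitions per block:

    import numpy as np

    def transition_map(img, block=8):
        # Threshold pixels to bits, then count horizontal and vertical
        # neighbour disagreements inside each block as a crude
        # "entropy" score; higher counts = busier, noisier regions.
        bits = (img > img.mean()).astype(np.uint8)
        h, w = bits.shape
        out = np.zeros((h // block, w // block))
        for by in range(h // block):
            for bx in range(w // block):
                b = bits[by*block:(by+1)*block, bx*block:(bx+1)*block]
                out[by, bx] = (np.sum(b[:, 1:] != b[:, :-1]) +
                               np.sum(b[1:, :] != b[:-1, :]))
        return out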

Re: [music-dsp] entropy

2014-10-16 Thread STEFFAN DIEDRICHSEN
Sent from my iPhone. On 16.10.2014 at 02:16, Paul Stoffregen p...@pjrc.com wrote: On 10/15/2014 12:45 PM, Peter S wrote: I gave you a practical, working *algorithm*, that does *something*. In the 130 messages you've posted since your angry complaint regarding banishment from an

Re: [music-dsp] entropy

2014-10-16 Thread STEFFAN DIEDRICHSEN
Sorry for the low-entropy message I sent. Paul, we never had any filters on this list, and I think that's good. I simply delete most of this thread without reading it. The risk of missing something is quite low. I liked the link to xkcd, that was a practical take-away. Best, Steffan PS:

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
On 16/10/2014, Paul Stoffregen p...@pjrc.com wrote: In the 130 messages you've posted since your angry complaint regarding banishment from an IRC channel nearly 2 weeks ago, I do not recall seeing any source code, nor any pseudo-code, equations or description that appeared to be practical and

Re: [music-dsp] entropy

2014-10-16 Thread Laszlo Toth
On Wed, 15 Oct 2014, Alan Wolfe wrote: For some reason, All I'm seeing are your emails Peter. not sure who you are chatting to or what they are saying in response :P Good point, I observed the same. Laszlo Toth Hungarian Academy of Sciences * Research Group

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
Let me show one further way of seeing what the Hartley entropy (also called the max-entropy) means: it is the *maximum* amount of information your message can _possibly_ contain. To turn it into a simple, real-world example, let's imagine that you send me a single bit. In that case,

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
In other words, what I'm saying: the *maximum* possible amount of information your message can contain correlates inversely with how strongly your symbols are correlated. By approximating the decorrelation between your symbols, I can approximate the *maximum* possible amount of information in your

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
...and what I'm proposing is a simple, first-order decorrelation approximator which, I believe, roughly approximates the *expected* entropy of an arbitrary message. For better and less biased decorrelation approximators, feel free to consult the scientific literature list I posted. --

Re: [music-dsp] entropy

2014-10-16 Thread Thomas Strathmann
On 10/16/2014 09:22 AM, Peter S wrote:

    entropy = 0
    state = b(1)
    for i = 2 to N
        if state != b(i) then
            entropy = entropy + 1
        end if
        state = b(i)
    end for

But that's not additive, is it? For a sequence of length one the algorithm always yields entropy = 0. But for a sequence
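
A direct Python transcription of the quoted pseudocode, for anyone who wants to try it (b can be any string or sequence of symbols; nothing is assumed beyond the pseudocode itself):

    def transition_count(b):
        # Peter's first-order estimator: count the positions where
        # the current symbol differs from the previous one.
        entropy = 0
        state = b[0]
        for i in range(1, len(b)):
            if state != b[i]:
                entropy += 1
            state = b[i]
        return entropy

    # As Thomas observes, any length-1 sequence yields 0:
    assert transition_count("0") == 0
    assert transition_count("0101") == 3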

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
If for some reason, _all_ that you can think of is _separate_ messages and probabilities, then let me translate this for you into simple Shannonian terms so that you can understand it clearly: Let's imagine that you have a message, which is essentially a string of bits. What we're trying to

Re: [music-dsp] entropy

2014-10-16 Thread Gunnar Eisenberg
Hi Peter, I'm really glad to see some new faces on this list from time to time. You told us a long introductory story of how badly people treated you on the old channel. Just out of curiosity and to get the whole picture right, how many messages per day did you drop off there on average? Did some

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
Is that _all_ you can care about? I'm talking about a potential way of categorizing arbitrary data, and all you argue about is the number of messages. How is that even remotely relevant to the topic?

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
Look, when I go into a subject, I like to go into it in detail. We barely even scratched the surface of the topics relevant to 'entropy'. All that we've been talking about so far is just the basics. Quantifying information is not something that can be discussed in depth in only a dozen messages

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
Let me make this clear distinction: Entropy does *NOT* mean information. Entropy means information density. ...which is just another way of phrasing: probability of new information, if you prefer to use Shannon's term instead. If all the nearby symbols are correlated, then the probability of new

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
On 16/10/2014, Peter S peter.schoffhau...@gmail.com wrote: Let me make this clear distinction: Entropy does *NOT* mean information. Entropy means information density. ...which is just another way of phrasing: probability of new information, if you prefer to use Shannon's term instead. If

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
Saying that we cannot *precisely* estimate entropy because we cannot calculate the correlation between all the bits, is like saying we cannot *precisely* reconstruct a digital signal because the sinc function we need to convolve it with is infinite in both directions, and thus, would need infinite

Re: [music-dsp] entropy

2014-10-16 Thread Alberto di Bene
Just set a filter in my Thunderbird... From now on, all messages on this list having Peter S as originator are directly discarded into the trash bin. 73 Alberto I2PHD

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
On 16/10/2014, Alberto di Bene albertodib...@alice.it wrote: Just set a filter in my Thunderbird... From now on, all messages on this list having Peter S as originator are directly discarded into the trash bin.

Re: [music-dsp] entropy

2014-10-16 Thread Phil Burk
On 10/16/14, 3:43 AM, Peter S wrote: Quantifying information is not something that can be discussed in depth in only a dozen messages in a single weekend Very true. Have you considered writing a book on entropy? You clearly can generate a lot of content on a daily basis and could easily

Re: [music-dsp] entropy

2014-10-16 Thread STEFFAN DIEDRICHSEN
On 16 Oct 2014, at 12:36, Peter S peter.schoffhau...@gmail.com wrote: Is that _all_ you can care about? I'm talking about a potential way of categorizing arbitrary data, How should this work? To categorize data you need categories. Therefore you need to understand / interpret data.

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
On 16/10/2014, STEFFAN DIEDRICHSEN sdiedrich...@me.com wrote: How should this work? To categorize data you need categories. I named the two 'categories' that my algorithm distinguishes: 1) noise 2) non-noise. Where you put the 'threshold' between noise and non-noise, in other words, how you

Re: [music-dsp] entropy

2014-10-16 Thread Andreas Tell
Peter, On 16 Oct 2014, at 17:29, Peter S peter.schoffhau...@gmail.com wrote: http://media-cache-ec0.pinimg.com/736x/57/e4/6e/57e46edcd3f18dd405db3dd756b6dca0.jpg What I say will only make sense for those people who think. In other words, my words will make sense for only about 2% of the

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
On 16/10/2014, Phil Burk philb...@mobileer.com wrote: On 10/16/14, 3:43 AM, Peter S wrote: Quantifying information is not something that can be discussed in depth in only a dozen messages in a single weekend Very true. Have you considered writing a book on entropy? You clearly can

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
On 16/10/2014, Thomas Strathmann tho...@pdp7.org wrote: But that's not additive is it? For a sequence of length one the algorithm always yields entropy = 0. But for a sequence of length greater than one this need not be the case. Yep, good observation. The number of correlations between N bits

Re: [music-dsp] entropy

2014-10-16 Thread B.J. Buchalter
On Oct 16, 2014, at 12:55 PM, Peter S peter.schoffhau...@gmail.com wrote: Basically I just wanted some 'critique' on this theory, in other words, a logical analysis - maybe someone notices something that I missed. That's all, thanks for your comments. Well, I see one thing that could be an

Re: [music-dsp] entropy

2014-10-16 Thread Peter S
On 16/10/2014, B.J. Buchalter b...@mhlabs.com wrote: Well, I see one thing that could be an issue. If you send your algorithm the following sequence: 010101010101010101010… It will say that there is one bit of information for each bit it receives. But in this case, that is not true;
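
B.J.'s counterexample is easy to check against the transition-counting estimator quoted earlier in the thread (re-stated here so the snippet is self-contained):

    def transition_count(b):
        return sum(1 for a, c in zip(b, b[1:]) if a != c)

    s = "01" * 16  # perfectly predictable alternating sequence
    print(transition_count(s), "of", len(s) - 1, "possible transitions")
    # Prints "31 of 31": the first-order estimator scores this maximally
    # random, even though a two-symbol model predicts it exactly.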

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
On 14/10/2014, Peter S peter.schoffhau...@gmail.com wrote: Again, the minimal number of 'yes/no' questions needed to guess your message with 100% probability is _precisely_ the Shannon entropy of the message: Let me demonstrate this using a simple real-world example. Let's imagine that your

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
On 14/10/2014, Ethan Duni ethan.d...@gmail.com wrote: Well, I merely said it's interesting that you can define a measure of information without probabilities at all, if desired. That's a measure of *entropy*, not a measure of information. How you define 'information' is entirely subjective,

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
On 14/10/2014, Max Little max.a.lit...@gmail.com wrote: I haven't seen Hartley entropy used anywhere practical. Hartley entropy is routinely used in cryptography, and usually implies 'equal probability'. This is why I recommended some of you guys take a few basic lessons in cryptography, to have

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
Let me show you the relevance of Hartley entropy another way: Which of the following 10 passwords contains more entropy, and is thus better for protecting your account? a) DrhQv7LMbP b) PHGF4V7uod c) ndSk4YrEls d) C38ysVOEDh e) 3XfFmMT13Y f) ayuyR9azD8 g) zuvptYRa1m h) ssptl9pOGt i) KDY2vwqYnV j)
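
For what it's worth, Hartley entropy can't distinguish between them: assuming each password is drawn uniformly from the 62-symbol alphanumeric alphabet (an assumption based on their appearance), all ten score identically:

    import math

    alphabet = 62   # a-z, A-Z, 0-9
    length = 10
    # Hartley entropy H_0 = log2(size of the message space):
    h0 = length * math.log2(alphabet)
    print(f"H_0 = {h0:.1f} bits")  # about 59.5 bits, same for all ten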

Re: [music-dsp] entropy

2014-10-15 Thread Brendan Jones
On 10/15/2014 01:01 PM, Peter S wrote: Let me show you the relevance of Hartley entropy another way: Here's xkcd's take on password strength. http://xkcd.com/936/

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
It also follows that if the symbol space is binary (0 or 1), then assuming a fully decorrelated and uniformly distributed sequence of bits, the entropy per symbol (bit) is precisely log2(2) = 1. From that, it logically follows that an N bit long decorrelated and uniform sequence of bits (= white

Re: [music-dsp] entropy

2014-10-15 Thread Theo Verelst
Before I would seemingly agree with some follies going on here: I believe, like I've written for solid reasons, that the normal Information Theory that led to a theoretical underpinning of various interesting EE activities since long ago, is solidly understood by its makers, and when rightly

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
Let me express the Hartley entropy another way: The Hartley entropy gives the size of the symbol space, so it is a good approximator and upper bound for the actual entropy. If the symbols are fully decorrelated, then the _maximum_ amount of time it takes to search through the entire symbol space

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
On 15/10/2014, Theo Verelst theo...@theover.org wrote: Like why would professionally self-respecting scientists need to worry about colleagues as to use 20 character passwords based on analog random data? Once all your money from your bank account gets stolen and goes up in smoke, you'll

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
On 15/10/2014, Theo Verelst theo...@theover.org wrote: Like why would professionally self-respecting scientists need to worry about colleagues as to use 20 character passwords based on analog random data? FYI: When I communicate with my bank, before logging in, I have to move my mouse randomly

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
Okay, let's phrase it this way - what I essentially showed is that the 'Shannon entropy' problem can be turned into a 'symbol space search' problem, where the entropy inversely correlates with the probability of finding the solution in the problem space. Often, we don't care *precisely* what the

Re: [music-dsp] entropy

2014-10-15 Thread Phil Burk
Hello Peter, I'm trying to understand this entropy discussion. On 10/15/14, 2:08 AM, Peter S wrote: Let's imagine that your message is 4 bits long. If we take the minimal number of 'yes/no' questions I need to guess your message with a 100% probability, and take the base 2 logarithm of

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
On 15/10/2014, Phil Burk philb...@mobileer.com wrote: That would take 16 questions. But instead of asking those 16 questions, why not ask: Is the 1st bit a 1? Is the 2nd bit a 1? Is the 3rd bit a 1? Is the 4th bit a 1? Good question! In a practical context, that's impossible; normally we
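
The arithmetic behind this exchange, as a sketch: with all 16 four-bit messages equally likely, any strategy that halves the candidate set per question needs log2(16) = 4 questions, and Phil's four per-bit questions are exactly such a strategy:

    import math

    def questions_needed(n_messages):
        # Minimum yes/no questions to single out one of n equally
        # likely messages with certainty.
        return math.ceil(math.log2(n_messages))

    print(questions_needed(16))  # 4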

Re: [music-dsp] entropy

2014-10-15 Thread rbj
Peter S peter.schoffhau...@gmail.com wrote: Okay, let's phrase it this way - what I essentially showed is that the 'Shannon entropy' problem can be turned into a 'symbol space search' problem, where the entropy inversely correlates with the probability of finding the solution in the

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
Let me show another way, why the total amount of information in a system is expected to correlate with the total amount of decorrelation in the system. Let's imagine we have a simple information system, with two pieces of information. Say, information A, and information B. Let's imagine, these

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
On 15/10/2014, r...@audioimagination.com r...@audioimagination.com wrote: sorry, Peter, but we be unimpressed. I gave you a practical, working *algorithm*, that does *something*. In my opinion, it (roughly) approximates 'expected entropy', and I found various practical real-world uses of this

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
On 15/10/2014, Peter S peter.schoffhau...@gmail.com wrote: So it seems I'm not the only one on this planet who thinks _exactly_ this way. Therefore, I think your argument is invalid, or all the other people who wrote those scientific entropy estimation papers are _all_ also crackpots. (*)

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
...couldn't you throw those information theory books aside for a few moments, and start thinking about information theory concepts with a fresh mind and an out-of-the-box approach? Again, I'm not here to argue about whose religion is the best. Rather, I'm trying to quantify

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
...and we didn't even go into 'entropy of algorithms' and other fun topics...

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
Notice that in my system of symbol A and symbol B, I can still quantify the amount of information based on the size of the symbol space using the Hartley entropy H_0, without needing to know the _actual_ probability distributions, because I can estimate the *expected* entropy based on the size of the

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
Academic person: There is no way you could do it! Impossibru!! Practical person: Hmm... what if I used a simple upper-bound approximation instead?

Re: [music-dsp] entropy

2014-10-15 Thread Alan Wolfe
For some reason, All I'm seeing are your emails Peter. not sure who you are chatting to or what they are saying in response :P On Wed, Oct 15, 2014 at 2:18 PM, Peter S peter.schoffhau...@gmail.com wrote: Academic person: There is no way you could do it! Impossibru!! Practical person: Hmm...

Re: [music-dsp] entropy

2014-10-15 Thread Peter S
Never mind. I was just trying to find out how we could characterize some arbitrary data. Apparently, all that some guys see in this is: that does NOT fit into my world view!!

Re: [music-dsp] entropy

2014-10-15 Thread Paul Stoffregen
On 10/15/2014 12:45 PM, Peter S wrote: I gave you a practical, working *algorithm*, that does *something*. In the 130 messages you've posted since your angry complaint regarding banishment from an IRC channel nearly 2 weeks ago, I do not recall seeing any source code, nor any pseudo-code,

Re: [music-dsp] entropy

2014-10-14 Thread Peter S
On 14/10/2014, ro...@khitchdee.com ro...@khitchdee.com wrote: Peter, How would you characterize the impact of your posts on the entropy of this mailing list, starting with the symbol space that gets defined by the different perspectives on entropy :-) I merely showed that: 1) 'entropy'

Re: [music-dsp] entropy

2014-10-14 Thread Peter S
Again, the minimal number of 'yes/no' questions needed to guess your message with 100% probability is _precisely_ the Shannon entropy of the message: For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of yes/no questions

Re: [music-dsp] entropy

2014-10-14 Thread Peter S
Another way of expressing what my algorithm does: it estimates 'decorrelation' in the message by doing a simple first-order approximation of decorrelation between bits. The more random a message is, the more decorrelated its bits are. Otherwise, if the bits are correlated, that is not random and

Re: [music-dsp] entropy

2014-10-14 Thread Max Little
Longest discussion thread so far I think! The discussion reminded me of more general measures of entropy than Shannon's, examples are the Renyi entropies: http://en.wikipedia.org/wiki/R%C3%A9nyi_entropy Some might find it amusing and relevant to this discussion that the 'Hartley entropy' H_0 is

Re: [music-dsp] entropy

2014-10-14 Thread Peter S
So, instead of academic hocus-pocus and arguing about formalisms, what I'm rather concerned about is: - What are the real-world implications of the Shannon entropy problem? - How could we possibly use this to categorize arbitrary data?

Re: [music-dsp] entropy

2014-10-14 Thread Peter S
On 14/10/2014, Max Little max.a.lit...@gmail.com wrote: P.S. Any chance people could go offline for this thread now please? It's really jamming up my inbox and I don't want to unsubscribe ... Any chance your mailbox has the possibility of setting up a filter that moves messages with the

Re: [music-dsp] entropy

2014-10-14 Thread Peter S
On 14/10/2014, Max Little max.a.lit...@gmail.com wrote: Some might find it amusing and relevant to this discussion that the 'Hartley entropy' H_0 is defined as the base 2 log of the cardinality of the sample space of the random variable ... Which implies that if the symbol space is binary (0

Re: [music-dsp] entropy

2014-10-14 Thread Peter S
Which is another way of saying: a fully decorrelated sequence of bits has the maximum amount of entropy. So if we try to estimate the 'decorrelation' (randomness) in the signal, then we can estimate 'entropy'.

Re: [music-dsp] entropy

2014-10-14 Thread Max Little
Well, it just says that there is a measure of information for which the actual distribution of symbols is (effectively) irrelevant. Which is interesting in its own right ... Max On 14 October 2014 11:59, Peter S peter.schoffhau...@gmail.com wrote: On 14/10/2014, Max Little

Re: [music-dsp] entropy

2014-10-14 Thread Peter S
On 14/10/2014, Max Little max.a.lit...@gmail.com wrote: Well, it just says that there is a measure of information for which the actual distribution of symbols is (effectively) irrelevant. Which is interesting in its own right ... Feel free to think outside the box. Welcome to the real world,

Re: [music-dsp] entropy

2014-10-14 Thread Max Little
Prescient. Apparently, Kolmogorov sort of, perhaps, agreed: Discussions of information theory do not usually go into this combinatorial approach [that is, the Hartley function] at any length, but I consider it important to emphasize its logical independence of probabilistic assumptions, from Three

Re: [music-dsp] entropy

2014-10-14 Thread Peter S
On 14/10/2014, Sampo Syreeni de...@iki.fi wrote: We do know this stuff. We already took the red pill, *ages* ago. Peter's problem appears to be that he's hesitant to take the plunge into the math, proper. Starting with the basics Didn't you recently tell us that you have no clue of 'entropy

Re: [music-dsp] entropy

2014-10-14 Thread Ethan Duni
Although, it's interesting to me that you might be able to get some surprising value out of information theory while avoiding any use of probability ... Hartley entropy doesn't avoid any use of probability, it simply introduces the assumption that all probabilities are uniform which greatly

Re: [music-dsp] entropy

2014-10-14 Thread Max Little
Hartley entropy doesn't avoid any use of probability, it simply introduces the assumption that all probabilities are uniform which greatly simplifies all of the calculations. How so? It's defined as the log cardinality of the sample space. It is independent of the actual distribution of the

Re: [music-dsp] entropy

2014-10-14 Thread Sampo Syreeni
On 2014-10-14, Max Little wrote: Hartley entropy doesn't avoid any use of probability, it simply introduces the assumption that all probabilities are uniform which greatly simplifies all of the calculations. How so? It's defined as the log cardinality of the sample space. It is independent

Re: [music-dsp] entropy

2014-10-14 Thread Max Little
Right, and that is exactly equivalent to using Shannon entropy under the assumption that the distribution is uniform. Well, we'd probably have to be clearer about that. The Hartley entropy is invariant to the actual distribution (provided all the probabilities are non-zero, and the sample space
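
The equivalence being debated here is the standard identity: for a uniform distribution over N outcomes,

    H = -\sum_{i=1}^{N} \frac{1}{N} \log_2 \frac{1}{N} = \log_2 N = H_0

so Shannon entropy evaluated at the uniform distribution coincides with the Hartley entropy of the same sample space.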

Re: [music-dsp] entropy

2014-10-14 Thread Sampo Syreeni
On 2014-10-14, Max Little wrote: Hmm .. don't shoot the messenger! I merely said, it's interesting that you don't actually have to specify the distribution of a random variable to compute the Hartley entropy. No idea if that's useful. Math always has this precise tradeoff: more general but

Re: [music-dsp] entropy

2014-10-14 Thread Theo Verelst
Max Little wrote: ... Well, we'd probably have to be clearer about that. The Hartley entropy is invariant to the actual distribution Without going into the comparison of wanting to be able to influence the lottery to achieve a higher winning chance, I looked up the Shannon/Hartley theorem,

Re: [music-dsp] entropy

2014-10-14 Thread Ethan Duni
The Hartley entropy is invariant to the actual distribution (provided all the probabilities are non-zero, and the sample space remains unchanged). No, the sample space does not require that any probabilities are nonzero. It's defined up-front, independently of any probability distribution.

Re: [music-dsp] entropy

2014-10-14 Thread Max Little
OK yes, 0^0 = 1. Delete the bit about probabilities needing to be non-zero I guess! Think you're taking what I said too seriously, I just said it's an interesting formula! Kolmogorov seemed to think so too. M. On 14 October 2014 18:37, Ethan Duni ethan.d...@gmail.com wrote: The Hartley entropy

Re: [music-dsp] entropy

2014-10-14 Thread Max Little
Yes, don't have time for a long answer, but all elegantly put. I'm just reiterating the formula. And saying it's interesting. Maths is really just patterns, lots of them are interesting to me, regardless of whether there is any other extrinsic 'meaning' to those patterns. M. On 14 October 2014

Re: [music-dsp] entropy

2014-10-14 Thread Sampo Syreeni
On 2014-10-14, Max Little wrote: Maths is really just patterns, lots of them are interesting to me, regardless of whether there is any other extrinsic 'meaning' to those patterns. In that vein, it might even be the most humanistic of sciences. Moreso even than poetry:

Re: [music-dsp] entropy

2014-10-14 Thread Max Little
If you look at the real audio signals out there, which statistic would you expect them to follow under the Shannonian framework? A flat one? Or alternatively, what precise good would it do to your analysis, or your code, if you went with the equidistributed, earlier, Hartley framework? Would

Re: [music-dsp] entropy

2014-10-14 Thread Ethan Duni
The relevant limit here is: lim_{x->0} x*log(x) = 0. It's pretty standard to introduce a convention of 0*log(0) = 0 early on in information theory texts, since it avoids a lot of messy delta/epsilon stuff in the later exposition (and since the results cease to make sense without it, with empty
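
For the record, the limit Ethan cites follows from L'Hopital's rule on the 0 * (-infinity) form:

    \lim_{x \to 0^+} x \log x
      = \lim_{x \to 0^+} \frac{\log x}{1/x}
      = \lim_{x \to 0^+} \frac{1/x}{-1/x^2}
      = \lim_{x \to 0^+} (-x) = 0

which justifies the 0 log 0 = 0 convention.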

Re: [music-dsp] entropy

2014-10-13 Thread Peter S
Again, to understand what 'entropy estimation' is, please feel free to consult the scientific literature list I posted. If your mind is poisoned by academic books so much that you're unable to think 'outside the box' any longer, then you will never grasp this concept.

Re: [music-dsp] entropy

2014-10-13 Thread Rohit Agarwal
and in a slightly different era. This is like a trek in the space of ideas. From: Peter S peter.schoffhau...@gmail.com Sent: A discussion list for music-related DSP music-dsp@music.columbia.edu Date: Mon, October 13, 2014 12:32 pm Subject: Re: [music-dsp

Re: [music-dsp] entropy

2014-10-13 Thread Peter S
On 13/10/2014, r...@audioimagination.com r...@audioimagination.com wrote: What was claimed: number of binary transitions _correlates_ with entropy (statistically) it's a mistaken claim, Peter. in case you hadn't gotten it, you're getting a bit outa your league. there are some very sharp and

Re: [music-dsp] entropy

2014-10-13 Thread Peter S
Let's imagine that I'm a banker, and you open a savings account in my bank, to put your life savings into my bank. So you send your life savings to me, say you send me $1,000,000. When you want to access your money, we communicate via messages. You have a secret password, and you send it

Re: [music-dsp] entropy

2014-10-13 Thread Peter S
...which implies that the Shannon entropy problem is - on one level - a guessing game problem - What is the minimal number of 'yes/no' questions to guess your data with 100% probability? The more random your data is, the harder it is to guess.

Re: [music-dsp] entropy

2014-10-13 Thread Peter S
On 13/10/2014, Peter S peter.schoffhau...@gmail.com wrote: When we try to estimate the entropy of your secret, we cannot _precisely_ know it. (*) (*) ... except when none of your symbols are correlated, for example when they come from a cryptographically secure random number generator. In that

Re: [music-dsp] entropy

2014-10-13 Thread Rohit
Peter, How would you characterize the impact of your posts on the entropy of this mailing list, starting with the symbol space that gets defined by the different perspectives on entropy :-)

Re: [music-dsp] entropy

2014-10-12 Thread Peter S
On 11/10/2014, r...@audioimagination.com r...@audioimagination.com wrote: all decompression is is decoding. you have tokens (usually binary bits or a collection of bits) and a code book (this is something that you need to understand regarding Huffman or entropy coding), you take the token and

Re: [music-dsp] entropy

2014-10-12 Thread Peter S
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote: What I'm trying to find out is: - What is the entropy distribution (information distribution) of the message? - Where _exactly_ is the entropy (information) located in the message? - Could that entropy be extracted or estimated

Re: [music-dsp] entropy

2014-10-12 Thread Peter S
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote: To me, this message can be clearly separated into three distinct parts:

    000 - almost no information, all zeros
    1001011001101 - lots of information (lots of entropy)
    0 - almost no

Re: [music-dsp] entropy

2014-10-12 Thread Richard Dobson
On 12/10/2014 11:31, Peter S wrote: On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote: To me, this message can be clearly separated into three distinct parts:

    000 - almost no information, all zeros
    1001011001101 - lots of information (lots of entropy)

Re: [music-dsp] entropy

2014-10-12 Thread Richard Wentk
None at all, because Shannon only makes sense if you define your symbols first, or define the explicit algorithm used to specify symbols. Relying on human pattern recognition skills to say 'oh look, here's a repeating bit pattern' says nothing useful about Shannon entropy. The whole point

Re: [music-dsp] entropy

2014-10-12 Thread padawa...@obiwannabe.co.uk
This is covered by Quine and Shannon, although I cannot cite you chapter and verse. Basically, you are correct. A message alone is only half of the story: relative to some pre-agreed decoder matrix defining lowest entropy. (Contrast to chemical entropy which has a fixed baseline set by physics of

Re: [music-dsp] entropy

2014-10-12 Thread Peter S
On 12/10/2014, Richard Wentk rich...@wentk.com wrote: Relying on human pattern recognition skills to say 'oh look, here's a repeating bit pattern' says nothing useful about Shannon entropy. The whole point of Shannon analysis is that it's explicit, completely defined, robust, and
