On 14/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
Again, the minimal number of 'yes/no' questions needed to guess your
message with 100% probability is _precisely_ the Shannon entropy of
the message:
Let me demonstrate this using a simple real-world example.
Let's imagine that your
On 14/10/2014, Ethan Duni ethan.d...@gmail.com wrote:
Well, I merely said it's interesting that you can define a measure of
information without probabilities at all, if desired.
That's a measure of *entropy*, not a measure of information.
How you define 'information' is entirely subjective,
On 14/10/2014, Max Little max.a.lit...@gmail.com wrote:
I haven't seen Hartley entropy used anywhere practical.
Hartley entropy is routinely used in cryptography, and it usually implies
'equal probability'.
This is why I recommended some of you guys take a few basic lessons in
cryptography, to have
Hi,
Just a reminder that if you are new to the list you should read the
music-dsp FAQ. It contains answers to both technical _and_
administrative questions that often come up on the list. If your question
appears in the FAQ it is safe to assume that it has been discussed on the
list many times in
Let me show you the relevance of Hartley entropy another way:
Which of the following 10 passwords contains more entropy, and is thus
better for protecting your account?
a) DrhQv7LMbP
b) PHGF4V7uod
c) ndSk4YrEls
d) C38ysVOEDh
e) 3XfFmMT13Y
f) ayuyR9azD8
g) zuvptYRa1m
h) ssptl9pOGt
i) KDY2vwqYnV
j)
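A rough sketch of how one might put numbers on this, assuming each
password above is drawn uniformly at random from the 62-character
alphanumeric set [A-Za-z0-9] (that character set is my assumption here):
the Hartley entropy is length * log2(62), identical for every entry.

import math

# Hartley entropy of a password drawn uniformly from a character set:
# H_0 = length * log2(size of the character set).
def hartley_password_entropy(length, charset_size):
    return length * math.log2(charset_size)

print(hartley_password_entropy(10, 62))  # ~59.5 bits, the same for all entries

So under that assumption it is the symbol space, not the particular
string, that decides the entropy.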
On 10/15/2014 01:01 PM, Peter S wrote:
Let me show you the relevance of Hartley entropy another way:
Here's xkcd's take on password strength.
http://xkcd.com/936/
It also follows that if the symbol space is binary (0 or 1), then
assuming a fully decorrelated and uniformly distributed sequence of bits,
the entropy per symbol (bit) is precisely log2(2) = 1.
From that, it logically follows that an N bit long decorrelated and
uniform sequence of bits (= white
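A quick numerical check of the per-bit figure, as a sketch assuming a
fair and independent bit source (the empirical per-bit entropy should
land near 1, and the total near N):

import math, random

def shannon_entropy_per_bit(bits):
    # Empirical Shannon entropy of a 0/1 sequence, in bits per symbol.
    n = len(bits)
    p1 = sum(bits) / n
    return -sum(p * math.log2(p) for p in (p1, 1.0 - p1) if p > 0)

N = 100000
bits = [random.getrandbits(1) for _ in range(N)]
h = shannon_entropy_per_bit(bits)
print(h, N * h)  # ~1.0 bit per symbol, ~N bits in total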
Before I would seemingly agree with some of the follies going on here: I
believe, as I've written for solid reasons, that the normal
Information Theory that led to a theoretical underpinning of various
interesting EE activities since long ago is solidly understood by its
makers, and when rightly
Let me express the Hartley entropy another way:
The Hartley entropy is the logarithm of the size of the symbol space, so
it is a good approximator of, and an upper bound on, the actual entropy.
If the symbols are fully decorrelated, then the _maximum_ amount of time
it takes to search through the entire symbol space
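A sketch of that bound, with a hypothetical 26-letter alphabet and a
hypothetical message length L = 8 (both are just placeholders): H_0
depends only on the size of the symbol space, and 2^(L*H_0) is the
worst-case number of guesses for an exhaustive search over the message.

import math

def hartley_entropy(symbol_space_size):
    # H_0 depends only on the size of the symbol space, not on probabilities.
    return math.log2(symbol_space_size)

alphabet = "abcdefghijklmnopqrstuvwxyz"  # hypothetical 26-symbol space
L = 8                                    # hypothetical message length

h0_per_symbol = hartley_entropy(len(alphabet))   # ~4.70 bits per symbol
total_h0 = L * h0_per_symbol                     # ~37.6 bits for the message
worst_case_guesses = len(alphabet) ** L          # = 2^total_h0, ~2.1e11

print(h0_per_symbol, total_h0, worst_case_guesses)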
On 15/10/2014, Theo Verelst theo...@theover.org wrote:
Like why would professionally self-respecting
scientists need to worry about colleagues so much as to use 20-character
passwords based on analog random data?
Once all your money from your bank account gets stolen and goes up in
smoke, you'll
On 15/10/2014, Theo Verelst theo...@theover.org wrote:
Like why would professionally self-respecting
scientists need to worry about colleagues so much as to use 20-character
passwords based on analog random data?
FYI: When I communicate with my bank, before logging in, I have to
move my mouse randomly
Okay, let's phrase it this way - what I essentially showed is that the
'Shannon entropy' problem can be turned into a 'symbol space search'
problem, where the entropy inversely correlates with the probability
of finding the solution in the problem space.
Often, we don't care *precisely* what the
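To put a number on that inverse relation - a sketch, assuming the
symbols are uniformly distributed so the actual entropy equals H_0:

H = 16                             # hypothetical entropy of the message, in bits
p_single_guess = 2.0 ** -H         # probability one random guess finds the solution
expected_guesses = 2.0 ** (H - 1)  # average guesses for an exhaustive search
print(p_single_guess, expected_guesses)

Each extra bit of entropy halves the chance of a lucky guess and doubles
the expected search effort.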
Hello Peter,
I'm trying to understand this entropy discussion.
On 10/15/14, 2:08 AM, Peter S wrote:
Let's imagine that your message is 4 bits long,
If we take the minimal number of 'yes/no' questions I need to guess
your message with a 100% probability, and take the base 2 logarithm of
On 15/10/2014, Phil Burk philb...@mobileer.com wrote:
That would take 16 questions. But instead of asking those 16 questions,
why not ask:
Is the 1st bit a 1?
Is the 2nd bit a 1?
Is the 3rd bit a 1?
Is the 4th bit a 1?
Good question! In a practical context, that's impossible; normally we
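Setting the practical objection aside, a tiny sketch of the two
questioning strategies, assuming the 4-bit message is uniformly random
(the hidden value below is just a placeholder): asking about each bit
needs exactly 4 = log2(16) questions, while guessing whole values one by
one can need up to 16.

secret = 0b1011                    # hypothetical hidden 4-bit message (one of 16)

# Strategy 1: ask about each bit directly -- always 4 questions (= log2(16)).
bit_answers = [(secret >> i) & 1 for i in range(4)]  # answers to "is bit i a 1?"
print(len(bit_answers))            # 4

# Strategy 2: ask "is it 0? is it 1? ..." -- up to 16 questions in the worst case.
value_questions = 0
for candidate in range(16):
    value_questions += 1
    if candidate == secret:
        break
print(value_questions)             # 12 here; 16 in the worst case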
Peter S peter.schoffhau...@gmail.com wrote:
Okay, let's phrase it this way - what I essentially showed is that the
'Shannon entropy' problem can be turned into a 'symbol space search'
problem, where the entropy inversely correlates with the probability
of finding the solution in the
Let me show another way why the total amount of information in a
system is expected to correlate with the total amount of
decorrelation in the system.
Let's imagine we have a simple information system with two pieces of
information: say, information A and information B. Let's imagine
these
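A rough numerical illustration, using two artificial bit streams as
stand-ins for A and B: when B is independent of A, the joint entropy is
about H(A) + H(B) = 2 bits per pair, but when B simply duplicates A
(full correlation), it collapses to about 1 bit per pair.

import math, random
from collections import Counter

def joint_entropy(pairs):
    # Empirical Shannon entropy of (A, B) pairs, in bits per pair.
    counts = Counter(pairs)
    n = len(pairs)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

N = 100000
A = [random.getrandbits(1) for _ in range(N)]
B_independent = [random.getrandbits(1) for _ in range(N)]
B_copy = A[:]                                      # fully correlated with A

print(joint_entropy(list(zip(A, B_independent))))  # ~2.0 bits per pair
print(joint_entropy(list(zip(A, B_copy))))         # ~1.0 bit per pair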
On 15/10/2014, r...@audioimagination.com r...@audioimagination.com wrote:
sorry, Peter, but we be unimpressed.
I gave you a practical, working *algorithm* that does *something*.
In my opinion, it (roughly) approximates 'expected entropy', and I
found various practical real-world uses of this
On 15/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
So it seems I'm not the only one on this planet who thinks _exactly_
this way. Therefore, I think your argument is invalid, or all the
other people who wrote those scientific entropy estimation papers are
_all_ also crackpots. (*)
...couldn't you throw those information theory books aside for a few
moments, and start thinking about information theory concepts with a
fresh mind and an out-of-the-box approach?
Again, I'm not here to argue about whose religion is the best.
Rather, I'm trying to quantify
...and we didn't even go into 'entropy of algorithms' and other fun topics...
--
dupswapdrop -- the music-dsp mailing list and website:
subscription info, FAQ, source code archive, list archive, book reviews, dsp
links
http://music.columbia.edu/cmc/music-dsp
Notice that in my system of symbol A and symbol B, I can still
quantify the amount of information based on the size of the symbol
space using the Hartley entropy H_0 without needing to know the
_actual_ probability distributions, because I can estimate the
*expected* entropy based on the size of the
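A sketch of that kind of estimate, on a made-up observed sequence over
the symbols A and B: H_0 needs only the number of distinct symbols seen,
and the measured Shannon entropy can never exceed it.

import math
from collections import Counter

def hartley_h0(sequence):
    # Upper-bound estimate: log2 of the number of distinct symbols observed.
    return math.log2(len(set(sequence)))

def shannon_h(sequence):
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

seq = "AABABBBAABAABBBB"                 # hypothetical observed data
print(hartley_h0(seq), shannon_h(seq))   # H_0 = 1.0 >= H ~ 0.99 (bits per symbol)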
Academic person: There is no way you could do it! Impossibru!!
Practical person: Hmm... what if I used a simple upper-bound
approximation instead?
For some reason, all I'm seeing are your emails, Peter. Not sure who you
are chatting to or what they are saying in response :P
On Wed, Oct 15, 2014 at 2:18 PM, Peter S peter.schoffhau...@gmail.com
wrote:
Academic person: There is no way you could do it! Impossibru!!
Practical person: Hmm...
Never mind. I was just trying to find out how we could characterize some
arbitrary data.
Apparently all that some guys see from this is "that
does NOT fit into my world view!!"
On 10/15/2014 12:45 PM, Peter S wrote:
I gave you a practical, working *algorithm* that does *something*.
In the 130 messages you've posted since your angry complaint regarding
banishment from an IRC channel nearly 2 weeks ago, I do not recall
seeing any source code, nor any pseudo-code,