On 17/02/2014, Theo Verelst theo...@theover.org wrote:
In short, I can play/enhance in the 192/24 domain, and then switch
over to 44.1/16, and hear the difference easily, any day of the week.
End of analysis.
Isn't that difference caused by
1) 192-44.1 kHz resampling (which is not trivial,
On 17/02/2014, Theo Verelst theo...@theover.org wrote:
And that has been my exact point the whole time, it's deluding people
into a perfection which isn't that, perfection.
I think the video doesn't suggest 'perfection'. It clearly states that
smaller bit depth means more noise. But since the
On 17/02/2014, Richard Dobson richarddob...@blueyonder.co.uk wrote:
At the atomic level, the best optical mirror in the
world is not perfectly flat, because the atoms themselves aren't.
Unless one builds a mirror using nanotechnology, atom-by-atom, forming
a perfect molecular grid :)
(Which
On 17/02/2014, Theo Verelst theo...@theover.org wrote:
and I am sure CD and High
Definition tracks can be AD-converted at high quality from master tapes,
no reason to presume some ridiculous 13 bits max in general, even if
there's reason to explain certain limitations.
I assume his numbers
On 18/02/2014, Charles Z Henry czhe...@gmail.com wrote:
Basically, we know that magnetic patterns on tape do not work exactly like
bits. They also do not resemble exactly the analog signal they were
recorded with. The playback signal is the result of lots of small magnetic
bumps of varying
On 18/02/2014, Theo Verelst theo...@theover.org wrote:
I hope
so, because I'm quitting this discussion, I want to progress, not teach
half the world of so-so signal processors the foundations of EE,
university level.
I'm sure the world would be happy if you did a better introductory DAC
On 18/02/2014, Andrew Simper a...@cytomic.com wrote:
Well current thinking (as in for over 10 years) is 1-bit isn't
enough because you end up not being able to do correct noise shaping:
Sounds reasonable. Seems like 1-bit DACs used to be common, but
that's no longer the case.
Here's an
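For anyone who wants to poke at this, here is a rough Python/numpy sketch of my own (not something from Andrew's post or the linked material) of a first-order, error-feedback 1-bit quantizer; it only illustrates the basic noise-shaping idea, not why higher-order or multi-bit modulators are preferred:

import numpy as np

def one_bit_noise_shaper(x):
    # First-order error-feedback noise shaper: each output sample is +1 or -1,
    # and the quantization error is fed back so it is pushed toward high frequencies.
    y = np.empty_like(x, dtype=float)
    err = 0.0
    for n, sample in enumerate(x):
        v = sample + err          # add back the previous quantization error
        y[n] = 1.0 if v >= 0.0 else -1.0
        err = v - y[n]            # error to be fed back on the next sample
    return y

# usage: 1-bit quantize a heavily oversampled sine
fs = 64 * 44100
t = np.arange(8192) / fs
bits = one_bit_noise_shaper(0.5 * np.sin(2 * np.pi * 1000.0 * t))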
Hello Theo,
On 27/02/2014, Theo Verelst theo...@theover.org wrote:
http://www.youtube.com/watch?v=5snh9UvAno4&feature=youtu.be
May I ask what kind of software you use at 0:18? I couldn't
recognize it. What are those graph processors actually doing, and what
is their role for this
Checked the video again; it seems like you have some signal (music),
then you process that through some modular graph processor (maybe
something FFT-based?), plus (?) some hardware processor(s) (reverb?),
and then the two signals differ in the 2-4k range.
I'm not sure, what's that supposed to
Okay, so in a nutshell you are doing de-mastering and re-mastering on
a track (if I understand correctly).
It's still not clear, what is the conclusion from all this?
- Peter
On 06/03/2014, Charles Z Henry czhe...@gmail.com wrote:
1) Steep filter to isolate speech (100-4k?).
Not a great idea overall. You might just mangle the audio a bit more. It
loses some temporal qualities when filtered too much. Listening to speech
relies a lot on temporal cues that you
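For reference, here is a small sketch (my own illustration, assuming Python with scipy) of the kind of steep 100 Hz-4 kHz band-pass being discussed; Charles's point is that aggressive filtering like this can hurt the temporal cues, so treat it as an experiment rather than a recommendation:

import numpy as np
from scipy.signal import butter, sosfiltfilt

def isolate_speech_band(x, fs, lo=100.0, hi=4000.0, order=8):
    # Steep Butterworth band-pass around the speech band; zero-phase filtering
    # (sosfiltfilt) avoids adding extra phase distortion, but the temporal
    # smearing mentioned above is still a real concern.
    sos = butter(order, [lo, hi], btype='bandpass', fs=fs, output='sos')
    return sosfiltfilt(sos, x)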
On 30/03/2014, Didier Dambrin di...@skynet.be wrote:
You keep saying that everyone seems to agree, but I'd still like to hear
just one simple example of that, in a wav file. Until I've heard it, I'll
keep saying that dithering to 16-bit is pointless.
I wish I could show you a sample of that
Hi Everyone,
I was told that my invitation contained more personal information than
it should, thus it will be removed. So, here's the short version of it
again:
You are invited to the #music-dsp IRC chatroom on EFNet (with a dash
in the middle, as opposed to the original #musicdsp).
tl;dr
On 06/10/2014, Charles Z Henry czhe...@gmail.com wrote:
--- SO at what levels are sounds represented like a Fourier Transform:
1. The cochlea--for each frequency, there is a point along the
cochlea where the basilar membrane has its largest displacement. The
inner hair cells are most
Jon,
On 06/10/2014, Jon Boley j...@jboley.com wrote:
In the hearing science community, it is well-established that the cochlea
acts as a filterbank (via the resonances that you mentioned) and each
auditory nerve fiber responds to sounds within a limited frequency range.
Thanks for confirming
On 07/10/2014, Theo Verelst theo...@theover.org wrote:
I'm fine with someone taking an FFT as a filter or filter bank
It _is_ a filter bank, literally. Each FFT/DFT bin is like an
individual bandpass filter. The FFT spectrum is the sum of these
individual, overlapping bandpass filters.
Graphed
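A quick way to see that (a small numpy sketch of my own): take bin k of an N-point windowed DFT, treat the window times the complex exponential as an FIR impulse response, and look at its magnitude response; it is a band-pass centred at k*fs/N, and the neighbouring bins give the overlapping band-passes.

import numpy as np

N, k = 64, 8
w = np.hanning(N)
h = w * np.exp(-2j * np.pi * k * np.arange(N) / N)  # "bin k" viewed as an FIR filter
H = np.fft.fft(h, 4096)                             # zero-padded frequency response
mag_db = 20 * np.log10(np.abs(H) + 1e-12)
# mag_db has its peak at normalized frequency k/N -- one band-pass of the bank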
On 07/10/2014, Zhiguang Zhang ericzh...@gmail.com wrote:
The FFT relates
to a ‘filter’ in a way in which you can digitally reconstruct the original
frequency by picking out a magnitude bin and doing an inverse FFT. That way
you can get a sine tone back.
Couldn't we still call it an 'analysis
On 07/10/2014, Zhiguang Zhang ericzh...@gmail.com wrote:
The view of the windowing function having bandpass and cutoff regions is
misleading. The windowing function is in the time domain, whereas filters
operate with a frequency response in the frequency domain.
And doesn't the time domain
On 07/10/2014, Sampo Syreeni de...@iki.fi wrote:
Talking about filterbanks implicitly says that all of the filters are
somehow structurally the same, and linear.
Not for me. That probably depends on how you define the word 'filterbank'.
I looked up Wikipedia, which defines Filter bank as:
In
On 07/10/2014, Jon Boley j...@jboley.com wrote:
What I have done is take several auditory nerve responses, bandpass-filter
them, and add them all up.
It sounds like a vocoded version of the original. Even with just a few
channels, it is easy to understand speech that has been processed in this
On 07/10/2014, robert bristow-johnson r...@audioimagination.com wrote:
What I have done is take several auditory nerve responses, bandpass-filter
them, and add them all up.
where did you get them?
were all 3 auditory nerve fibers sampled? (i sorta doubt it.)
how did they measure these
On 07/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
Regardless of Wikipedia, for me the word 'filterbank' does not imply
linearity. How is that even meant? As linearly spaced in frequency,
or as f(a)+f(b) = f(a+b)?
And if he meant the latter, then say I take a 'linear' filterbank
On 07/10/2014, Bjorn Roche bj...@xowave.com wrote:
I was just on a call with someone who researches hearing and
psychoacoustics. He happened to mention gamma tone filters, which I had
never heard of. I may have misunderstood, since it was a tangent, but I
believe he said it's a commonly used
On 08/10/2014, r...@audioimagination.com r...@audioimagination.com wrote:
there is actually a difference between digital signals and discrete-time
signals. not the same thing. but with a sufficiently high sample rate,
you can reasonably simulate a
continuous-time signal and the system
On 08/10/2014, Theo Verelst theo...@theover.org wrote:
What is the point of saying a transducer has filtering in it?
Knowing exactly what type of filtering is going on, can help one
understand the workings of the transducer/system.
Also, the idea of neurons processing the information is a bit
On 08/10/2014, r...@audioimagination.com r...@audioimagination.com wrote:
maybe before you do, check out that Bob Adams paper to make sure you're not
preaching the same sermon from 17 years ago to the choir.
Do you have a link to it?
On 08/10/2014, r...@audioimagination.com r...@audioimagination.com wrote:
but again, compare the total number of neural impulses per second and the
number of bits per second flying at you with high-quality audio. there is
an information reduction going on there.
By chance, have you made some
Further parallels between [A] analog-to-digital conversion and [B]
pulse-rate-based neural auditory encoding:
1) Both encode a continuous signal:
---[A] encoding an analog electric signal
---[B] encoding a waveform on a membrane (*)
2) Both encode the signal in discrete form:
---[A] encoding
On 08/10/2014, r...@audioimagination.com r...@audioimagination.com wrote:
i'm not a biologist nor a physiologist. i am only repeating stuff i
remember from a fascinating presentation at the IEEE Mohonk conference in
1997. i've thought that it was about 100 or fewer firings per second when
The reason why there is no correlation between the time-domain PCM
entropy and the rate of neural firing is this:
One works in the time domain, another works in the frequency domain.
No direct correlation. We mostly agreed that the cochlea acts as a
filterbank, creating a frequency-amplitude
On 09/10/2014, mads dyrholm misterm...@gmail.com wrote:
All the ear/neurons have to do is project the stimulus onto long term memory
- Granular synthesis if you will. An entire symphony could in principle be
perceived from a single bit (the bit that says PLAY).
This is an interesting
On 09/10/2014, r...@audioimagination.com r...@audioimagination.com wrote:
1) The binary entropy of both PCM sine waves is just about the same -
the amplitude of a sinusoidal partial of a signal does not directly
affect the binary entropy. Both PCM sound files are the same size, and
contain
For these reasons, I think digital entropy in the classical sense has
little direct relevance for our neural processes.
Another thought experiment: compare what happens when listening to a
sine wave and a (non-bandlimited) square wave of the same amplitude.
The square wave has a lot more
On 09/10/2014, Charles Z Henry czhe...@gmail.com wrote:
Your thought experiments are fine, but you're clearly just feeling out
how to define entropy for audio signals.
Since that's what r b-j asked :)
All I did was try to test this analytically.
It's *not* a well defined problem.
Exactly,
On 09/10/2014, Ethan Duni ethan.d...@gmail.com wrote:
Amplitude has a direct, strong relationship to signal entropy (not
information, which is a property of pairs of random variables).
Unless it is a non-bandlimited (naive) square wave.
In that case, that claim is absolutely not true.
That's
On 09/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
On 09/10/2014, Ethan Duni ethan.d...@gmail.com wrote:
Amplitude has a direct, strong relationship to signal entropy (not
information, which is a property of pairs of random variables).
Unless it is a non-bandlimited (naive) square
On 09/10/2014, Ethan Duni ethan.d...@gmail.com wrote:
Amplitude has a direct, strong relationship to signal entropy (not
information, which is a property of pairs of random variables).
Let's assume I have a sinusoidal signal.
Let's assume I amplify it to 10x.
Where does new entropy come from?
On 09/10/2014, Ethan Duni ethan.d...@gmail.com wrote:
You need way more than 1 bit to represent any square wave
Correction: 1 bit PER SAMPLE (either 1 or 0, hi or low - a naive
square only has those two states...)
(I thought it was obvious that that was what I meant)
On 09/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
Let's assume I have a sinusoidal signal.
Let's assume I amplify it to 10x.
Where does new entropy come from?
I make an even better example:
Let's assume I amplify the signal by a power of two (x2, x4, x8, x16, etc.)
Assuming integer
On 09/10/2014, Charles Z Henry czhe...@gmail.com wrote:
See all I hear you keep arguing is about bits and quantization...
Exactly my point - It's pretty flawed to talk about entropy of bits
here, because as soon as the digital signal leaves your sound card's
D/A converter, you no longer have
On 09/10/2014, Sampo Syreeni de...@iki.fi wrote:
So, actually, when you talk about entropy, you ought to define the model
it's calculated against,
The entropy estimation I assumed was the number of transitions (either
1-0 or 0-1) in the binary numerical representation of the signal (I
thought
On 09/10/2014, Ethan Duni ethan.d...@gmail.com wrote:
Let's assume I have a sinusoidal signal.
Let's assume I amplify it to 10x.
Where does new entropy come from?
It comes from the amplification.
What is your entropy model?
So you aren't talking about literal sine waves then, you're talking
On 09/10/2014, Sampo Syreeni de...@iki.fi wrote:
Which of course brings us back to your very point: Peter really should
understand the basics of information theory before applying it.
Could you offer me a reading that you think would clarify the concepts
that you think I applied in an improper
On 09/10/2014, Ethan Duni ethan.d...@gmail.com wrote:
I did not claim anything about entropy of
continuous signals,
Aren't we talking about impulses in auditory nerves (among other things)?
Those things live in the analog domain.
Nerves fire discrete impulses, so those are definitely not
On 10/10/2014, Theo Verelst theo...@theover.org wrote:
[...] I fail to see the point.
all I meant:
1) Entropy can be estimated (and gave an example of that)
2) Entropy can be extracted (and gave an example of that)
On 10/10/2014, Ethan Duni ethan.d...@gmail.com wrote:
I'm calling them analog because they are obviously, unequivocally
continuous analog signals.
What do you think the relevance of that is, from the point of view of
transmitting neural signals?
Do you think that if they were not 'continuous',
On 10/10/2014, Ethan Duni ethan.d...@gmail.com wrote:
Your entropy estimator does not estimate Shannon entropy.
Exactly. Which was never claimed in the first place.
You said: You cannot estimate entropy of arbitrary signals!
I said: I can, here is an example.
I gave a function that gives a
On 11/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
(I think we agree that the 'entropy' content of a
signal in the strict sense means the minimal number of bits that
can be used to represent it).
... and that always depends on how we're representing signals. Are we
using
On 11/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
In essence, this is another way of saying: constant parts do not add
anything to the entropy; the entropy is contained in the transitions. (*)
(*) ...this is not strictly true, the length of the constant part also
contains some
Academic person: We cannot _precisely_ calculate entropy, because we
cannot know and calculate the entire timeline of the universe!
Practical person: Hmm... What if we tried to roughly estimate entropy
with a simple formula instead...
Here's another, practical way of estimating binary (Shannon) entropy
content of an arbitrary digital signal:
Compress it with PKZIP, and check the resulting file size in bits.
Compression ratio will inversely correlate with the Shannon entropy
content of the signal, low entropy signals being
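A minimal sketch of that in Python (using zlib, which is DEFLATE-based like PKZIP; my own illustration, and only a crude upper bound, since a general-purpose compressor is not an optimal model of the data):

import zlib, os

def compressed_bits_per_byte(data):
    # Rough entropy estimate: bits of DEFLATE output per byte of input.
    return 8.0 * len(zlib.compress(data, 9)) / len(data)

print(compressed_bits_per_byte(b'\x00' * 4096))    # low-entropy input: compresses well
print(compressed_bits_per_byte(os.urandom(4096)))  # high-entropy input: barely compresses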
On 11/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
32A
16A16B
8A8B8A8B
4A4B4A4B4A4B4A4B
2A2B2A2B2A2B2A2B2A2B2A2B2A2B2A2B
1A1B1A1B1A1B1A1B1A1B1A1B1A1B1A1B1A1B1A1B1A1B1A1B1A1B1A1B1A1B1A1B
Probably I should rather write that in binary form instead of decimal:
10A
1A1B
On 11/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
On 10/10/2014, Ethan Duni ethan.d...@gmail.com wrote:
Your entropy estimator does not estimate Shannon entropy.
Maybe a better formula would be...
number of binary transitions, plus sum of the log2 of the length of
constant parts?
Feel
On 11/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
Maybe a better formula would be...
number of binary transitions, plus sum of the log2 of the length of
constant parts?
Let's test this formula on the original data:
- 0 + 5 = 6
On 11/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
It is 31 instead of 32 because there's a discontinuity at the edge (I
see no trivial way of fixing that, other than maybe just add +1 to all
values).
... or maybe wrap around and add +1 if first and last bit differ
(dunno if that makes
What I essentially showed using the compression example, is that in a
digital binary sequence of digits, each 'constant 0' or 'constant 1'
segment of length N can be represented in log2(N) number of bits,
because for me it is enough to represent the length of the segment to
fully reconstruct it.
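In code, that run-length view might look something like this (my own Python sketch; it is not a real coder, since a decodable bitstream would also have to spend bits delimiting the lengths themselves):

from itertools import groupby
from math import log2

def run_length_cost_bits(bits):
    # Approximate cost of a 0/1 string as the sum of log2(run length) over its
    # constant runs, charging at least 1 bit per run.
    runs = (len(list(g)) for _, g in groupby(bits))
    return sum(max(1.0, log2(n)) for n in runs)

print(run_length_cost_bits('0' * 32))    # one long run: cheap (5 bits)
print(run_length_cost_bits('01' * 16))   # a run per bit: expensive (32 bits)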
On 11/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
Don't forget that the central point of Shannon entropy is: how many
bits do we (minimally) need to represent this.
Which is (on some level) essentially a 'data compression' problem -
the Shannon entropy is the length of the output
On 11/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
On 11/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
Don't forget that the central point of Shannon entropy is: how many
bits do we (minimally) need to represent this.
Which is (on some level) essentially a 'data compression
Are you suggesting I should unsubscribe from this mailing list?
If you're not interested in the topic, let me ask, why are you
subscribed to this list?
On 11/10/2014, Paul Stoffregen p...@pjrc.com wrote:
Peter, since roughly this time 5 days ago, you've posted 61 public
messages here.
Maybe
On 11/10/2014, Paul Stoffregen p...@pjrc.com wrote:
Maybe it's time to give it a rest? Or if not, perhaps your point
(whatever that may be) could be made with only 1 or 2 messages per day?
Please?!
Maybe it's time to switch your mailing list subscription to 'daily digest'?
Please?!
You
On 11/10/2014, Richard Dobson richarddob...@blueyonder.co.uk wrote:
I can't easily tell, when each says they know
what they are talking about but the other doesn't, whom to believe more,
and I have to toss a coin, or assume both are right
Both of us say: we cannot _precisely_ estimate entropy
On 11/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
On 11/10/2014, Richard Dobson richarddob...@blueyonder.co.uk wrote:
I can't easily tell, when each says they know
what they are talking about but the other doesn't, whom to believe more,
and I have to toss a coin, or assume both
On 11/10/2014, r...@audioimagination.com r...@audioimagination.com wrote:
all decompression is, is decoding. you have tokens (usually binary bits or
a collection of bits) and a code book (this is something that you need to
understand regarding Huffman or entropy coding), you take the token and
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
What I'm trying to find out is:
- What is the entropy distribution (information distribution) of the
message?
- Where _exactly_ is the entropy (information) located in the message?
- Could that entropy be extracted or estimated
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
To me, this message can be clearly separated into three distinct parts:
000 - almost no information, all zeros
1001011001101 - lots of information (lots of entropy)
0 - almost
On 12/10/2014, Richard Wentk rich...@wentk.com wrote:
Relying on human pattern recognition skills to say 'oh look, here's a
repeating bit pattern' says nothing useful about Shannon entropy.
The whole point of Shannon analysis is that it's explicit, completely
defined, robust, and
So, for more clarity, my algorithm would segment the following bit pattern
00010010110011010
...into this:
000 --- log2(27) = ~4.754
1 --- 1
00 --- 1
1 --- 1
0 --- 1
11 --- 1
00 --- 1
11 --- 1
0 --- 1
1 --- 1
On 12/10/2014, Richard Wentk rich...@wentk.com wrote:
Yes, great. Now how many bits does a noisy channel need to flip before your
scheme produces gibberish?
Those flipped noise bits add entropy to the message, precisely.
Which my algorithm detects, correctly, since your noise is an entropy
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
On 12/10/2014, Richard Wentk rich...@wentk.com wrote:
Yes, great. Now how many bits does a noisy channel need to flip before
your
scheme produces gibberish?
Those flipped noise bits add entropy to the message, precisely.
Which my
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
Your earlier algorithm just segments bitstrings. It doesn't tell you
how to assemble those segments back into a code which can be understood
unambiguously by any receiver.
There is no way that could be possible.
And I never claimed that. Maybe
Well, if you prefer, you can call my algorithm 'randomness estimator'
or 'noise estimator' instead. Personally I prefer to call it 'entropy
estimator', because the more random a message is, the more information
(=entropy) it contains.
I fail to see why you guys don't realize this trivial
Also randomness correlates with surprise, so if you treat entropy
as how likely are we to get surprises, then randomness correlates
with entropy.
But this is just another way of saying a more random message contains
more information (=entropy).
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
Define random.
As I said, my randomness estimation metric is: the number of binary state
transitions.
It is a very good indicator of randomness; feel free to test it on
real-world data or pseudorandom number generators.
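For whoever wants to try it on their own data, a minimal Python sketch of that metric (my own wording of it): the fraction of adjacent bit pairs that differ, which is about 0.5 for fair coin flips and 0 for a constant stream.

import random

def transition_rate(bits):
    # Fraction of adjacent bit pairs that differ.
    changes = sum(a != b for a, b in zip(bits, bits[1:]))
    return changes / max(1, len(bits) - 1)

coin = [random.getrandbits(1) for _ in range(10000)]
print(transition_rate(coin))             # around 0.5
print(transition_rate([0] * 10000))      # 0.0
print(transition_rate([0, 1] * 5000))    # 1.0 -- maxes out on a strictly alternating pattern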
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
Your message is lost in those fifty self-reflective, little posts of
yours. Which is precisely why you were already told to dial it back a
bit. I'd also urge you to take up that basic information theory textbook
I already linked for you, shut
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
So by your metric a fully deterministic, binary source which always
changes state has the maximum entropy?
010101010101010101010101...
How do you know that that signal is 'fully deterministic', and not a
result of coin flips?
As for entropy
On 12/10/2014, Paul Stoffregen p...@pjrc.com wrote:
As long as you produce only chatter on mail lists, but no working
implementation, I really don't think there's much cause for anyone to be
concerned.
I have several working implementations, and I'll post one if you're a
bit patient.
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
On 12/10/2014, Paul Stoffregen p...@pjrc.com wrote:
As long as you produce only chatter on mail lists, but no working
implementation, I really don't think there's much cause for anyone to be
concerned.
I have several working
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
On 2014-10-12, Peter S wrote:
010101010101010101010101...
How do you know that that signal is 'fully deterministic', and not a
result of coin flips?
Because I assumed it to be. Let's say I sent it to you and just made
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
When you're trying to approximate entropy of some arbitrary signal,
there is no such context.
... say, when you're a cryptographer, and want to decide if a certain
stream of bits would be safe enough to protect your bank account
access
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
When you're trying to approximate entropy of some arbitrary signal,
there is no such context.
... say, when you're a cryptographer, and want to decide if a certain
stream
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
Very much my point: Shannon's definition of information is fully immune
to ROT13. Yours is not.
Correction: no 'information theory' model was proposed, and no form of
'immunity' was claimed. My algorithm is just an approximation, and
even a very
...and let me point out that you admitted yourself that you have no clue
about the topic:
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
As for entropy estimators, [...] I too once thought
that I had a hang of it, purely by intuition, but fuck no; the live
researchers at cryptography -list taught me
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
Correction: no 'information theory' model was proposed, and no form of
'immunity' was claimed.
What was claimed:
number of binary transitions _correlates_ with entropy (statistically)
Was NOT claimed:
number of binary transitions
On 12/10/2014, Theo Verelst theo...@theover.org wrote:
But the measure of entropy is still a statistical measure, based on a
distribution which is a *given* prob. dist., i.e. either *you* are
saying something with it by having one or more possible 'givens' that
you every time don't make
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
On 2014-10-12, Peter S wrote:
Rather, please go and read some cryptography papers about entropy
estimation. Then come back, and we can talk further.
PLONK.
This could be a good start for you:
https://en.wikipedia.org/wiki/Entropy_estimation
For advanced topics, also feel free to consult:
Marek Lesniewicz (2014), "Expected Entropy as a Measure and Criterion
of Randomness of Binary Sequences", Przeglad Elektrotechniczny,
Volume 90, pp. 42–46.
Dinh-Tuan Pham (2004), Fast algorithms for mutual information based
independent
Again, to understand what 'entropy estimation' is, please feel free to
consult the scientific literature list I posted.
If your mind is poisoned by academic books so much that you're unable
to think 'outside the box' any longer, then you will never grasp this
concept.
On 13/10/2014, r...@audioimagination.com r...@audioimagination.com wrote:
What was claimed:
number of binary transitions _correlates_ with entropy (statistically)
it's a mistaken claim, Peter. in case you hadn't gotten it, you're getting
a bit outa your league. there are some very sharp and
Let's imagine that I'm a banker, and you open a savings account in my
bank, to put your life savings into my bank. So you send your life
savings to me, say you send me $1,000,000.
When you want to access your money, we communicate via messages. You
have a secret password, and you send it
...which implies that the Shannon entropy problem is - on one level -
a guessing game problem - What is the minimal amount of 'yes/no'
questions to guess your data with 100% probability?
The more random your data is, the harder it is to guess.
On 13/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
When we try to estimate the entropy of your secret, we cannot
_precisely_ know it. (*)
(*) ... except when none of your symbols are correlated, for example
when they come from a cryptographically secure random number
generator
On 14/10/2014, ro...@khitchdee.com ro...@khitchdee.com wrote:
Peter,
How would you characterize the impact of your posts on the entropy of this
mailing list, starting with the symbol space that gets defined by the
different perspectives on entropy :-)
I merely showed that:
1) 'entropy'
Again, the minimal number of 'yes/no' questions needed to guess your
message with 100% probability is _precisely_ the Shannon entropy of
the message:
For the case of equal probabilities (i.e. each message is equally
probable), the Shannon entropy (in bits) is just the number of yes/no
questions
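As a small sanity check (my own sketch), a binary-search guessing game over N equally probable messages needs ceil(log2(N)) yes/no questions in the worst case, which matches the log2(N) bits of the equiprobable case:

from math import ceil, log2

def questions_needed(num_messages, target):
    # Guess `target` in range(num_messages) with yes/no questions of the form "is it < mid?"
    lo, hi, asked = 0, num_messages, 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        asked += 1
        if target < mid:
            hi = mid
        else:
            lo = mid
    return asked

N = 256
print(max(questions_needed(N, t) for t in range(N)), ceil(log2(N)))  # 8 8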
Another way of expressing what my algorithm does: it estimates
'decorrelation' in the message by doing a simple first-order
approximation of decorrelation between bits. The more random a
message is, the more decorrelated its bits are. Otherwise, if the
bits are correlated, that is not random and
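To make the 'first-order decorrelation' wording concrete (again, just my own sketch), one can also look at the lag-1 correlation of the bit sequence; it sits near zero for independent coin flips and near +/-1 for strongly patterned data, and it carries essentially the same information as the transition count above.

import random

def lag1_correlation(bits):
    # Pearson correlation between the bit sequence and a copy of itself shifted by one.
    x, y = bits[:-1], bits[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    var = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5 / n
    return cov / var if var else 0.0

coin = [random.getrandbits(1) for _ in range(10000)]
print(lag1_correlation(coin))           # near 0: decorrelated, looks random
print(lag1_correlation([0, 1] * 5000))  # close to -1: perfectly patterned, not random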
So, instead of academic hocus-pocus and arguing about formalisms, what
I'm rather concerned about is:
- What are the real-world implications of the Shannon entropy problem?
- How could we possibly use this to categorize arbitrary data?
On 14/10/2014, Max Little max.a.lit...@gmail.com wrote:
P.S. Any chance people could go offline for this thread now please?
It's really jamming up my inbox and I don't want to unsubscribe ...
Any chance your mailbox has the possibility of setting up a filter
that moves messages with the
On 14/10/2014, Max Little max.a.lit...@gmail.com wrote:
Some might find it amusing and relevant to this discussion that the
'Hartley entropy' H_0 is defined as the base 2 log of the cardinality
of the sample space of the random variable ...
Which implies that if the symbol space is binary (0
Which is another way of saying: a fully decorrelated sequence of bits
has the maximum amount of entropy.
So if we try to estimate the 'decorrelation' (randomness) in the
signal, then we can estimate 'entropy'.
On 14/10/2014, Max Little max.a.lit...@gmail.com wrote:
Well, it just says that there is a measure of information for which
the actual distribution of symbols is (effectively) irrelevant. Which
is interesting in its own right ...
Feel free to think outside the box.
Welcome to the real world,
On 14/10/2014, Sampo Syreeni de...@iki.fi wrote:
We do know this stuff. We already took the red pill, *ages* ago. Peter's
problem appears to be that he's hesitant to take the plunge into the
math, proper. Starting with the basics
Didn't you recently tell us that you have no clue of 'entropy