On 11/10/2014, r...@audioimagination.com wrote:
all decompression is, is decoding. you have tokens (usually binary bits or
a collection of bits) and a code book (this is something that you need to
understand regarding Huffman or entropy coding), you take the token and
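For readers following along, here is a minimal sketch (not from the thread; the code book and message are made-up examples) of what decoding a token stream against a prefix code book looks like, in Python:

    # Minimal prefix-code (Huffman-style) decoder: grow a token bit by bit
    # until it matches an entry in the code book, emit the symbol, repeat.
    code_book = {'0': 'A', '10': 'B', '110': 'C', '111': 'D'}  # hypothetical code

    def decode(bits, book):
        out, token = [], ''
        for b in bits:
            token += b
            if token in book:        # token matches a codeword
                out.append(book[token])
                token = ''           # start collecting the next token
        if token:
            raise ValueError('trailing bits do not form a complete codeword')
        return out

    print(decode('0101100111', code_book))   # ['A', 'B', 'C', 'A', 'D']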
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
What I'm trying to find out is:
- What is the entropy distribution (information distribution) of the
message?
- Where _exactly_ is the entropy (information) located in the message?
- Could that entropy be extracted or estimated
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
To me, this message can be clearly separated into three distinct parts:
000 - almost no information, all zeros
1001011001101 - lots of information (lots of entropy)
0 - almost no information
On 12/10/2014 11:31, Peter S wrote:
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
To me, this message can be clearly separated into three distinct parts:
000 - almost no information, all zeros
1001011001101 - lots of information (lots of entropy)
None at all, because Shannon only makes sense if you define your symbols first,
or define the explicit algorithm used to specify symbols.
Relying on human pattern recognition skills to say 'oh look, here's a repeating
bit pattern' says nothing useful about Shannon entropy.
The whole point of Shannon analysis is that it's explicit, completely defined, robust, and
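To make Richard's point concrete: the same bit string gives different first-order entropy estimates depending on how you define the symbols. A minimal Python sketch (the parsing into 1-bit vs. 2-bit symbols is an assumed example, not anything proposed in the thread):

    from collections import Counter
    from math import log2

    def h_per_symbol(seq):
        # First-order (memoryless) entropy estimate, in bits per symbol.
        counts = Counter(seq)
        n = len(seq)
        return -sum(c / n * log2(c / n) for c in counts.values())

    msg = '00010010110011010'
    bits  = list(msg)                                        # 1-bit symbols
    pairs = [msg[i:i+2] for i in range(0, len(msg) - 1, 2)]  # 2-bit symbols (odd last bit dropped)

    print(h_per_symbol(bits))    # one number per bit...
    print(h_per_symbol(pairs))   # ...a different number per pair: different alphabet, different model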
This is covered by Quine and Shannon, although I cannot cite you chapter and
verse.
Basically, you are correct. A message alone is only half of the story: its
entropy is only defined relative to some pre-agreed decoder matrix that sets
the lowest-entropy baseline. (Contrast with chemical entropy, which has a
fixed baseline set by the physics of
On 12/10/2014, Richard Wentk rich...@wentk.com wrote:
Relying on human pattern recognition skills to say 'oh look, here's a
repeating bit pattern' says nothing useful about Shannon entropy.
The whole point of Shannon analysis is that it's explicit, completely
defined, robust, and
So, for more clarity, my algorithm would segment the following bit pattern
00010010110011010
...into this:
000 --- log2(27) = ~4.754
1 --- 1
00 --- 1
1 --- 1
0 --- 1
11 --- 1
00 --- 1
11 --- 1
0 --- 1
1 --- 1
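The segmentation above splits the string into maximal runs of identical bits. A minimal Python sketch of that run segmentation (only the segmentation; the per-segment values Peter assigns are his own and are not reproduced here):

    from itertools import groupby

    def runs(bits):
        # Split a bit string into maximal runs of identical bits.
        return [''.join(g) for _, g in groupby(bits)]

    print(runs('00010010110011010'))
    # ['000', '1', '00', '1', '0', '11', '00', '11', '0', '1', '0']

Note that the number of runs minus one is exactly the "number of binary state transitions" metric that comes up later in the thread.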
On 2014-10-12, Peter S wrote:
To demonstrate this through an example: could you point out _where_ the
most information is located in the following message?
00010010110011010
There is absolutely no way of knowing that unless you
On 10/12/2014 04:36 AM, Peter S wrote:
So, for more clarity, my algorithm would segment the following bit pattern
Perhaps for better clarity, you could provide a reference implementation
in C, C++, Python or any other widely used programming language?
On 2014-10-12, Rohit Agarwal wrote:
You need to show an exclusive 1:1 mapping between your source symbol
space and your encoded symbol space. Then you can determine output
bitrate based on the probabilities of your source symbols and the
lengths of your encoded symbols. One way to do this is
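A small Python sketch of Rohit's point, with a hypothetical source distribution and prefix code (neither is from the thread): the output bitrate is the expected codeword length, and the Shannon entropy of the source is its lower bound.

    from math import log2

    # Hypothetical source probabilities and an assumed prefix code for each symbol.
    p    = {'A': 0.5, 'B': 0.25, 'C': 0.125, 'D': 0.125}
    code = {'A': '0', 'B': '10', 'C': '110', 'D': '111'}

    expected_len = sum(p[s] * len(code[s]) for s in p)   # bits per symbol actually spent
    entropy      = -sum(p[s] * log2(p[s]) for s in p)    # Shannon lower bound

    print(expected_len, entropy)   # 1.75 1.75 -- this particular code meets the bound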
Yes, great. Now how many bits does a noisy channel need to flip before your
scheme produces gibberish?
Richard
On 12 Oct 2014, at 12:36, Peter S peter.schoffhauz...@gmail.com wrote:
So, for more clarity, my algorithm would segment the following bit pattern
On 12/10/2014, Richard Wentk rich...@wentk.com wrote:
Yes, great. Now how many bits does a noisy channel need to flip before your
scheme produces gibberish?
Those flipped noise bits add entropy to the message, precisely.
Which my algorithm detects, correctly, since your noise is an entropy
On 2014-10-12, Peter S wrote:
Those flipped noise bits add entropy to the message, precisely.
No, they do not, if they just follow the same statistics as the original
message.
Which my algorithm detects, correctly, since your noise is an entropy
source.
If your algorithm can detect any
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
On 12/10/2014, Richard Wentk rich...@wentk.com wrote:
Yes, great. Now how many bits does a noisy channel need to flip before
your
scheme produces gibberish?
Those flipped noise bits add entropy to the message, precisely.
Which my
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
Your earlier algorithm just segments bitstrings. It doesn't tell you
how to assemble those segments back into a code which can be understood
unambiguously by any receiver.
There is no way that could be possible.
And I never claimed that. Maybe
Well, if you prefer, you can call my algorithm 'randomness estimator'
or 'noise estimator' instead. Personally I prefer to call it 'entropy
estimator', because the more random a message is, the more information
(=entropy) it contains.
I fail to see why you guys don't realize this trivial
Also, randomness correlates with surprise, so if you treat entropy
as how likely we are to get surprises, then randomness correlates
with entropy.
But this is just another way of saying a more random message contains
more information (=entropy).
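"Surprise" has a literal reading here: the surprisal of an outcome with probability p is -log2(p) bits, and Shannon entropy is the expected surprisal. A minimal sketch (the probabilities are illustrative, not from the thread):

    from math import log2

    def surprisal(p):
        # Information content of a single outcome with probability p, in bits.
        return -log2(p)

    print(surprisal(0.5))    # 1.0   -- a fair coin flip
    print(surprisal(0.01))   # ~6.64 -- a 1-in-100 event is far more surprising

    # Entropy = expected surprisal over the whole distribution:
    probs = [0.5, 0.25, 0.25]
    print(sum(p * surprisal(p) for p in probs))   # 1.5 bits per symbol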
On 2014-10-12, Peter S wrote:
Well, if you prefer, you can call my algorithm 'randomness estimator'
or 'noise estimator' instead. Personally I prefer to call it 'entropy
estimator', because the more random a message is, the more information
(=entropy) it contains.
Define random.
I fail to
About the hidden information in presumed noise: *Given* that there is
a hidden generator of the noise (like a standard software pseudo-random
generator), or some other form of noise pattern, *THEN* you could try
to find it, and that may be as hard as cracking the key out of a 1024
On 2014-10-12, Peter S wrote:
Again, it seems my message was lost in translation somewhere...
Your message is lost in those fifty self-reflective, little posts of
yours. Which is precisely why you were already told to dial it back a
bit. I'd also urge you to take up that basic information
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
Define random.
As I said, my randomness estimation metric is the number of binary state
transitions.
It is a very good indicator of randomness; feel free to test it on
real-world data or pseudorandom number generators.
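Taken literally, the metric is easy to state in code. A minimal Python sketch (one reading of "number of binary state transitions", not Peter's actual implementation), together with the counterexample Sampo raises next: an alternating stream maximizes the count while being trivially predictable, which is why it can only be an indicator, not a measure, of entropy.

    def transitions(bits):
        # Count positions where consecutive bits differ.
        return sum(a != b for a, b in zip(bits, bits[1:]))

    print(transitions('00010010110011010'))          # 10
    print(transitions('010101010101010101010101'))   # 23 -- maximal, yet fully deterministic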
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
Your message is lost in those fifty self-reflective, little posts of
yours. Which is precisely why you were already told to dial it back a
bit. I'd also urge you to take up that basic information theory textbook
I already linked for you, shut
On 2014-10-12, Peter S wrote:
Define random.
As I said, my randomness estimation metric is the number of binary state
transitions.
It is a very good indicator of randomness; feel free to test it on
real-world data or pseudorandom number generators.
So by your metric a fully deterministic, binary source which always
changes state has the maximum entropy?
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
So by your metric a fully deterministic, binary source which always
changes state has the maximum entropy?
010101010101010101010101...
How do you know that that signal is 'fully deterministic', and not a
result of coin flips?
As for entropy
On 2014-10-12, Peter S wrote:
010101010101010101010101...
How do you know that that signal is 'fully deterministic', and not a
result of coin flips?
Because I assumed it to be. Let's say I sent it to you and just made it
into a deterministic generator. Without telling you how it was generated.
Where is the information, from *your* viewpoint?
On 12/10/2014, Paul Stoffregen p...@pjrc.com wrote:
As long as you produce only chatter on mail lists, but no working
implementation, I really don't think there's much cause for anyone to be
concerned.
I have several working implementations, and I'll post one if you're a
bit patient.
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
On 12/10/2014, Paul Stoffregen p...@pjrc.com wrote:
As long as you produce only chatter on mail lists, but no working
implementation, I really don't think there's much cause for anyone to be
concerned.
I have several working
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
On 2014-10-12, Peter S wrote:
010101010101010101010101...
How do you know that that signal is 'fully deterministic', and not a
result of coin flips?
Because I assumed it to be. Let's say I sent it to you and just made it
into a
On 2014-10-12, Peter S wrote:
Because I assumed it to be. Let's say I sent it to you and just made
it into a deterministic generator. Without telling you how it was
generated. Where is the information, from *your* viewpoint?
In what context? You sent me a long stream of '0101010101...' So?
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
When you're trying to approximate entropy of some arbitrary signal,
there is no such context.
... say, when you're a cryptographer, and want to decide if a certain
stream of bits would be safe enough to protect your bank account
access
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
When you're trying to approximate entropy of some arbitrary signal,
there is no such context.
... say, when you're a cryptographer, and want to decide if a certain
stream of
On 2014-10-12, Peter S wrote:
When you're trying to approximate entropy of some arbitrary signal,
there is no such context.
Of course there is. Each and every one of the classical, dynamically
updated probability models in text compression has one, too. The best
ones even have papers behind
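One classical example of such a dynamically updated model, sketched in Python (a Laplace/add-one adaptive order-0 model; the choice of model is an assumption here, not something specified in the thread): it assigns each symbol a probability from the counts seen so far, charges -log2(p) bits for it, and then updates the counts.

    from math import log2
    from collections import Counter

    def adaptive_order0_bits(msg, alphabet=('0', '1')):
        # Total code length in bits under an adaptive order-0 model with
        # add-one smoothing: counts start at 1 and are updated after each symbol.
        counts = Counter({s: 1 for s in alphabet})
        total, bits = len(alphabet), 0.0
        for s in msg:
            bits += -log2(counts[s] / total)   # cost of coding s under the current model
            counts[s] += 1                     # then let the model adapt
            total += 1
        return bits

    print(adaptive_order0_bits('00010010110011010'))   # an entropy estimate for the message, in bits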
Peter S wrote:
...
... say, when you're a cryptographer, and want to decide if a certain
stream of bits would be safe enough ...
But the measure of entropy is still a statistical measure, based on a
*given* probability distribution, i.e. either *you* are saying something
with it
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
Very much my point: Shannon's definition of information is fully immune
to ROT13. Yours is not.
Correction: no 'information theory' model was proposed, and no form of
'immunity' was claimed. My algorithm is just an approximation, and
even a very
...and let me point out that you yourself admitted you have no clue
about the topic:
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
As for entropy estimators, [...] I too once thought
that I had a hang of it, purely by intuition, but fuck no; the live
researchers at cryptography -list taught me
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
Correction: no 'information theory' model was proposed, and no form of
'immunity' was claimed.
What was claimed:
number of binary transitions _correlates_ with entropy (statistically)
Was NOT claimed:
number of binary transitions
On 2014-10-12, Peter S wrote:
Rather, please go and read some cryptography papers about entropy
estimation. Then come back, and we can talk further.
PLONK.
--
Sampo Syreeni, aka decoy - de...@iki.fi, http://decoy.iki.fi/front
+358-40-3255353, 025E D175 ABE5 027C 9494 EEB0 E090 8BA9 0509 85C2
On 12/10/2014, Theo Verelst theo...@theover.org wrote:
But the measure of entropy is still a statistical measure, based on a
*given* probability distribution, i.e. either *you* are saying something
with it by having one or more possible 'givens' that you each time don't
make
On 12/10/2014, Sampo Syreeni de...@iki.fi wrote:
On 2014-10-12, Peter S wrote:
Rather, please go and read some cryptography papers about entropy
estimation. Then come back, and we can talk further.
PLONK.
This could be a good start for you:
https://en.wikipedia.org/wiki/Entropy_estimation
For advanced topics, also feel free to consult:
Marek Lesniewicz (2014), "Expected Entropy as a Measure and Criterion of
Randomness of Binary Sequences", Przeglad Elektrotechniczny, Vol. 90,
pp. 42–46.
Dinh-Tuan Pham (2004), "Fast algorithms for mutual information based
independent component analysis"
I've been working on a MIDI Karplus-Strong synthesizer using a Microchip
dsPIC33F (priced around $5.50 for tiny quantities). The synthesizer is
polyphonic with 12 voices and has both pots and MIDI CC inputs to control
its timbre in real time. It also supports pitch bend. The code (all
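For anyone unfamiliar with the algorithm Scott names, here is a minimal single-voice Karplus-Strong sketch in Python (plain illustrative code, not Scott's dsPIC33F implementation; the decay constant and sample rate are assumptions):

    import random

    def karplus_strong(frequency, duration, sample_rate=44100, decay=0.996):
        # A noise-filled delay line, repeatedly averaged with its neighbour,
        # produces a decaying plucked-string tone.
        n = int(sample_rate / frequency)                      # delay length sets the pitch
        buf = [random.uniform(-1.0, 1.0) for _ in range(n)]   # burst of noise = the "pluck"
        out = []
        for i in range(int(duration * sample_rate)):
            s = buf[i % n]
            buf[i % n] = decay * 0.5 * (s + buf[(i + 1) % n])  # averaging low-pass + decay
            out.append(s)
        return out

    samples = karplus_strong(440.0, 1.0)   # one second of an A4 pluck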
Peter S peter.schoffhau...@gmail.com wrote:
On 12/10/2014, Peter S peter.schoffhau...@gmail.com wrote:
Correction: no 'information theory' model was proposed, and no form of
'immunity' was claimed.
What was claimed:
number of binary transitions _correlates_ with entropy
Sounds like a fun project, Scott.
One question though:
Sample rate is approximately 44.6 kHz.
What's with the non-standard sampling rate?
E
On Sun, Oct 12, 2014 at 5:25 PM, Scott Gravenhorst music.ma...@gte.net
wrote:
I've been working on a MIDI Karplus-Strong synthesizer using a Microchip