Hi Ronni,
I've been traveling and was unable to say much about the book at the
time. The book seems to be an original work; I think it's a little
strange that most of the references are the original works about entropy,
fractal dimension, and the like. I would have liked to see some
Hi Charles, my idea in using Shannon's entropy is to measure
self-generated songs.
For example, if you have a patch that generates sound structures using
generative rules, it would be nice to measure that sound structure and
use that measurement to evolve the rules that generate that sound
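A minimal sketch (in plain Python rather than Pd, since nothing in the thread fixes an implementation) of how the entropy of one block of samples could be estimated; the function name and the 0.02 quantization step are illustrative assumptions, not anything Ronni specifies:

```python
import math
from collections import Counter

def shannon_entropy(samples, step=0.02):
    """Estimate Shannon entropy (in bits) of a block of samples.

    Samples are quantized to multiples of `step` so the histogram
    is taken over a discrete set of symbols.
    """
    symbols = [round(s / step) for s in samples]
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A constant block gives 0 bits; a block where every quantized value is distinct gives log2(block length) bits, the maximum.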
On Sat, Mar 2, 2013 at 12:28 PM, ronni montoya ronni.mont...@gmail.com wrote:
In this case entropy varies with time, and what I am interested in are
the entropy trajectories.
You can plot these trajectories and compare trajectories from
different songs.
More complex sound
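The trajectory idea can be sketched as entropy computed per successive block, one point per block; the block size and helper names below are assumptions for illustration only:

```python
import math
from collections import Counter

def block_entropy(samples, step=0.02):
    # Quantize to discrete symbols, then apply the Shannon formula.
    counts = Counter(round(s / step) for s in samples)
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def entropy_trajectory(signal, block_size=1024):
    """Entropy of successive non-overlapping blocks: one value per block."""
    return [block_entropy(signal[i:i + block_size])
            for i in range(0, len(signal) - block_size + 1, block_size)]
```

Two songs could then be compared by plotting their trajectories side by side or by any distance between the resulting sequences.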
Hi, why is it not possible? Instead of analysing the real-time value of
the signal, maybe I can have a memory or buffer that stores a
piece of the signal (a group of samples) from time to time and then
analyze that group of values.
Maybe it can convert that group of values into a string and then:
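Ronni's buffer-to-string idea could look like the following sketch; the mapping of quantized values onto 26 letters is a lossy, made-up alphabet chosen only to make the symbols printable:

```python
import math
from collections import Counter

def to_symbol_string(samples, step=0.02):
    # Quantize each sample and fold it into the letters A-Z.
    # The modulo makes the alphabet small but can merge distant values.
    return ''.join(chr(65 + (round(s / step) % 26)) for s in samples)

def string_entropy(text):
    # Per-character Shannon entropy (bits) of the symbol string.
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

Once the block is a string, any string-based measure (entropy, compression length, n-gram statistics) can be applied to it.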
Why not do an FFT and measure the variance of the channels?
For instance, white noise has maximum entropy and all the bins of its FFT
will be more or less the same, while a sine wave has low entropy and one
bin will be much larger than the others.
Martin
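Martin's observation can be checked numerically. The sketch below uses a naive DFT (to stay dependency-free) and a peak-to-mean ratio as a stand-in for "variance of the channels"; signal lengths and the ratio statistic are my own choices, not Martin's:

```python
import cmath
import math
import random

def dft_mag2(x):
    # Naive DFT squared magnitudes over the first half of the bins.
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 for k in range(n // 2)]

def peak_to_mean(bins):
    # Large for a peaky (sine-like) spectrum, small for a flat (noisy) one.
    return max(bins) / (sum(bins) / len(bins))

random.seed(1)
n = 256
noise = [random.gauss(0.0, 1.0) for _ in range(n)]
sine = [math.sin(2 * math.pi * 16 * t / n) for t in range(n)]

ratio_sine = peak_to_mean(dft_mag2(sine))    # one dominant bin
ratio_noise = peak_to_mean(dft_mag2(noise))  # roughly flat bins
```

The sine's ratio is close to the number of bins (all energy in one bin), while the noise's stays near the expected maximum of a flat exponential spectrum.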
On Wed, Feb 27, 2013 at 7:40 AM, ronni montoya ronni.mont...@gmail.com wrote:
Hi, why is it not possible?
What I mean is using floating-point numbers as an approximation of real
numbers. We have a finite number of samples, so it's impossible to work
with continuous distributions, except by
If you took the FFT squared magnitude, perfectly noisy data should have a
chi-squared distribution in each bin (I think). If you assumed that model
and calculated the parameters of the distribution on each block, you'd find
out how much information is in each of those peaks relative to the
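The chi-squared guess can be sanity-checked by simulation: for Gaussian white noise, the squared magnitude of a DFT bin is chi-squared with 2 degrees of freedom, i.e. exponential, so its variance should be about the square of its mean. The bin index, trial count, and block length below are arbitrary test parameters:

```python
import cmath
import math
import random

random.seed(0)
n = 128       # block length
trials = 400  # number of independent noise blocks
k = 5         # which DFT bin to observe

# Collect the squared magnitude of bin k over many noise blocks.
vals = []
for _ in range(trials):
    x = [random.gauss(0.0, 1.0) for _ in range(n)]
    X = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
    vals.append(abs(X) ** 2)

mean = sum(vals) / trials
var = sum((v - mean) ** 2 for v in vals) / trials
# For an exponential (chi-squared with 2 dof) law, variance ≈ mean².
ratio = var / mean ** 2
```

A ratio near 1 is consistent with the exponential model; a sine buried in the noise would inflate the mean of its bin without matching that variance relation.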
Hi, I was wondering if anybody has implemented the Shannon entropy
function in Pd?
Has anybody tried measuring the entropy of a signal?
Cheers,
R.
___
Pd-list@iem.at mailing list
UNSUBSCRIBE and account-management -
Hi Ronni,
How do you mean to do it?
Shannon entropy is not an independent measurement; the information in an
observation is relative to the distribution of all its possible values.
If I just take one sample and it's evenly distributed between -0.98 and 1,
and it's quantized in 0.02 increments (to
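Charles's sentence is cut off in the archive, but assuming it was heading toward counting the quantization levels, the arithmetic works out as follows: -0.98 to 1.00 in 0.02 steps gives 100 equally likely values, so one such sample carries log2(100), about 6.64 bits.

```python
import math

# Levels from -0.98 to 1.00 inclusive, in steps of 0.02:
# (1.00 - (-0.98)) / 0.02 + 1 = 100 equally likely values.
levels = round((1.00 - (-0.98)) / 0.02) + 1
bits = math.log2(levels)  # entropy of one uniformly distributed sample
```

This is the maximum for that quantization; any non-uniform distribution over the same 100 levels carries fewer bits per sample.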