On 20/07/2015, Peter S <peter.schoffhau...@gmail.com> wrote:
>
> If we extend this idea further and include not just the minimum size
> to which an ideal data compressor can shrink the data, but also the
> size of the decoder program, and forget about the probabilities, then
> we get the algorithmic entropy (Kolmogorov complexity) C(c).
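
That "compressor + decoder" view is easy to play with in practice.
Here is a minimal Python sketch (my illustration, not from the quoted
text): a real compressor such as zlib only gives an *upper bound* on
C(c), since C itself is uncomputable, and a fully self-contained bound
would also count the constant size of the decoder program.

    import os
    import zlib

    def compressed_size_bound(data: bytes) -> int:
        # Size of the zlib-compressed payload: an upper bound (up to
        # the constant-size decoder) on the algorithmic entropy of
        # `data`.
        return len(zlib.compress(data, 9))

    regular = b"ab" * 2000      # highly regular: compresses to almost nothing
    random_ = os.urandom(4000)  # typically incompressible
    print(compressed_size_bound(regular))  # small
    print(compressed_size_bound(random_))  # close to 4000 plus overhead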

"Kolmogorov complexity is a modern notion of randomness dealing with
the quantity of information in individual objects; that is, pointwise
randomness rather than average randomness as produced by a random
source. It was proposed by A.N. Kolmogorov in 1965 to quantify the
randomness of individual objects in an objective and absolute manner.
This is impossible by classical probability theory (a branch of
measure theory satisfying the so-called Kolmogorov axioms formulated
in 1933). Kolmogorov complexity is known variously as `algorithmic
information', `algorithmic entropy', `Kolmogorov-Chaitin complexity',
`descriptional complexity', `shortest program length', `algorithmic
randomness', and others."
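
In symbols (my paraphrase of the standard definition, not a quote from
the page): fix a universal Turing machine U; then

    C(x) = \min \{ \ell(p) : U(p) = x \}

i.e. the length in bits \ell(p) of a shortest program p that makes U
output x. The invariance theorem says that switching to a different
universal machine changes C(x) by at most an additive constant, which
is what makes the measure "objective and absolute" up to O(1).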

"The Kolmogorov complexity of an object is a form of absolute
information of the individual object. This is not possible to do by
C.E. Shannon's information theory. Unlike Kolmogorov complexity,
information theory is only concerned with the average information of a
random source."
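
For contrast, Shannon's entropy of a source X with distribution p is

    H(X) = -\sum_x p(x) \log_2 p(x)

which is an average over the ensemble and says nothing about any single
outcome. The bridge between the two (hedging: this is the standard
result for computable p, usually stated with prefix complexity) is that
the expected complexity matches the entropy up to an additive constant:

    \sum_x p(x) C(x) = H(X) + O(1)

So Shannon entropy is recovered as the average of the pointwise measure.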

"`Kolmogorov' complexity was earlier proposed by Ray Solomonoff in a
Zator Technical Report in 1960 and in a long paper in Information and
Control in 1964. Also Gregory Chaitin proposed this idea in a paper in
J. ACM in 1969. This paper continues a paper by G. Chaitin in J.ACM in
1966, which however was concerned with a complexity measure based on
Turing machine state-symbol product following ideas of C.E. Shannon.
Unlike Kolmogorov complexity, those complexity measures are not
objective and absolute (recursively invariant)."

"The incompressibility method and Kolmogorov complexity is truly a
versatile mathematical tool. It is a sharper relative of classical
information theory (absolute information of individual object rather
than average information over a random ensemble) and yet satisfies
many of the laws of classical information theory---although with a
slight error term."

Source: http://homepages.cwi.nl/~paulv/kolmogorov.html
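
To make the "slight error term" concrete (my example, using standard
results rather than anything on that page): Shannon's chain rule
H(X,Y) = H(X) + H(Y|X) holds exactly, while its algorithmic analogue,
the Kolmogorov-Levin symmetry of information, holds only up to a
logarithmic additive term:

    C(x,y) = C(x) + C(y|x) + O(\log C(x,y))

Same law, pointwise instead of on average, at the price of that O(log)
error.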

Seems I'm not the first in the history of mankind to consider a measure
of the information in an arbitrary piece of data that is not a function
of a probability distribution; others arrived at the same idea about
half a century ago.

-P