On 7/18/15 10:10 PM, Peter S wrote:
It follows from the above that Shannon's entropy model is a
simplified, idealized model of information: it pretends that
algorithms have zero length and thus no entropy, and that you can
magically share a codebook telepathically without actually
transmitting it.


Shannon makes no issue of that. If you need the codebook at the other end (because you don't already have it), you need to send it. It's not that big, and I have seen it done in some compressed files. I am pretty sure .zip files (at least the old ones, which were mostly Huffman coded but also had some run-length encoding in the model) carry the codebook in the file preamble.
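To make the "send the codebook in the preamble" point concrete, here is a minimal sketch in Python. This is not the actual .zip/DEFLATE format; it just builds a Huffman code from symbol frequencies, serializes the code table as a length-prefixed JSON preamble, and appends the coded bits (kept as ASCII '0'/'1' characters for readability rather than packed into bytes):

```python
import heapq
import json
from collections import Counter

def huffman_code(data: bytes) -> dict:
    """Build a Huffman code table {symbol: bitstring} from symbol frequencies."""
    freq = Counter(data)
    if len(freq) == 1:
        # degenerate case: a single distinct symbol gets a 1-bit code
        return {next(iter(freq)): "0"}
    # heap entries: (frequency, tiebreak, {symbol: code_so_far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        # merging two subtrees prepends a branch bit to every code inside them
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def compress(data: bytes) -> bytes:
    """Emit the codebook as a length-prefixed JSON preamble, then the bitstream."""
    table = huffman_code(data)
    bits = "".join(table[b] for b in data)
    preamble = json.dumps({str(k): v for k, v in table.items()}).encode()
    # length-prefix the preamble so a decoder knows where the bits start
    return len(preamble).to_bytes(4, "big") + preamble + bits.encode()
```

The point is simply that the codebook's cost is a fixed preamble: for long messages it amortizes to nearly nothing per symbol, which is why Shannon's asymptotic rate ignores it.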

Even so, Shannon information theory is sorta static. It does not deal with the kind of redundancy you get from a repeated symbol or string, just with the probability of occurrence of messages (measured from the relative frequency of occurrence). Run-length coding isn't in there. Maybe there is something in Shannon information theory about measuring information with conditional probability (which might capture some of what LPC can do to compress data).
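A tiny illustration of that point: first-order (memoryless) Shannon entropy is computed from symbol frequencies alone, so two strings with identical symbol counts score the same even when one is all long runs that run-length coding would crush. A sketch:

```python
import math
from collections import Counter
from itertools import groupby

def first_order_entropy(s: str) -> float:
    """Shannon entropy in bits/symbol, from relative frequencies alone."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

def rle(s: str):
    """Run-length encoding: list of (symbol, run length) pairs."""
    return [(sym, len(list(g))) for sym, g in groupby(s)]

a = "ababababab"   # alternating symbols
b = "aaaaabbbbb"   # two long runs
# same symbol statistics, so identical first-order entropy (1 bit/symbol)...
assert first_order_entropy(a) == first_order_entropy(b) == 1.0
# ...but RLE sees completely different structure
print(rle(b))  # [('a', 5), ('b', 5)]
```

Conditioning on context (entropy rate of the source, rather than of isolated symbols) is where Shannon's framework does account for this kind of structure; the memoryless model above is what ignores it.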

--

r b-j                  r...@audioimagination.com

"Imagination is more important than knowledge."



--
dupswapdrop -- the music-dsp mailing list and website:
subscription info, FAQ, source code archive, list archive, book reviews, dsp 
links
http://music.columbia.edu/cmc/music-dsp
http://music.columbia.edu/mailman/listinfo/music-dsp
