Excellent! Thanks. It's not clear to me why I get so confused. Every time I 
think about uncertainty and information, I have to Google concepts like entropy 
and negentropy and re-orient myself. I suppose I just don't do enough hands-on 
work with it to develop a tacit memory.

On 7/20/20 5:18 PM, David Eric Smith wrote:
> But if one did want to keep track of signs, I think in several sentences 
> below where Glen is talking about the presence of limitations’ reducing the 
> allowed variability in some distribution, we could say we use one or another 
> _entropy_ measure to quantify the reduction in likely variability.  To the 
> extent that one tries to characterize _information_ as Shannon did — a 
> measure of how much ambiguity in a sample is reduced by having some bit of 
> knowledge that rules out variations — then the reductions in entropy of the 
> constrained ensemble relative to its prior would be called a gain of 
> information in moving to the posterior from the prior.  So without worrying 
> about the zero-point for either of these measures, or their resulting 
> absolute signs, in many settings one would talk of the change of 
> information’s being positive when the change of the corresponding entropy is 
> negative.
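
The bookkeeping Eric describes — entropy down in the constrained ensemble, information up in moving from prior to posterior — can be made concrete with a toy calculation. This is just an illustrative sketch (the uniform distributions and the halving constraint are my own example, not anything from the thread):

```python
import math

def entropy_bits(dist):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Prior: uniform over 8 equally likely outcomes -> 3 bits of entropy.
prior = [1/8] * 8

# A constraint ("some bit of knowledge that rules out variations")
# eliminates half the outcomes; the posterior is uniform over the
# remaining 4 -> 2 bits of entropy.
posterior = [1/4] * 4 + [0] * 4

# Change of entropy is -1 bit; change of information is +1 bit.
info_gain = entropy_bits(prior) - entropy_bits(posterior)
print(info_gain)  # 1.0
```

The zero-point issue Eric mentions shows up here too: only the *difference* of the two entropies is meaningful as an information gain; the absolute values depend on how the ensemble is defined.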

-- 
↙↙↙ uǝlƃ

- .... . -..-. . -. -.. -..-. .. ... -..-. .... . .-. .
FRIAM Applied Complexity Group listserv
Zoom Fridays 9:30a-12p Mtn GMT-6  bit.ly/virtualfriam
un/subscribe http://redfish.com/mailman/listinfo/friam_redfish.com
archives: http://friam.471366.n2.nabble.com/
FRIAM-COMIC http://friam-comic.blogspot.com/ 