>I understand an informative signal as one which contains patterns, as
>opposed to randomly distributed numbers, e.g. noise. Therefore, I
>equate information with structure in the signal's distribution. However,
>Shannon equates information with entropy, which is maximal when each
>symbol in the signal is as likely as the next, i.e. a distribution
>with no `structure'. These views are contradictory.

Both are valid views, I think - it depends on what you are after.
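
To make the contrast concrete, here is a quick numerical sketch (mine,
purely for illustration; the two distributions below are made up):

import numpy as np

def entropy(p):
    # Shannon entropy in bits, with 0*log(0) taken as 0.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

uniform = np.ones(8) / 8                  # no structure at all
peaked = np.array([0.93] + [0.01] * 7)    # mass concentrated on one symbol

print(entropy(uniform))   # 3.0 bits: maximal for 8 symbols
print(entropy(peaked))    # ~0.56 bits: low entropy, yet clearly "patterned"

Shannon's measure is largest exactly where the intuitive "pattern" view
sees the least information, which is the tension you describe.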

Intuitively, I also associate information with the presence of a pattern.

For example, in my thesis I was interested in reconstructing the value
of a variable Y given the value of another variable X (both real-valued
and multidimensional). If p(Y|X=x) is flat, it does not constrain Y,
while if it has sharp peaks, I can establish a (multivalued) mapping
x -> Y.

So in that context I found it convenient to say that a density is
"informative" if its mass is concentrated around a low-dimensional
subset of its domain.
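
As a toy illustration of what I mean (the densities and numbers below
are invented for the example, not taken from my thesis):

import numpy as np

def gauss(y, mu, sigma):
    # Gaussian density evaluated on a grid.
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

y = np.linspace(0.0, 1.0, 1001)
dy = y[1] - y[0]

flat = np.ones_like(y)            # p(Y|X=x) flat on [0,1]: constrains Y not at all
peaked = 0.5 * gauss(y, 0.25, 0.01) + 0.5 * gauss(y, 0.75, 0.01)  # two sharp branches

# Probability mass within +/-0.03 of the two modes:
near_modes = (np.abs(y - 0.25) < 0.03) | (np.abs(y - 0.75) < 0.03)
print(np.sum(flat[near_modes] * dy))     # ~0.12: Y essentially unconstrained
print(np.sum(peaked[near_modes] * dy))   # ~1.0: Y pinned to y = 0.25 or y = 0.75

The peaked conditional supports the multivalued mapping
x -> {0.25, 0.75}, which is what makes it "informative" in the sense
above, even though its entropy is much lower than that of the flat one.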

Miguel

-- 
Miguel A Carreira-Perpinan
Dept. of Neuroscience, Box 571464      Tel. (202) 6878679                  
School of Medicine                     Fax  (202) 6870617                  
Georgetown University                  mailto:miguel@cns.georgetown.edu
Washington, DC 20057-1464, USA         http://cns.georgetown.edu/~miguel
