> The raison d'être of information, whatever its content or quantity, is
> to be used by an agent (biological or artificial).
In making this restriction you are limiting the domain of information to
communication and excluding all information that inheres in structure
per se. John Collier has called the latter manifestation "enformation",
and the calculus of information theory is quite effective in quantifying its extent.
Perhaps John would like to comment?
I developed this concept in order to reply to Jeff Wicken's complaint that Brooks and Wiley did not distinguish properly between the complement of entropy and structural information. I later used it in print, in the context of cognitive science and especially John Perry's use of information (see Barwise and Perry, Situations and Attitudes, and his "What Is Information?", as well as Dretske's book on information and perception), to discuss what the world must be like in order to make sense of information coming from the world into our brains.

The article is "Intrinsic Information" (1990), in P. P. Hanson (ed.), Information, Language and Cognition: Vancouver Studies in Cognitive Science, Vol. 1 (originally University of British Columbia Press, now Oxford University Press, 1990), pp. 390-409. The details about information are there, but the gist is that it can be measured, is unique, and depends on time scale, which distinguishes it from informational entropy in information systems. The uniqueness hypothesis was developed very carefully by my former student Scott Muller in his PhD thesis, published as Asymmetry: The Foundation of Information (The Frontiers Collection, Springer, 2007).
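As a concrete aside (my own illustration, not drawn from Collier's article), the claim that such structural information "can be measured" with the standard information-theoretic calculus can be sketched minimally: Shannon entropy over a structure's symbol distribution gives one such quantification.

```python
from collections import Counter
from math import log2

def shannon_entropy(structure: str) -> float:
    """Shannon entropy (bits per symbol) of the symbol distribution
    found in a structure, here represented as a string."""
    counts = Counter(structure)
    total = len(structure)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A maximally regular structure carries zero entropy per symbol...
print(shannon_entropy("aaaa"))  # 0.0
# ...while an even mix over a two-symbol alphabet carries the maximum, 1 bit.
print(shannon_entropy("abab"))  # 1.0
```

This measures only the distributional side of the matter, of course; the time-scale dependence discussed above is what separates intrinsic information from this kind of informational entropy.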
I am rather busy now at a conference, or else I would say more here.
Professor John Collier colli...@ukzn.ac.za
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292 F: +27 (31) 260 3031
_______________________________________________
fis mailing list
firstname.lastname@example.org
https://webmail.unizar.es/cgi-bin/mailman/listinfo/fis