Pridi, Krassimir, List:

(In order to place this comment in context, and for reference, I have copied 
Krassimir's "definition" of information below. My comments follow the excellent 
post of Pridi.)

> In physical world there exist only reflections but not information. 
> 
> Information "i" is the quadruple: 
> i = (s, r, e, I) 
> where 
> s is a source entity, which is reflected in r 
> r is the entity in which reflection of s exists 
> e is an evidence for the subject I which proves for him and only for him that 
> the reflection in r reflects just s, i.e. the evidence proves for the 
> subject what the reflection reflects. 
> I is information subject who has possibility to make decisions in accordance 
> with some goals – human, animal, bacteria, artificial intelligent system, 
> etc. 
> 
> In other words, information is a reflection, but not every reflection is 
> information – only reflections for which the quadruple above exists are 
> assumed as information by the corresponding subjects. 
> 
> For different I, information may be different because of subjects' finite 
> memory and reflection possibilities. 
> Because of this, a physical event with an infinite bandwidth may have finite 
> information content (for a concrete information subject). 
On Jul 23, 2014, at 6:45 AM, Pridi Siregar wrote:

> Dear Krassimir,
> 
> Thank you for your explanation. It does give me a better understanding of how 
> information (beyond Shannon) can be formalized! However, a closer look at the 
> formalism and its semantics does raise new questions:
> 
> From the definition you have given, it appears that information cannot be 
> viewed in any absolute sense but as internal representations of "external 
> patterns" whose meaning depends on the subject capturing/interpreting/storing 
> the said patterns. In this framework then, it seems that "information" cannot 
> be conceptualized without reference to both the "something out there" and the 
> "internal structures" of the receptor/cognitive system. 
> 
> In other words the concept of "information" lies within some "subjective" 
> (albeit rational) realm. I'm sure that I'm stating the obvious for most of 
> FIS members but a question arose upon reading your formalism: How can we 
> really quantify meaningful (semantic) information beyond Shannon (that 
> disregards semantics) and his purely statistical framework? Or beyond 
> Boltzmann's entropy/Information based on micro-macro states ratios?
> 
> When we formalize i = (s, r, e, I) there is a "meta-level" formalisation 
> that is only apparent since even (s,r) reflect our own (human) subjective 
> world-view. We could actually write (I1(s), I1(r), e, I2) where I1 and I2 are 
> two distinct cognitive systems, both of which lie at the OBJECT level of 
> the formalizing agent, which is NEITHER I1 NOR I2. All "objective" measures 
> (entropy, negentropy, ...) are actually totally dependent on I1 and I2 and can 
> never be considered as "absolute". 
> 
> 
> This leads me to a second question (sorry for the lengthy message): there are 
> some researchers that posit that "information" may be more fundamental than 
> the fundamental physical quantities (mass, time, space, amps). This appears (and perhaps 
> only appears) to be at the opposite end of the above-mentioned view. Indeed, 
> in this framework some kind of "universal" or "absolute" notions must be 
> accepted as true.
> 
> One apparent way out would be to demonstrate that information somehow 
> logically entails the fundamental physical entities while accepting that we 
> are still within a human-centered world view. And thus no "absolute truth" 
> (whatever this means) is really gained. "Only" a richer, more complete 
> (subjective but coherent) world-view. 
> 
> Am I making any sense? Any thoughts?
> 
> Best
> 
> Pridi         
> 

Pridi's comment concurs with many of my views with respect to the concept of 
information. 

Krassimir's assertion of a quadruple of symbols is rather close to the 
philosophy of C. S. Peirce (hereafter "CSP") in one context.

S as symbol represents an external source of signal, that which is independent 
of the individual mind and being.  This is analogous to CSP's term "sinsign".

R is the entity in which the reflection of S exists.  That is, S is reflected 
in R.

E as evidence is a vague term which implies an observer (2nd Order Cybernetics?) 
that both receives and evaluates the reflection of the source (S) in the entity 
(R).  CSP categorizes signs as icon, index or symbol with respect to the entity 
of observation.

I as Krassimirian information is a personal judgment about the evidence.  
(Correspondence with CSP's notion of "argument" is conceivable.) 

Krassimir's assertion that: 
> For different I, information may be different because of subjects' finite 
> memory and reflection possibilities. 
> Because of this, a physical event with an infinite bandwidth may have finite 
> information content (for a concrete information subject). 

 
moves these 'definitions' of individual symbols into the subjective realm 
(CSP's notion of the "interpretant"?).
Different researchers have the freedom to interpret the evidence as they 
choose, including the relationships to engineering terms such as "bandwidth".


Pridi's post appropriately recognizes the tension between objective scientific 
theories and subjective judgments about evidence by different individuals with 
different professional backgrounds and different symbolic processing powers.

The challenge for Krassimirian information, it appears to me, is to show that 
these definitions of symbols motivate a coherent symbol system that can be used 
to transfer the information contained in a signal via symbolic representations 
of entities. It may work for engineering purposes, but is it extendable to life?

(For me, of course, this requires the use of multiple symbol systems and 
multiple forms of logic in order to gain the functionality of transfer of 
"in-form" between individuals or machines.)

Pridi writes:
>  How can we really quantify meaningful (semantic) information beyond Shannon 
> (that disregards semantics) and his purely statistical framework?

One aspect of this conundrum was solved by chemists over the past two 
centuries by developing a unique symbol system that is restricted by physical 
constraints, yet functions as an exact mode of communication. 

Chemical notation, as a symbol system, along with mathematics and data, achieves 
this end purpose (entelechy) of communication for some entities, such as the 
meaning of an "atomic number" as a relational term and hence the meaning of a 
particular integer as both quantity and quality. 

This requires a dyadic mathematics and synductive logic for sublations.
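
As a toy illustration (mine, not the chemists' formalism) of an integer read 
as both quantity and quality: the atomic number simultaneously answers "how 
many protons?" and "which element?", so the same symbol carries an exact 
relational meaning.

```python
# Toy sketch: atomic number Z read as both quantity and quality.
# The element assignments are standard chemistry; the function is
# merely illustrative, not any formal system from the discussion.

ELEMENTS = {1: "H", 6: "C", 8: "O", 26: "Fe"}  # atomic number -> symbol

def describe(z: int) -> str:
    """The same integer z names a count of protons (quantity)
    and picks out a unique element (quality)."""
    symbol = ELEMENTS.get(z, "?")
    return f"Z={z}: {z} protons, element {symbol}"

print(describe(6))   # Z=6: 6 protons, element C
print(describe(26))  # Z=26: 26 protons, element Fe
```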


Pridi writes:

> It does give me a better understanding of how information (beyond Shannon) 
> can be formalized! 

Can you communicate how this "better understanding ... formalized" works? 
 

It is not readily apparent to me how Krassimirian information can be formalized.

Anybody have any suggestions on how this quadruple of symbols can be formalized 
into a quantitative coherent form of communication?
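
One possible direction, offered only as a toy sketch (my own construction, 
not Krassimir's formalism): treat the quadruple as a data structure and make 
"being information" a predicate that is relative to the subject I, so the same 
reflection can be information for one subject and not for another.

```python
# A minimal sketch of i = (s, r, e, I) as a data structure.
# All names and the evidence test are illustrative assumptions.

from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Reflection:
    source: str    # s: the source entity
    receiver: str  # r: the entity in which the reflection of s exists
    evidence: str  # e: evidence that r's reflection reflects just s

@dataclass
class Subject:
    """I: an information subject with its own finite recognition abilities."""
    name: str
    accepts: Callable[[Reflection], bool]  # the subject's own test of e

def is_information(refl: Reflection, subject: Subject) -> bool:
    """A reflection counts as information only for subjects whose
    evidence test accepts it -- information is subject-relative."""
    return subject.accepts(refl)

# Example: a photograph reflects a tree; one subject recognizes trees,
# another cannot.
photo = Reflection(source="tree", receiver="photograph", evidence="visual match")
botanist = Subject("botanist", accepts=lambda rf: rf.source == "tree")
bacterium = Subject("bacterium", accepts=lambda rf: False)

print(is_information(photo, botanist))   # True
print(is_information(photo, bacterium))  # False
```

This does not quantify semantic content, of course; it only makes the 
subject-relativity of the definition explicit and machine-checkable.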

Cheers

Jerry 
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
