### Re: [Fis] Data - Reflection - Information

```Dear Bob,

In your Shannon Exonerata paper you have an example of three strings, their
entropies and their mutual information. I very much admire this paper and
particularly the critique of Shannon and the emphasis on the apophatic, but
some things puzzle me. If these are strings of a living thing, then we can
assume that these strings grow over time. If sequences A,B and C are related,
then the growth of one is dependent on the growth of the other. This process
occurs in time. During the growth of the strings, even the determination of
what is and is not surprising changes with the distinction between what is seen
to be the same and what isn't.

I have begun to think that it's the relative entropy between growing things
(whether biological measurements, lines of musical counterpoint, learning) that
matters. Particularly as mutual information is a variety of relative entropy.
There are dynamics in the interactions. A change in entropy for one string with
no change in entropy in the others (melody and accompaniment) is distinct from
everything changing at the same time (that's "death and transfiguration"!).

Shannon's formula isn't good at measuring change in entropy. It's less good
with changes in distinctions which occur at critical moments ("aha! A
discovery!" or "this is no longer surprising"). The best that we might do, I've
thought, is segment your strings over time and examine relative entropies. I've
done this with music. Does anyone have any other techniques?

On the apophatic, I can imagine a study of the dynamics of Ashby's homeostat
where each unit produced one of your strings. The machine comes to its solution
when the entropies of the dials are each 0 (redundancy 1). As the machine
approaches its equilibrium, the constraint of each dial on every other can be
explored by the relative entropies between the dials. If we wanted the machine
to keep on searching and not settle, it's conceivable that you might add more
dials into the mechanism as its relative entropy started to approach 0. What
would this do? It would maintain a counterpoint in the relative entropies
within the ensemble. Would adding the dial increase the apophasis? Or the
entropy? Or the relative entropy?

Best wishes,

Mark

-----Original Message-----
From: "Robert E. Ulanowicz"
Sent: 09/10/2017 15:20
To: "Mark Johnson"
Cc: "foundationofinformationscience"
Subject: Re: [Fis] Data - Reflection - Information

> A perspectival shift can help of the kind that Gregory Bateson once talked
> about. When we look at a hand, do we see five fingers or four spaces?
> Discourses are a bit like fingers, aren't they?

Mark,

The absence of the absent was a major theme of Bateson's, and he
criticized physics for virtually neglecting the missing.

Fortunately, IT is predicated on the missing, and quantitative information
is a double negative (the reverse of what is missing). This makes IT a
prime tool for broadening our scope on reality.

Bob

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis

```
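Mark's proposal of segmenting a growing string and examining relative entropies between the segments is easy to prototype. The sketch below is only an illustration of the idea, not code from the original thread; the fixed segment width and the add-one smoothing are my own assumptions:

```python
from collections import Counter
import math

def distribution(segment, alphabet):
    """Relative symbol frequencies with add-one smoothing, so the
    divergence stays finite when a symbol is absent from a segment."""
    counts = Counter(segment)
    total = len(segment) + len(alphabet)
    return {s: (counts[s] + 1) / total for s in alphabet}

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(p[s] * math.log2(p[s] / q[s]) for s in p)

def segment_divergences(string, width):
    """D(current || previous) for each pair of consecutive segments."""
    alphabet = sorted(set(string))
    segments = [string[i:i + width]
                for i in range(0, len(string) - width + 1, width)]
    dists = [distribution(seg, alphabet) for seg in segments]
    return [relative_entropy(cur, prev) for prev, cur in zip(dists, dists[1:])]

# A string whose statistics change midway shows a divergence spike.
print(segment_divergences("ABABABABCCCCCCCC", 8))
```

A spike in the divergence sequence marks one of Mark's critical moments, where what counts as surprising has changed; a near-zero value means the statistics are stationary across that segment boundary.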

### Re: [Fis] I agree with your considerations.

```Dear Krassimir and friends
I often follow Fis discussions with much interest but rarely contribute.
However, your current information/data debate is too good to miss; beautifully
summarised by Krassimir's simple equation, using carefully chosen entities.
I don't quite get the d = r.
It seems to me that Data (that which is given) is objective:
the combination of discrete entities or disturbances (energy bundles, photons,
sounds, numbers, letters etc.), and that Information (that which is created) is
subjective: 'collated or interpreted data', dependent upon, and possibly
existing uniquely in, the eye/mind of each beholder (your Else).
The ultimate Potential, awesome dark, silent, empty space, is the medium within
which data is potentially transformed, by interpretation, into information,
meaning, beauty and Love. Can we see Light? Not without objects/data to reveal
it. We think we see objects, but actually we only see Light. It just needs
objects and observers for its Revelation. What colour is a dew drop?!
Wow, our universe is Wonder-full, yet so desperately sad too. It's almost as
if Love needs pain for its revelation, just as Light needs the Universe.
Please forgive my ramblings; thank you very much Krassimir and all for focusing
my attention on these mysteries
Best wishes
David
PS Perhaps your 'r' is for 'reflection' of the light, sound

From: Sungchul Ji
To: Foundation of Information Science ;
"mar...@foibg.com"
Sent: Monday, 9 October 2017, 16:44
Subject: Re: [Fis] I agree with your considerations.

Hi Krassimir,
You have my permission. Good luck.
Sung

From: Fis on behalf of Krassimir Markov

Sent: Monday, October 9, 2017 5:32:59 AM
To: Foundation of Information Science
Subject: [Fis] I agree with your considerations.

Dear Yixin, Sung, Terry, Mark,
and FIS Colleagues,

Let me remark that the General Information Theory is much more than a
single concept. You have seen that I have some answers in advance due to

What is important now is to finish this step and after that to continue
with the next. It may be just the idea about meaning.

What we have till now is the understanding that information is something
more than data. In other words:

d = r
i = r + e

where:

d => data;
i => information;
r => reflection;
e => something Else, internal for the subject (interpreter, etc.).

I need a week to finish our joint publication and to send it to the
co-authors for final editing and, after that, for review.

Dear Sung, Terry, and Mark, if you agree and give me permission, I shall
include your considerations at the end of the paper, in the section
"Future work", and shall include you among the co-authors of the paper.

My next (second) post will be at the end of the week.

Thank you very much for your great effort!

Friendly greetings
Krassimir


```

### Re: [Fis] I agree with your considerations.

```Hi Krassimir,

You have my permission.

Good luck.

Sung

From: Fis  on behalf of Krassimir Markov

Sent: Monday, October 9, 2017 5:32:59 AM
To: Foundation of Information Science
Subject: [Fis] I agree with your considerations.

Dear Yixin, Sung, Terry, Mark, and FIS Colleagues,

Let me remark that the General Information Theory is much more than a
single concept. You have seen that I have some answers in advance due to

What is important now is to finish this step and after that to continue
with the next. It may be just the idea about meaning.

What we have till now is the understanding that information is something
more than data. In other words:

d = r
i = r + e

where:

d => data;
i => information;
r => reflection;
e => something Else, internal for the subject (interpreter, etc.).

I need a week to finish our joint publication and to send it to the
co-authors for final editing and, after that, for review.

Dear Sung, Terry, and Mark, if you agree and give me permission, I shall
include your considerations at the end of the paper, in the section
"Future work", and shall include you among the co-authors of the paper.

My next (second) post will be at the end of the week.

Thank you very much for your great effort!

Friendly greetings
Krassimir


```

### Re: [Fis] Data - Reflection - Information

```
> A perspectival shift can help of the kind that Gregory Bateson once talked
> about. When we look at a hand, do we see five fingers or four spaces?
> Discourses are a bit like fingers, aren't they?

Mark,

The absence of the absent was a major theme of Bateson's, and he
criticized physics for virtually neglecting the missing.

Fortunately, IT is predicated on the missing, and quantitative information
is a double negative (the reverse of what is missing). This makes IT a
prime tool for broadening our scope on reality.

Bob


```

### Re: [Fis] Principles of Information

```

Dear Michel,

In the draft version of my post, I followed the mention of the latest Nobel
Prize award with a remark:

"All FISers pretend to be Einstein; no one bothers himself with a (LIGO)
detector building."

Then I decided that the phrase was unnecessarily harsh and replaced it with
the "citations from Aristotle, Plato, and others" passage.

You are right: the citations could be a particular type of IF assumption.
Generally they can be, but in this case they are not!

Loet recently presented a much more elegant formulation:

"Nobody of us provide an operative framework and a single (just one!)
empirical testable prevision able to assess 'information'."

Thank you for a concerned reading,

Best regards,

Emanuel.

---

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Michel Godron
Sent: Sunday, October 08, 2017 12:07 AM
To: fis@listas.unizar.es
Subject: Re: [Fis] Principles of Information

The "citations from Aristotle, Plato, Ortega, Leibnitz," are a particular
type of IF " hypothetic assumptions". They cannot be falsifiable as the
hypothesis of gravitional waves, but they may be discussed rationnally as
starting points for principles and definitions of information.

Cordially, M. Godron

On 06/10/2017 at 18:26, Emanuel Diamant wrote:

Dear FISers,

I heartily welcomed Pedro's initiative to work out some principles for the
information-definition quest. But the upsetting discussion unrolled around
the issue pushes me to restrain my support for Pedro's proposal. The
problem (in my understanding) is that FIS discussants are violating the
basic rule of any scientific discourse: the IF/THEN principle.

We usually start our discourse with a hypothetic assumption (the IF part of
an argument) which is affirmed later by a supporting evidence or by a
prediction that holds under the given assumptions (the THEN part of the
statement).

The universality of this principle was vividly demonstrated by the recent
Nobel Prize in Physics award.

A hundred years ago, Albert Einstein predicted the existence of
gravitational waves, but only the construction of the LIGO detector
(implementing the if-then principle) made the observation of gravitational
waves possible.

Information will become visible and palpable only when an if-then grounded
probe (or an if-then grounded approach) is devised and put into use.

Until then, long citations from Aristotle, Plato, Ortega, and Leibnitz,
alongside extensive self-citations, will not help us to master the
unavoidable if-then way of thinking.

Sincerely yours,

Emanuel.


```

### Re: [Fis] Data - Reflection - Information

```Dear Krassimir,

thank you for undertaking this project of an "Anthology of Contemporary
Ideas on Information".

Let me take up your kind encouragement (not published in fis) and offer the
following contribution (through fis, in order to be able to request the
colleagues to offer their comments: thanks):

(let me hope that the word processor of the fis server will deal correctly
with the formatting used)

The Algorithm of Information Content

The approach we propose needs some introduction, as it makes use of
combinations of techniques which have not been used together.

1) Material we work with

We begin by creating elements of a set.

The set we construct to demonstrate how to establish information content
consists of realisations of the logical sentence *a+b=c. *That is, if we
use *d *distinguishing categories of elements, we shall have a set that
contains the elements *{(1,1), (1,2), (2,2), (1,3), (2,3), (3,3), (1,4),
(2,4), …., (d,d)}. *These elements we refer to as *(a,b), a <=b.*

The number *n *of the elements of the set is of course dependent of *d*, *n
= f(d). * *n = d(d+1)/2.*

While the *principle* of information management is valid over a wide range
of values of *d*, it can be shown (OEIS A242615) that, for numerical
reasons, the *efficiency* of information management is highest when using
*d=16*, which yields *n=136*. Nature also appears to use this
mathematically optimal method of information transmission.

2) What we look into the material 1: properties of the elements

We use the set of elements so created as a kind of Rorschach cards,
reading aspects into them.

We use, next to the traditional aspects *{a, b, c=a+b}*, some additional
aspects of *a+b=c*, namely *{u=b-a, k=2b-a, t=3b-2a, q=2a-b, s=(d+1)-(a+b),
w=3a-2b}*; that is, altogether *9 aspects of a+b=c*.

Users are of course free, and invited, to introduce additional or different
aspects with which to categorise logical sentences. The *number* of aspects
need not be higher than 8 if they are used in combination (we refer here
again to the facts discussed in OEIS A242615), and as to the *kinds* of
aspects: one is always open to improvements.

3) What we look into the material 2: properties of the set

We impose sequential orders on the elements of the set, using combinations
of aspects.

We generate *sequencing aspects* by using 2 of the 9 *primary aspects* at a
time, creating sequential orders within the set such that each of the
primary aspects is once the *first* and once the *second* ordering aspect.
That is, we sequence the set on the criteria *{ab, ac, ak, au, …, as, aw,
ba, bc, bu, …, bw, ca, cb, …, cw, ka, kb, …, wt, ws}*. This brings forth 72
sequential enumerations of the elements of the set. Of these, about 20 are
actually different. (The inexactitude regarding the number of identical
sequential enumerations has to do with the *sequence* of the primary
aspects and will be of fundamental importance in the course of the
applications of the model.)

The 72 sequences into which the elements of the set have been brought (some
of which differ in name only) are called the *catalogued sequences*. These
are by no means random, but are as closely related to each other as aspects
of *a+b=c* can be. Each of the catalogued sequences is equally legitimate,
and each is an implicated corollary of *a+b=c* that has now been made
explicit (= realised).

4) What we observe within the material 1: logical conflicts

We will not ignore conflicts between place and inhabitant, inhabitant and
place.

It is obvious that 2 different catalogued orders unveil logical conflicts.
If in order *αβ* element *e* is to be found on place *p1*, and in order *γδ*
element *e* is to be found on place *p2*, there is apparently a conflict.

The same conflict can also be stated with the formulation: if in order *αβ*
on place *p* element *e1* is to be found, and in order *γδ* on place *p*
element *e2* is to be found, there is apparently a conflict.

We observe potentially or actually conflicting assignments of a sequential
number *{1..n}* to one and the same element of the set, depending on which
of the catalogued orders we deem to be actually the case. As we decline to
entertain an epistemological attitude in which human decisions create and
dissolve logical conflicts, we look into methods of solving these potential
and realised conflicts. As the two orders *αβ* and *γδ*, if they are
different, create dislocations of the elements relative to their own
conceptions of the correct place for an individual element, we speak of the
*consolidation of dislocations* that we aim at.

5) What we look into the material 3: series of place changes

We transform linear arrangement *αβ* into linear arrangement *γδ*.

We have observed that if we once order the set sequentially according to
sorting order *αβ* (say, e.g. on arguments: *a,b*), and then ```
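Steps 1–3 of the construction above can be prototyped directly. The sketch below is my own reading of the post, not code by its author: the aspect formulas are copied from section 2, a two-key sort stands in for the "sequencing aspects", and any remaining ties are left in construction order.

```python
from itertools import permutations

D = 16  # number of distinguishing categories; the post argues d = 16 is optimal

# Step 1: the set of realisations of a + b = c with a <= b
elements = [(a, b) for b in range(1, D + 1) for a in range(1, b + 1)]
assert len(elements) == D * (D + 1) // 2  # n = d(d+1)/2, i.e. 136 for d = 16

# Step 2: the nine primary aspects of a + b = c named in the post
aspects = {
    'a': lambda a, b: a,
    'b': lambda a, b: b,
    'c': lambda a, b: a + b,
    'u': lambda a, b: b - a,
    'k': lambda a, b: 2 * b - a,
    't': lambda a, b: 3 * b - 2 * a,
    'q': lambda a, b: 2 * a - b,
    's': lambda a, b: (D + 1) - (a + b),
    'w': lambda a, b: 3 * a - 2 * b,
}

# Step 3: one catalogued sequence per ordered pair of distinct aspects
# (the first aspect is the primary sort key, the second breaks ties),
# giving 9 * 8 = 72 sequential enumerations of the set.
catalogued = {}
for x, y in permutations(aspects, 2):
    key = lambda e, fx=aspects[x], fy=aspects[y]: (fx(*e), fy(*e))
    catalogued[x + y] = sorted(elements, key=key)

# The post claims only about 20 of the 72 enumerations are actually different.
distinct = {tuple(seq) for seq in catalogued.values()}
print(len(catalogued), len(distinct))
```

Counting `distinct` makes the post's claim that "about 20 are actually different" checkable, and comparing two entries of `catalogued` exhibits the place/inhabitant conflicts discussed in step 4.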

### [Fis] I agree with your considerations.

```Dear Yixin, Sung, Terry, Mark, and FIS Colleagues,

Let me remark that the General Information Theory is much more than a
single concept. You have seen that I have some answers in advance due to

What is important now is to finish this step and after that to continue
with the next. It may be just the idea about meaning.

What we have till now is the understanding that information is something
more than data. In other words:

d = r
i = r + e

where:

d => data;
i => information;
r => reflection;
e => something Else, internal for the subject (interpreter, etc.).

I need a week to finish our joint publication and to send it to the
co-authors for final editing and, after that, for review.

Dear Sung, Terry, and Mark, if you agree and give me permission, I shall
include your considerations at the end of the paper, in the section
"Future work", and shall include you among the co-authors of the paper.

My next (second) post will be at the end of the week.

Thank you very much for your great effort!

Friendly greetings
Krassimir


```

### Re: [Fis] Data - Reflection - Information

```Which "information paradigm" is not a discourse framed by the education system?
The value of the discussion about information (circular though it appears to
be) is that we float between discourses. This is a strength. But it is also
the reason why we might feel we're not getting anywhere!

A perspectival shift can help of the kind that Gregory Bateson once talked
about. When we look at a hand, do we see five fingers or four spaces?
Discourses are a bit like fingers, aren't they?

Mark

-----Original Message-----
From: "Terrence W. DEACON"
Sent: 09/10/2017 01:31
To: "Sungchul Ji"
Cc: "foundationofinformationscience"
Subject: Re: [Fis] Data - Reflection - Information

Against "meaning"

I think that there is a danger of allowing our anthropocentrism to bias the
discussion. I worry that the term 'meaning' carries too much of a linguistic
bias. By this I mean that it is too tempting to use language as our
archetypal model when we talk about information. Language is rather the
special case, the most unusual communicative adaptation ever to have
evolved, and one that grows out of and depends on informational/semiotic
capacities shared with other species and with biology in general.
So I am happy to see efforts to bring in topics like music or natural signs
like thunderstorms and would also want to cast the net well beyond humans to
include animal calls, scent trails, and molecular signaling by hormones. And it
is why I am more attracted to Peirce and worried about the use of Saussurean
concepts.
Words and sentences can indeed provide meanings (as in Frege's Sinn - "sense" -
"intension") and may also provide reference (Frege's Bedeutung - "reference" -
"extension"), but I think that it is important to recognize that not all signs
fit this model.

Moreover, a sneeze is often interpreted as evidence about someone's state of
health, and a clap of thunder may indicate an approaching storm.
These can also be interpreted differently by my dog, but it is still
information about something, even though I would not say that they mean
something to that interpreter. Both of these phenomena can be said to provide
reference to something other than that sound itself, but when we use such
phrases as "it means you have a cold" or "that means that a storm is
approaching" we are using the term "means" somewhat metaphorically (most often
in place of the more accurate term "indicates").

And it is even more of a stretch to use this term with respect to pictures or
diagrams.
So no one would say that a specific feature, like the ears in a caricatured
face, means something.
Though if the drawing is employed in a political cartoon, e.g. with exaggerated
ears, and the whole cartoon is assigned a meaning, then perhaps the exaggeration
of this feature may become meaningful. And yet we would probably agree that
every line of the drawing provides information contributing to that meaning.

So basically, I am advocating an effort to broaden our discussions and
recognize that the term information applies in diverse ways to many different
contexts. And because of this it is important to indicate the framing, whether
physical, formal, biological, phenomenological, linguistic, etc.
For this reason, as I have suggested before, I would love to have a
conversation in which we try to agree about which different uses of the
information concept are appropriate for which contexts. The classic
syntax-semantics-pragmatics distinction introduced by Charles Morris has often
been cited in this respect, though it too is, in my opinion, too limited. I have
suggested a parallel, less linguistic (and nested, in Stan's subsumption sense)
way of making the division: i.e. into intrinsic, referential, and normative
analyses/properties of information.

Thus you can analyze intrinsic properties of an informing medium [e.g. Shannon
etc.] irrespective of these other properties, but you can't make sense of
referential properties [e.g. what something is about, conveys] without
considering intrinsic sign vehicle properties, and can't deal with normative
properties [e.g. use value, contribution to function, significance, accuracy,
truth] without also considering referential properties [e.g. what it is about].

In this respect, I am also in agreement with those who have pointed out that
whenever we consider referential and normative properties we must also
recognize that these are not intrinsic and are interpretation-relative.
Nevertheless, these are legitimate and not merely subjective or nonscientific
properties, just not physically intrinsic. I am sympathetic with those among us
who want to restrict analysis to intrinsic properties alone, and who defend ```