Re: [Fis] a limited response

2017-03-30 Thread Andrew Fingelkurts / BM-Science
Dear FIS colleagues,

 

Not being a mathematician or a physicist, I too must apologise for a
potentially inaccurate usage of the term 'information'. At the same time, I
tend to accept the definition quoted below, that "Information is a
description of structures observable in a given data set". I work with the
living human brain and try to figure out how its dynamics are related to
the dynamics of the mind. It seems that such information is indeed present
in the brain-mind system in a very special form: ordered sequences of
metastable states, which are instantiated by nested spatio-temporal
coalitions among simple neuronal assemblies and larger coupled
conglomerates of them, the so-called delocalised operational modules.
Those who are interested in the details may look at the recent concise
open-access paper: http://www.mdpi.com/2078-2489/8/1/22

 

A more detailed analysis is provided in
http://www.bm-science.com/team/art61.pdf and in
http://www.bm-science.com/team/art76.pdf

 

Thank you all for your time and consideration.

 

Greetings,

Andrew

 

[Fis] a limited response

2017-03-28 Thread Emanuel Diamant
Dear FIS colleagues,

 

As usual, I would like to begin with apologies. I apologize that, because
of gaps in my education, I can only partially understand what is being said
in most of your mails. Therefore, I will respond only to those segments of
your posts that seem to lie within the limits of my understanding.

 

To Karl Javorszky: 

On March 24, you wrote: "I have given in my work 'Natural orders - de
ordinibus naturalibus' (ISBN 9783990571378) the following definition of the
term 'information': Information is a description of what is not the case".

I do not know "what is not the case", but I salute and welcome your
statement that "Information is a description." I have also been using, for
quite a long time now, a similar definition: "Information is a description
of structures observable in a given data set".

In saying this, I do not claim priority or credit - all credit must go to
A. Kolmogorov, who in his 1965 paper "Three approaches to the quantitative
definition of information" was the first to introduce the concept.

Like all other researchers of his time, Kolmogorov developed his measure of
information quantity for a linear, one-dimensional communication-message
data set. I have extended Kolmogorov's definition to two-dimensional data
sets. In a two-dimensional data set, two types of structures can be
distinguished: primary (basic) data structures and secondary (meaningful,
structures-of-structures) data arrangements. According to the proposed
definition, the descriptions of the discerned structures should be called
Physical and Semantic Information, respectively. Further details on the
subject can be found in my publications on ResearchGate
(https://www.researchgate.net/profile/Emanuel_Diamant) or on my site
(http://www.vidia-mant.info).
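
As a minimal illustration only (a sketch, not the actual procedure used in
my papers): since Kolmogorov complexity is uncomputable, a common practical
proxy for the description length of a data set - whether a one-dimensional
message or a two-dimensional array serialized row by row - is the size of
its losslessly compressed encoding. In Python:

import os
import zlib

def description_length(data: bytes) -> int:
    """Approximate the algorithmic information content of `data`
    as the byte length of its lossless (zlib) compression."""
    return len(zlib.compress(data, 9))

# One-dimensional communication message (Kolmogorov's original setting):
message = b"abab" * 256                      # highly regular sequence
print(description_length(message))           # small: the regularity is captured

# Two-dimensional data set, serialized row by row:
grid = b"".join(bytes((x ^ y) & 0xFF for x in range(32)) for y in range(32))
print(description_length(grid))              # structured 2D pattern compresses well

noise = os.urandom(len(grid))                # unstructured data of the same size
print(description_length(noise))             # close to raw size: no structure found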

 

To Sungchul Ji (introduced in Pedro C. Marijuan's post from March 23, 2017):


From your presentation "Planckian information: a new measure of order", I
was pleased to learn something new about Planckian information, a newborn
kind of information. Although you are not familiar with the notion of
information as a complex two-part entity (with Physical and Semantic
information subdivisions), you rightly posit Planckian information as an
exemplar of physical information, similar to other representatives of the
class such as Shannon, Fisher, Kolmogorov, and Chaitin information.

In your words: "The Planckian information represents the degree of
organization of physical (or nonphysical) systems", and "Planckian
information is primarily concerned with the amount (and hence the
quantitative aspect) of information. There are numerous ways that have been
suggested in the literature for quantifying information beside the
well-known Hartley information, Shannon entropy, algorithmic information,
etc." That is, Planckian information is one of them (one of the
manifestations of physical information): not a foe, not a competitor, not a
foreigner or an outsider.
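
For concreteness, two of the measures named above are easy to state (a
sketch of my own, not taken from your presentation): Hartley information is
the base-2 logarithm of the number of distinct symbols in a data set, and
Shannon entropy is -sum(p * log2 p) over the observed symbol frequencies:

import math
from collections import Counter

def hartley_information(symbols) -> float:
    """Hartley information: log2 of the alphabet size actually used."""
    return math.log2(len(set(symbols)))

def shannon_entropy(symbols) -> float:
    """Shannon entropy in bits per symbol, from empirical frequencies."""
    counts = Counter(symbols)
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

data = "aaaaaaab"                       # two symbols, heavily skewed
print(hartley_information(data))        # 1.0 bit (two distinct symbols)
print(shannon_entropy(data))            # ~0.544 bits: skew lowers the average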

 

It has to be mentioned that such an approach is not predominant in FIS
discussions. The mainstream way of thinking looks like this: "complaining
about Shannon entropy as a measure of information is completely justified
because it is steam-engine physics unfortunately still widely used despite
its many flaws and limitations"; and further "Shannon entropy should not
even be mentioned any longer in serious discussions about information"
(http://listas.unizar.es/pipermail/fis/2016-June/001039.html). And finally:
"(there is an) urgent need to move away from entropy towards algorithmic
information" (http://sciforum.net/conference/IS4SI-2017/isis-ICPI%202017). 

 

I hope these unfriendly winds will not discourage you. I wish you a speedy
and comfortable integration into the FIS community.

 

Best regards,

Emanuel Diamant.

 
