RE: [FIS] General remark

2006-10-27 Thread Loet Leydesdorff






  And to reiterate: are we talking about
  information as a concept, or as a variable? If we talk variable, we should be
  aware of the above-listed limitations. If we talk concept, then
  Shannon-Boltzmann is a misunderstanding, in the same way as the object as a
  whole and the mass of an object (in kilograms) are not the same.
   
We can consider a concept as a variable which is
measured at the nominal level, that is, in terms of descriptors. The advantage
of Shannon's (not Boltzmann's) definition, it seems to me, is that it formalizes
information as a variable. It can be provided with meaning, namely uncertainty.
However, this meaning is not yet substantive, unlike the impact of a
meaningful message on the stock exchange. Meaning can only be provided to
Shannon-type information by a system.
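As an illustrative sketch (the descriptors and data below are hypothetical), Shannon's H formalizes exactly this: the uncertainty of a nominal variable, computed from the frequencies of its descriptors:

```python
from collections import Counter
from math import log2

def shannon_entropy(observations):
    """Shannon entropy H = -sum(p_i * log2(p_i)) of a nominal variable,
    with probabilities estimated from observed descriptor frequencies."""
    counts = Counter(observations)
    n = len(observations)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Eight observations of a hypothetical nominal variable with four descriptors:
descriptors = ["a", "a", "b", "c", "d", "a", "b", "a"]
print(shannon_entropy(descriptors))  # 1.75 bits of uncertainty
```

The number quantifies uncertainty only; whether that uncertainty means anything to anyone is, as argued above, a question for the receiving system.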
 
With best wishes, 
 
 
Loet






Loet Leydesdorff
Amsterdam School of Communications Research (ASCoR)
Kloveniersburgwal 48, 1012 CX Amsterdam
Tel.: +31-20-525 6598; fax: +31-20-525 3681
[EMAIL PROTECTED]; http://www.leydesdorff.net/


 
Now available: The Knowledge-Based Economy: Modeled, Measured, Simulated.
385 pp.; US$ 18.95.
The Self-Organization of the Knowledge-Based Society; The Challenge of Scientometrics.
 
 
___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis


[Fis] Reply to Igor Rojdestvenski: Information Coordinate System

2006-10-27 Thread Andrei Khrennikov
  Dear Igor,

I practically agree with you, especially that matter (biological,
non-biological, whatever) is a derivative concept, for we can speculate
about it only indirectly through the information we possess. As I have
pointed out a few times, the objective reality for me is not the reality
of material objects (I have even written a book about this); it is the
reality of information. There are information laws, and physical and
biological laws are special forms of such laws. Yes, I agree that
Shannon information, given through entropy and hence through
probability, is not information as such, but, we can say, an
information coordinate.
In your terminology, the problem that I would like to emphasize is that
we need more coordinates. I do not know of such an advanced information
coordinate system. QI differs from the classical case in using not a
fixed Kolmogorov probability space but a multi-probabilistic system. So
QI provides a better coordinate system, but I do not think that this is
the end of the information-coordinate story.
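A toy illustration of this point (my own sketch, not Andrei's formalism): for a single qubit, incompatible measurement contexts induce different outcome distributions, so the Shannon "coordinate" of one and the same state depends on the context, and no single fixed Kolmogorov space covers both:

```python
from math import cos, sin, log2, pi

def entropy(p):
    """Shannon entropy of an outcome distribution (zero terms skipped)."""
    return -sum(x * log2(x) for x in p if x > 0)

# Qubit state |psi> = cos(t)|0> + sin(t)|1>, here with t = pi/6.
t = pi / 6
# Outcome probabilities in the computational (Z) basis:
p_z = [cos(t) ** 2, sin(t) ** 2]
# Outcome probabilities in the Hadamard (X) basis |+>, |->:
p_x = [((cos(t) + sin(t)) ** 2) / 2, ((cos(t) - sin(t)) ** 2) / 2]

# Same state, two incompatible contexts, two different entropies:
print(round(entropy(p_z), 3), round(entropy(p_x), 3))
```

Each context is a perfectly classical probability space on its own; it is only the attempt to combine them into one fixed space that fails, which is the sense in which QI supplies a richer coordinate system.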

With Best Regards,

Andrei Khrennikov

Director of International Center for Mathematical Modeling in Physics, 
Engineering, Economy and Cognitive Sc.,
University of Vaxjo, Sweden


[FIS] General remark

2006-10-27 Thread Igor Rojdestvenski



Colleagues, 
 
Again a few sporadic remarks. We are always, as it
seems to me, entangled in the matter-information frame of thinking. Why do we
not simply accept that matter as such presents itself to us only as
information and through information? (Please give counterexamples if you wish.)
 
Hence, matter (biological, non-biological,
whatever) is a derivative concept, for we can speculate about it only indirectly
through the information we possess. A good "falsifying" example is given by the
General Theory of Relativity, which explicitly prohibits distinguishing between
accelerated movement and movement in a gravitational field.
 
Hence, much of what we used to call "matter" is, in 
fact, enveloped in the concept of information, as well as much more. 

 
 
As to Shannon and/or Boltzmann
probabilistic information: this is not information as such, but a certain
variable used to measure information, just as, for measurements of
matter, we use such variables as weight, volume, density, and so on.
 
We should more clearly separate what the subject
proper is from what our speculations about it are, our models of it, and our
suggestions on how to weigh it.
 
Otherwise we are little better than the student
who once gave me the following definition: "The Energy Conservation Law EQUALS
the sum of kinetic and potential energy".
 
Now about measures of information. When we talk
physics, the Shannon-Boltzmann definition kind of works -- again, not as a
definition but as a way to measure, evaluate, estimate. A little deviation, for
example the information contained in a certain text, and we are at a loss. Why?
Because we can measure:
a) The "bit" information content (Shannon).
b) A multitude of information contents based on different dictionaries.
c) The object-specific impact of information. A very short phrase, containing a
few bits of information, may throw the whole stock market up or down (then this
information impact is measured in billions of dollars) or get a nation into war
(then the impact is measured in damage and loss of lives). A very long citation
from "Catch-22" will certainly not have the same impact.
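Of the three, only point (a) is trivially computable. A sketch (the phrase and the zeroth-order character model are illustrative assumptions of mine):

```python
from collections import Counter
from math import log2

def bit_content(text):
    """Shannon bit content of a text under a zeroth-order character model:
    length times the per-symbol entropy of the character frequencies.
    This captures only the "bit" content -- nothing about dictionaries
    or real-world impact."""
    counts = Counter(text)
    n = len(text)
    h = -sum((c / n) * log2(c / n) for c in counts.values())
    return n * h

# A market-moving two-word phrase vs. a long Catch-22 citation:
print(bit_content("Sell now."))
print(bit_content("There was only one catch and that was Catch-22." * 3))
```

The second number is far larger, even though, as argued above, the short phrase could have the vastly greater impact; that gap is exactly the limitation of measure (a).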
 
And to reiterate: are we talking about
information as a concept, or as a variable? If we talk variable, we should be
aware of the above-listed limitations. If we talk concept, then
Shannon-Boltzmann is a misunderstanding, in the same way as the object as a
whole and the mass of an object (in kilograms) are not the same.
 
Yours, Igor Rojdestvenski
 


[Fis] posting in the list

2006-10-27 Thread Pedro Marijuan

Hi,

these days some of you are having difficulties posting to the list. Please
be patient. We are receiving lots and lots of spam on this server (in my
own account, around 200 each day), and I can do nothing with the computing
management about lowering the filters. Well, if your message is repeatedly
rejected, what you can do is ask someone else to introduce it, or send it to
me (and I will automatically re-enter it). The sense of urgency is
understandable, but we try to keep FIS a quiet, reflective list, where messages
can be posted today, or tomorrow...


yours,

Pedro

PS. By the way, Joseph Tainter (author of "The Collapse of Complex
Societies") has agreed to chair a FIS session on "Social Complexity" around
the beginning of December. Thanks are due to Igor Matutinovic for his kind
help in the preparation of this session.




Fw: [FIS] Comment to Karl's reply to Andrei

2006-10-27 Thread Arne Kjellman


 
- Original Message -
From: Arne Kjellman
To: Karl Javorszky; fis
Sent: Friday, October 27, 2006 10:52 AM
Subject: [FIS] Comment to Karl's reply to Andrei

Comment on Karl’s reply to Andrei:
please allow me to ask you not to include this person in your statement:
"we do not have at the moment the real understanding of
information. It is always reduced to the definition of probability,
through entropy."
I publicly state - and probably this is the best podium to state this -
that I do have at the moment the real understanding of information.
A: I guess you here claim you are a realist who has a real (true?)
understanding of information. A bold claim for a realist, to my
mind!

You make a logical error by stating that the idea of information is
always reduced to the definition of probability, through
entropy.
Let us separate the idea of information from its appearances (like
the idea of fire versus one particular burning fire).
A: Well, here you make a dualistic statement and immediately accept
the realist model of speaking, and thus assume the validity of realism –
which is an illegitimate position from which to criticize monism or even
science.
 
The idea of information is that - due to a small inexactitude in the
folding of one- into more-dimensional metrics - there is a basic flaw in
our counting system if we try to use it to understand outside reality
(which you have wisely assumed to exist).
A: If you try to defend realism, this is a “wise” and probably
necessary assumption – however, if you try to advance science beyond its
present state of paradoxes, it is a devastating assumption.
 
As long as we regard our rational system of counting in itself, like a
measurement instrument on the shelf of the laboratory, it is error-free,
tautological and exact. As soon as we try to use it to count and measure
the outside, we run into difficulties.
Dealing with these difficulties, one can have the following strategies:
* assuming that the outside does not exist at all: Arne's position,
rejected;
A: This is why you cannot understand my position: I have never
claimed that the ‘outside’ does not exist – my claim is that it is
illegitimate to “speak about” it, simply because we cannot put it into
words. “Whereof one cannot speak, thereof one must be silent” ::Wittgenstein.

* saying that we do not understand it: your position, rejected;
* checking the measurement instruments: my position, useful.
Our measurement instruments count solely in units that are
similar to each other.
A: Agree!
 
We disregard the logical
diversity of the impressions we process.
By Darwin's laws (A: here you are a realist again!), we are rewarded
(by increased chances of reproduction) if we recognise the similar in a
multitude which has properties of similarity and dissimilarity. We
perceive the similar against a background of dissimilarity.
A: Agree!
 
That our nervous system is built like this should not discourage us from
investigating the properties of the background, too. We are like moths
being attracted to the light (of similarity), and I am a moth which says:
dark can have differing degrees. Won't we count the degrees of dark? The
answer, usually, is: what, dark? Don't you feel the truth? It is light
that attracts us!
A: It's simply a choice – we can analyse the background as well – I
agree!
  
So, the dialogue does get a bit tedious.
Unfortunately, diversity is NOT exactly the opposite of similarity. One
can count in units of diversity. One can build a counting system based on
units of diversity. This D-based counting system neatly interacts with
the traditional, similarity-based counting system, generating lots of
what people call "natural constants" along the way.
A: I do not know about the D-based counting system (reference?), but I
follow your thoughts. Apart from the divergence mentioned below, I think
SOA could benefit from using such a system of quantification.
So, the idea of information is deeply understood to mean the average
difference (torsion, slack) between counting systems, where one counting
system is based on axiomatic similarity of units, and the other is based
on axiomatic dissimilarity of units. (This is like saying that the basis
of our spatial seeing is the distance between our eyes and that we have
two eyes.)
The realisation of information is best observed by assuming probabilistic
models of distribution of this bias.
A: And here you slip – not in interpreting the D-based counting
system – but in its application. You here fail to take the step necessary
to overthrow the realist’s belief system :: The fire you are talking about
– and the domain to which you apply your counting system – is your personal
EXPERIENCE, not some imagined reality. This is the only valid use we
can make of an IS-operator (Is-predicate).
So, please exclude me from your sweeping statement "we don't
understand what information is", and thank you for the opportunity
to add to your statement "we assume it to be a concep

[Fis] Laws of physics do NOT apply in biology

2006-10-27 Thread [EMAIL PROTECTED]
Hi Guy A Hoelzer,

the laws of Newton do not apply in biology. Or, have you ever seen a biological
body that remains at rest or maintains uniform linear motion?

Karl

