Re: [Fis] Reply to Ted Goranson: levels of description

2006-06-11 Thread Ted Goranson

John Collier wrote on 6/10/06:

> At 05:35 PM 6/10/2006, Stanley N. Salthe wrote:
>
> > John said:
> >
> > > Hmm. You should read Barwise and Seligman, Information Flow...
>
> ...
> It depends on what you mean by logic. The issue is too complicated to
> get into here and now, but the simple answer is that there is no
> non-arbitrary distinction between mathematics and logic. Exactly
> the same reasons apply to the limits of both, and the only way to
> get one more powerful than the other is to apply a double standard
> for proofs and/or acceptability.


Thank you, John.  Good insight.

I sponsored a workshop on a topic close to this, during which Barwise
said much the same thing. It seems to me that mathematics and logic
are siblings, perhaps conjoined. I suppose there are other siblings,
not so human-friendly, used by natural objects. Information seems to
be the light by which we might see them through their shadows.


I'm not surprised that most physicists want to ontologically flatten
everything into a QM-described truth. What does surprise me is that
no one has mentioned the inconvenient fact that gravity, the most
pervasive force in physics, is notably unfriendly to QM.


Best, Ted
--
Ted Goranson
Sirius-Beta


Re: [Fis] Reply to Ted Goranson: levels of description

2006-06-10 Thread John Collier

At 05:35 PM 6/10/2006, Stanley N. Salthe wrote:


John said:

> Hmm. You should read Barwise and Seligman, Information Flow: The Logic of
>Distributed Systems. Very important for understanding quantum information.
>Also, I assume that you are familiar with algorithmic complexity theory,
>which is certainly rigorous, and with the Minimum Description Length
>(Rissanen) and Minimum Message Length (Wallace and Dowe) methods, which
>apply Kolmogorov's and Chaitin's ideas very rigorously. If you don't like
>the computational approaches for some reason, then you might want to look
>at Ingarden et al. (1997), Information Dynamics and Open Systems
>(Dordrecht: Kluwer). They show how probability can be derived from Boolean
>structures, which are based on the fundamental notion of information
>theory, that of making a binary distinction. So probability is based on
>information theory, not the other way around (there are other ways to show
>this, but I take the Ingarden et al. approach as conclusive -- Chaitin and
>Kolmogorov and various commentators have observed the same thing). If you
>think about the standard foundations of probability theory, whether
>Bayesian subjective approaches or various objective approaches (frequency
>approaches fail for a number of reasons -- so they are out, but they could
>be a counterexample to what I say next), then you will see that making
>distinctions and/or the idea of information present but not accessible
>are the grounds for probability theory. Information theory is the
>logically more fundamental notion: it is more general, and it includes
>probability theory as a special case. Information can be defined directly
>in terms of distinctions alone; probability cannot. We need to construct a
>measure to do that.

 So, I ask a follow-up question: Would the greater generality of
information theory with respect to probability theory imply anything
about the even more general question of whether logic is more general
than mathematics?  Some seem to think that Goedel showed the opposite.


It depends on what you mean by logic. The issue is too complicated to
get into here and now, but the simple answer is that there is no 
non-arbitrary distinction between mathematics and logic. Exactly the 
same reasons apply to the limits of both, and the only way to get one 
more powerful than the other is to apply a double standard for proofs 
and/or acceptability. So the answer, briefly, is that Goedel showed 
no such thing, either way you take it, if you do not apply a double 
standard for evidence.
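
For reference, a compact statement of what Goedel actually proved (in the
Goedel-Rosser form of the first incompleteness theorem). It constrains any
consistent, recursively axiomatized system strong enough to encode
arithmetic, whether one files that system under "logic" or under
"mathematics" -- which is why it cannot rank one above the other:

```latex
% Goedel-Rosser form of the first incompleteness theorem.
% Q denotes Robinson arithmetic, a weak fragment of number theory.
\text{If } T \supseteq \mathsf{Q} \text{ is consistent and recursively
axiomatizable, then there is a sentence } G_T \text{ such that }
T \nvdash G_T \quad\text{and}\quad T \nvdash \lnot G_T .
```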


John


--
Professor John Collier [EMAIL PROTECTED]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.nu.ac.za/undphil/collier/index.html  




Re: [Fis] Reply to Ted Goranson: levels of description

2006-06-10 Thread Stanley N. Salthe
John said:

> Hmm. You should read Barwise and Seligman, Information Flow: The Logic of
>Distributed Systems. Very important for understanding quantum information.
>Also, I assume that you are familiar with algorithmic complexity theory,
>which is certainly rigorous, and with the Minimum Description Length
>(Rissanen) and Minimum Message Length (Wallace and Dowe) methods, which
>apply Kolmogorov's and Chaitin's ideas very rigorously. If you don't like
>the computational approaches for some reason, then you might want to look
>at Ingarden et al. (1997), Information Dynamics and Open Systems
>(Dordrecht: Kluwer). They show how probability can be derived from Boolean
>structures, which are based on the fundamental notion of information
>theory, that of making a binary distinction. So probability is based on
>information theory, not the other way around (there are other ways to show
>this, but I take the Ingarden et al. approach as conclusive -- Chaitin and
>Kolmogorov and various commentators have observed the same thing). If you
>think about the standard foundations of probability theory, whether
>Bayesian subjective approaches or various objective approaches (frequency
>approaches fail for a number of reasons -- so they are out, but they could
>be a counterexample to what I say next), then you will see that making
>distinctions and/or the idea of information present but not accessible
>are the grounds for probability theory. Information theory is the
>logically more fundamental notion: it is more general, and it includes
>probability theory as a special case. Information can be defined directly
>in terms of distinctions alone; probability cannot. We need to construct a
>measure to do that.

 So, I ask a follow-up question: Would the greater generality of
information theory with respect to probability theory imply anything
about the even more general question of whether logic is more general
than mathematics?  Some seem to think that Goedel showed the opposite.

STAN





Re: [Fis] Reply to Ted Goranson: levels of description

2006-06-10 Thread James Johnson




With respect to definitions of information (Shannon, von Neumann,
Kolmogorov, etc.), there is the completely opposite approach of Michael
Leyton. He defines information as causal explanation. This is very
powerful because it is driven by a meaning-making system, i.e., a
cognitive system.

With respect to quantitative issues, his work uses his group-theoretic
methods, based on levels of wreath-product sequences. The wreath products
come from structural characterizations of intelligent causal explanation.
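
To make the construction concrete, here is a minimal sketch (the choice of
Z2 and S3 is purely illustrative, and none of this is Leyton's own code or
notation): a wreath product Z2 wr S3, in which a "control" group S3
permutes three copies of a "fiber" group Z2.

```python
from itertools import product, permutations

# Minimal sketch of the wreath product Z2 wr S3 (illustrative only):
# the control group S3 permutes three copies of the fiber group Z2.
# An element is a pair (a, sigma): a tuple a of three Z2 components
# plus a permutation sigma of the three fiber slots.

Z2 = (0, 1)                           # fiber group: addition mod 2
S3 = list(permutations(range(3)))     # control group: all permutations

def act(sigma, b):
    # sigma moves fiber slots: (sigma . b)[i] = b[sigma^{-1}(i)]
    return tuple(b[sigma.index(i)] for i in range(3))

def mul(x, y):
    # (a; sigma) * (b; tau) = (a + sigma.b ; sigma o tau)
    (a, sigma), (b, tau) = x, y
    c = tuple((ai + bi) % 2 for ai, bi in zip(a, act(sigma, b)))
    comp = tuple(sigma[tau[i]] for i in range(3))
    return (c, comp)

elements = [(a, s) for a in product(Z2, repeat=3) for s in S3]
print(len(elements))                  # 48 = |Z2|**3 * |S3|

# Closure: the product of any two elements is again an element.
table = set(elements)
assert all(mul(x, y) in table for x in elements for y in elements)
```

Only the shape matters for the present point: a group of parts (the fibers)
wrapped by a group that re-organizes them (the control), the unit that, on
Leyton's account, gets stacked into levels.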
 
best,
Jim Johnson



Re: [Fis] Reply to Ted Goranson: levels of description

2006-06-10 Thread John Collier


At 08:20 AM 6/7/2006, Andrei Khrennikov wrote:
My comment:
Yes, "... not to worry too deeply about the nature of information."
This is the crucial point. But as far as I know there are only two ways
to define information rigorously: classical Shannon information and
quantum von Neumann information. In fact, all my discussion was about
the possibility (if it is possible at all) of reducing the second one
to the first one.

I understand that very often people speak about information in some
heuristic sense, but we are not able to proceed rigorously with a
mathematical definition of information. And I know only definitions
which are based on different kinds of entropy, and hence on
probability.
Hmm. You should read Barwise and Seligman, Information Flow: The Logic of
Distributed Systems. Very important for understanding quantum information.
Also, I assume that you are familiar with algorithmic complexity theory,
which is certainly rigorous, and with the Minimum Description Length
(Rissanen) and Minimum Message Length (Wallace and Dowe) methods, which
apply Kolmogorov's and Chaitin's ideas very rigorously. If you don't like
the computational approaches for some reason, then you might want to look
at Ingarden et al. (1997), Information Dynamics and Open Systems
(Dordrecht: Kluwer). They show how probability can be derived from Boolean
structures, which are based on the fundamental notion of information
theory, that of making a binary distinction. So probability is based on
information theory, not the other way around (there are other ways to show
this, but I take the Ingarden et al. approach as conclusive -- Chaitin and
Kolmogorov and various commentators have observed the same thing). If you
think about the standard foundations of probability theory, whether
Bayesian subjective approaches or various objective approaches (frequency
approaches fail for a number of reasons -- so they are out, but they could
be a counterexample to what I say next), then you will see that making
distinctions and/or the idea of information present but not accessible
are the grounds for probability theory. Information theory is the
logically more fundamental notion: it is more general, and it includes
probability theory as a special case. Information can be defined directly
in terms of distinctions alone; probability cannot. We need to construct a
measure to do that.
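
To make the asymmetry concrete, a minimal sketch in Python (the function
names are hypothetical, not from Ingarden et al.): the information carried
by a set of mutually distinguishable states can be counted from the
distinctions alone, whereas any entropy beyond the uniform case needs a
measure constructed on top of those same distinctions.

```python
import math

# A "distinction structure": enough binary questions to separate every
# pair of states. For n mutually distinguishable states this already
# yields a quantity of information -- log2(n) bits (Hartley) -- with no
# probabilities in sight.
def information_from_distinctions(states):
    return math.log2(len(states))

# Probability talk, by contrast, needs an added measure over the same
# states; the distinctions alone do not supply one.
def shannon_entropy(measure):
    return -sum(p * math.log2(p) for p in measure.values() if p > 0)

states = ["a", "b", "c", "d"]
print(information_from_distinctions(states))  # 2.0 bits, distinctions alone

uniform = {s: 1 / len(states) for s in states}
biased = {"a": 0.7, "b": 0.1, "c": 0.1, "d": 0.1}
print(shannon_entropy(uniform))   # 2.0 -- matches the distinction count
print(shannon_entropy(biased))    # ~1.357 -- requires the extra measure
```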
John





--
Professor John Collier [EMAIL PROTECTED]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292   F: +27 (31) 260 3031
http://www.nu.ac.za/undphil/collier/index.html



[Fis] Reply to Ted Goranson: levels of description

2006-06-07 Thread Andrei Khrennikov
  Dear colleagues,
This is a part of my discussion with Ted Goranson. In a previous email
to the FIS list Ted Goranson wrote:
>>  >>  Any number of such ontological layers are
>>  >>  possible and I suppose as system scale increases
>>  >>  (physical, chemical, biological and so on...) new
>>  >>  ones are added, possibly with constant semantic
>>  >>  distance.
>>  >>  The point here is as stated at the beginning,
>>  >>  that ontological precedence is key in
>>  >>  unwrapping how QM and information inform each
>>  >>  other, if I can use such a reflexive notion.

My reply to him:
>>  >In the orthodox Copenhagen interpretation, the main problem is that
>>  >it is strongly forbidden to consider ontological levels. There is only
>>  >one level -- the level of observations. If you want to go beyond this
>>  >layer, you go by definition beyond science.
>>  >Andrei

His reply to me:
>>  No, my friend, I go beyond Copenhagen, for certain. But modern
>>  thought on the nature of modeling (including theoretical models)
>>  separates out representational issues, perhaps in layers, from
>>  natural behavior. Science is about understanding, at least as I see
>>  it. My letter was one which addresses the understanding of
>>  understanding where QM seems inadequate and FIS interests (at least
>>  as the group was originally defined) are centered.

My comment:
>Here I agree: QM with the Copenhagen interpretation is really the "end of
>the road" of physics (see Karl Popper, Quantum Theory and the Schism in
>Physics).

His reply to me (continuation):
>>  But the online discussion as it is developing seems not to worry too
>>  deeply about the nature of information, so perhaps I leave the
>>  letter as a marker for a future discussion.

My comment:
Yes, "... not to worry too deeply about the nature of information."
This is the crucial point. But as far as I know there are only two ways
to define information rigorously: classical Shannon information and
quantum von Neumann information. In fact, all my discussion was about
the possibility (if it is possible at all) of reducing the second one
to the first one.

I understand that very often people speak about information in some
heuristic sense, but we are not able to proceed rigorously with a
mathematical definition of information. And I know only definitions
which are based on different kinds of entropy, and hence on probability.
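
A small numerical illustration of the two definitions (a sketch assuming
numpy; the example states are illustrative, not from any referenced text):
for a density matrix diagonal in some basis, the von Neumann entropy
reduces to the Shannon entropy of its eigenvalues, while for a pure
superposition the two notions come apart, which is where the reduction
question bites.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a classical distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), via the eigenvalues of rho."""
    return shannon_entropy(np.linalg.eigvalsh(rho))

# Classical (diagonal) state: the two entropies coincide.
p = [0.7, 0.3]
print(shannon_entropy(p))                  # ~0.881
print(von_neumann_entropy(np.diag(p)))     # ~0.881, the same

# Pure superposition |+><+|: measuring in the computational basis gives
# 1 bit of Shannon entropy, yet the von Neumann entropy of the state
# itself is 0 -- the gap the reduction question turns on.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
print(shannon_entropy(np.diag(plus)))      # 1.0
print(von_neumann_entropy(plus))           # 0.0
```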

Andrei

___
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis