Re: logic mailing list

2009-05-28 Thread Bruno Marchal

Abram,


 Bruno,

 I knew already about combinators, and the basic correspondence between
 arrow-types and material conditionals. If I recall, pairs correspond
 to &, right? I do not yet understand about adding quantifiers and
 negation.

 Still, I do not really see the usefulness of this. It is occasionally
 invoked to justify the motto "programs are proofs", but it doesn't
 seem like it does any such thing.


The interesting part, which is particularly amazing concerning
classical logic, is not the (quasi-trivial) fact that programs are
proofs, but that all proofs are programs. Every proof becomes a
program, even non-constructive proofs, when the CH isomorphism is
extended to classical logic. What those proofs are doing, as
programs, is still rather mysterious, and it is a subject in
development. But I am a bit out of this at the moment. I intend to
come back to it sometime.
Apparently they provide neat semantics for the GOTO instruction,
TRY/CATCH error handling, continuation passing, etc.
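
(An editorial sketch of that continuation reading, not from the
thread: in Haskell's continuation monad, the type of callCC is, modulo
the Cont r modality, exactly Peirce's law ((a -> b) -> a) -> a, and
using it gives the early-escape, try/catch-like behaviour mentioned
above. The function safeDiv is an invented example.)

  import Control.Monad.Cont

  -- callCC :: ((a -> Cont r b) -> Cont r a) -> Cont r a
  -- Read through Curry-Howard, this is Peirce's law with every
  -- proposition wrapped in the Cont r modality.

  -- Early escape, try/catch style: bail out with a default value.
  safeDiv :: Int -> Int -> Cont r Int
  safeDiv x y = callCC $ \escape ->
    if y == 0 then escape 0 else return (x `div` y)

  main :: IO ()
  main = print (runCont (safeDiv 10 0) id)   -- prints 0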

Bruno






Re: logic mailing list

2009-05-28 Thread Abram Demski

Bruno,

 What those proofs are doing, as programs, is still rather
 mysterious, and it is a subject in development.

This is what makes the use of such things hard to understand for me. :) Ah well.

I suppose the correspondence is useful in establishing the
computational complexity (and computability where that is in question)
of type checking?

--Abram


Re: logic mailing list

2009-05-20 Thread Bruno Marchal


On 20 May 2009, at 00:01, John Mikes wrote:

 As always, thanks, Bruno for taking the time to educate this bum.
 Starting at the bottom:
 To ask a logician the meaning of the signs, (...) is like asking
 the logician what is logic, and no two logicians can agree on the
 possible answer to that question.
 This is why I asked  --  YOUR  -- version.
 *
 Logic is also hard to explain to the layman,...
 I had a didactically gifted boss (1951) who said 'if you understand  
 something to a sufficient depth, you can explain it to any average
 educated person'.
 And here comes my
 counter-example to your A&B parable: condition: I have $100 in my
 purse.
 'A' means I take out $55 from my purse - and it is true.
 'B' means: I take out $65 from my purse - and this is also true.
 A&B is untrue (unless we forget about the meaning of & or "and"). In
 any language.


As I said, you are a beginner. And you confirm my theory that
beginners can be great geniuses! You have just discovered here the
field of linear logic. Unfortunately the discovery has already been
made by Jean-Yves Girard, a French logician. Your money example is
often used by Jean-Yves Girard himself to motivate linear logic.
Actually my other motivation for explaining the combinators, besides
exploiting the Curry Howard isomorphism, was to have a very finely
grained notion of deduction, so as to provide a simple introduction to
linear logic. In linear logic the rules of deduction are such that the
proposition A and the proposition A & A are not equivalent.
Intuitionistic logic can be regained by adding a modal operator, noted
"!" and read "of course", and !A means A & A & A & ...

Now, a presentation of a logic can be long and boring, and I will not
do it now because it is a bit off topic. After all, I was trying to
explain to Abram why we try to avoid logic as much as possible on this
list. But yes, in classical logic you can use the rule which says that
if you have proved A then you can deduce A & A. For example you can
deduce, from 1+1=2, the proposition 1+1=2 & 1+1=2. And indeed such
rules are not among the rules of linear logic. Linear logic is a
wonderful, quickly expanding field with many applications in computer
science (for quasi-obvious reasons), but also in knot theory, category
theory, etc.
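
(An editorial sketch of why "A" and "A & A" come apart for resources -
an illustration in Haskell with made-up names, using ordinary rather
than genuinely linear types: the purse is a value that each withdrawal
consumes and returns, so John's two withdrawals cannot be composed.)

  newtype Purse = Purse Int deriving Show

  -- A withdrawal consumes the purse and yields the new one, or fails.
  takeOut :: Int -> Purse -> Maybe Purse
  takeOut n (Purse m)
    | n <= m    = Just (Purse (m - n))
    | otherwise = Nothing

  -- takeOut 55 and takeOut 65 each succeed alone on (Purse 100),
  -- but threading one purse through both fails: contraction is lost.
  both :: Purse -> Maybe Purse
  both p = takeOut 55 p >>= takeOut 65   -- Nothing, for p = Purse 100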

The fact that you invoke a "counterexample" shows that you have an
idea of what (classical) logic is.

But it is not a counterexample; you are just pointing to the fact
that there are many different logics, and indeed there are many
different logics. Now, just to reason about those logics, it is nice
to choose one logic, and the most common one is classical logic.

Logicians are just scientists, and they always give the precise axioms
and rules of the logic they are using or talking about. A difficulty
comes from the fact that we can study a logic with that same logic, and
this can easily introduce confusion of levels.









 *
 I think you are pointing the finger on the real difficulty of logic  
 for beginners
 How else do I begin than a beginner? to learn signs without meaning,  
 then later on develop the rules to make a meaning? My innate common  
 sense refuses to learn anything without meaning. Rules, or not  
 rules. I am just that kind of a revolutionist.


I think everybody agrees, but in logic the notion of meaning is also
studied, and so you have to abstract from the intuitive meaning to
study the mathematical meaning. Again, this needs training.




 Finally, (to begin with)
 ...study of the laws of thought, although I would add probability  
 theory to it ...???
 I discard probability as a count - consideration  inside a limited  
 (cut) model, 'count'
 - also callable: statistics, strictly limited to the given model- 
 content of the counting -
 with a notion (developed in same model) what, or how many the next  
 similar items MAY be - for which there is no anticipation in the
 stated circumstances. To anticipate a probability one needs a lot of  
 additional knowledge (and its critique) and it is still applicable  
 only within the said limited model-content.
 Change the boundaries of the model, the content, the statistics and  
 probability will change as well. Even the causality circumstances  
 (so elusive in my views).


I am afraid you are confirming my other theory according to which
great geniuses can tell great stupidities (with all my respect of
course, grin).
Come on John, there are enough real difficulties in what I try to
convey that coming back on a critique of the notion of probability is
a bit far-fetched. Einstein discovered the atoms with the Brownian
motion by using Boltzmann's classical physical statistics. I have
heard that Boltzmann killed himself due to the incomprehension of his
contemporaries in front of that fundamental idea (judged obvious
today). But today there is no more conceptual problem with most uses
of statistics (except when used by politicians!).

Re: logic mailing list

2009-05-20 Thread Abram Demski

Bruno,

I knew already about combinators, and the basic correspondence between
arrow-types and material conditionals. If I recall, pairs correspond
to &, right? I do not yet understand about adding quantifiers and
negation.

Still, I do not really see the usefulness of this. It is occasionally
invoked to justify the motto "programs are proofs", but it doesn't
seem like it does any such thing.

--Abram


Re: logic mailing list

2009-05-20 Thread John Mikes
Bruno, I cheerfully accept both of your notions about a genius. Everybody
is one, just some boast about it, others are ashamed. I just accept. I feel
what you call classical logic is my 'common sense' (restricted to the ways
the average person thinks). Linear logic (sorry, Jean-Yves Girard, never
heard your name) is not my beef: in my expanded totality view nothing can be
linear. We 'think' in a reductionist way - in models, i.e. in limited
topical cuts from the totality, because our mental capabilities disallow more
- I think pretty linearly.
 I just try to attempt a wider way of consideration (I did not say:
successfully). In such the real 'everything' is present, in unlimited
relations into/with all we think of - without us noticing or even knowing
about it. (Some we don't even know about).
We just follow the given axioms (see below) of the in-model content and stay
limitedly.

When Gerolamo Cardano screwed up the term 'probability' - as the first one
applying a scientific calculability, in his De Ludo Aleae - he poisoned the
minds by the concept of a - mathematically applicable - homogeneous
distribution-based probability (later: random,
the reason why the contemporaries of Boltzmann could not understand him
- before Einstein). Alas, distributions are not homogeneous, and random does
not exist in our deterministic (ordered) world (only ignorance about the
'how'-s).
Statistical as well are the 'given' distributional counts within the
chosen model-domain.
Math (applied) was seeking the calculable, so it was restricted to the
ordered disorder.
If something is fundamentally impredicative (like the final value of pi) I
am thinking of a 'fundamental' ignorance about the conditions of the
description (cf: 2-slit phenomenon).

AXIOMS, however, are products of a reversed logic:
they are devised in order to make our theories applicable, and not vice
versa.
My point:
with a different logic, different axioms may be devised, and our explanations
of the world may be quite different. E.g. 2+2 is NOT 4. You may call it
'bad' logic. Allowed. What I won't allow is 'illogical' - unless you checked
ALL (possible and impossible) logical systems.

Reading your enlightening remarks (thank you) I see that I don't need those
'signs' to NOT understand: you did not apply them and I did not understand
your explanatory - lettered and numbered - paragraph. (Why are 'idem per
idem' not identical (as A = A & A), when, naming 1+1=2 as A, from 1+1=2 the
format 1+1=2 & 1+1=2 is deducible? Of course I don't know the meaning of
'deducible'.) You also sneaked in the word 'modal' operator, for which I am
too much of a beginner.

That much said: I ask your patience concerning my ignorance in my
questions/remarks on what I think I sort of understood.  I may be 'on the
other side'.

Best regards

John






Re: logic mailing list

2009-05-19 Thread John Mikes
Abram:
Maybe you started at an earlier age and at the 'beginning' (school?).
I used my own common-sense logic with my 2 doctorates in a 1/2 century of
successful R&D activity in the natural sciences and THEN tried to barge into
scientific logics in medias res. My mistake.
Now - another 1/4 c. later - I don't feel like starting to change my ways of
thinking - anew.
Please, count me out.

John




Re: logic mailing list

2009-05-19 Thread Bruno Marchal
Hi Abram,


On 18 May 2009, at 21:53, Abram Demski wrote:


 Bruno,

 I know just a little about the Curry-Howard isomorphism... I looked
 into it somewhat, because I was thinking about the possibility of
 representing programs as proof methods (so that a single run of the
 program would correspond to a proof about the relationship between the
 input and the output). But, it seems that the Curry-Howard
 relationship between programs and proofs is much different from what I
 was thinking of.


Let me give a short and simple example. Do you know the
combinators? I have explained some time ago on this list that you can
code all programs in the SK combinator language. The alphabet of the
language is "(", ")", "K", "S". Variables and the equality sign "="
are used at the metalevel and do not appear in the program language.

The syntax is given by the (meta)definition:

K is a combinator
S is a combinator
if x and y are combinators then (x y) is a combinator.

The idea is that each combinator represents a function of one argument,
from the set of all combinators to the set of all combinators, and (x
y) represents the result of applying x to y. To increase readability
the left parentheses are not written, so ab(cd)e is put for (((a b)
(c d)) e).

So examples of combinators are: K, S, KK, KS, SK, SS, KKK, K(KK),
etc. Remember that KKK is ((KK)K).

The (meta)axioms (or axiom schemes, with x, y and z being any
combinators) are

Kxy = x
Sxyz = xz(yz)

If you do not give the right number of arguments, the combinator gives
back the starting expression (automatic currying): so SK gives SK, for
example. But KKK gives K, and SKSK gives KK(SK), which gives K. OK?
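
(An editorial sketch of these two rules as a program, not part of the
thread; the Term type and the function names are invented.)

  data Term = S | K | App Term Term deriving (Eq, Show)

  -- One reduction step, when a redex is present: Kxy = x and
  -- Sxyz = xz(yz); under-applied S and K just stay put (currying).
  step :: Term -> Maybe Term
  step (App (App K x) _)         = Just x
  step (App (App (App S x) y) z) = Just (App (App x z) (App y z))
  step (App f a) = case step f of
    Just f' -> Just (App f' a)
    Nothing -> App f <$> step a
  step _ = Nothing

  -- Reduce until no rule applies (may loop on terms with no normal form).
  normalize :: Term -> Term
  normalize t = maybe t normalize (step t)

  -- SKSK reduces to KK(SK) and then to K, as above:
  -- normalize (App (App (App S K) S) K) == K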

The inference rules of the system are simply the equality rules: from
x = y you can deduce y = x; from x = y and y = z you can deduce x = z;
together with: from x = y you can deduce xz = yz, and from x = y you
can deduce zx = zy.

This already gives a very powerful theory, in which you can prove all
Sigma_1 sentences (or equivalent). It defines a universal dovetailer,
and adding some induction axioms gives a theory at least as powerful
as Peano Arithmetic.
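
(A sketch of the dovetailing idea itself - again an editorial
illustration: interleave the steps of infinitely many computations so
that step j of computation i is eventually reached for every i and j,
even when no computation ever finishes.)

  -- Sweep the finite anti-diagonals of an infinite table of runs.
  -- (Assumes infinite inner lists; halting runs would need padding.)
  dovetail :: [[a]] -> [a]
  dovetail runs = [ runs !! i !! (n - i) | n <- [0 ..], i <- [0 .. n] ]

  -- take 6 (dovetail [[(i, j) | j <- [0 ..]] | i <- [0 ..]])
  --   == [(0,0),(0,1),(1,0),(0,2),(1,1),(2,0)]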

See my Elsevier paper "Theoretical computer science and the natural
sciences" for a bit more. Or see
http://www.mail-archive.com/everything-list@googlegroups.com/msg05920.html
and around.

The Curry Howard isomorphism arrives when you introduce types on the
combinators. Imagine that x is of type a and y is of type b, so that
a combinator which would transform x into y would be of type a -> b.

What is the type of K? (Assuming again that x is of type a and y is of
type b.) You see that Kx on y gives x, so K takes an object of type a,
(x), and transforms it into an object (Kx) which transforms y into x;
so K takes an object of type a and transforms it into an object of
type (b -> a), so K is of type

a -> (b -> a)

And you recognize the well-known "a fortiori" axiom of classical (and
intuitionistic) logic. If you proceed in the same way for S, you will
see that S is of type

(a -> (b -> c)) -> ((a -> b) -> (a -> c))

And you recognize the arrow transitivity axiom, again a classical
logic tautology (and a well-accepted intuitionistic formula). So you
see that typing combinators gives propositional formulas. But something
else happens: take a combinator, for example the simplest one, I,
which computes the identity function: Ix = x. It is not too difficult
to program I with S and K: you will find SKK (SKKx = Kx(Kx) = x).
Now the step which leads to the program SKK, when typed, will give the
(usual) proof of the tautology a -> a from the "a fortiori" axiom and
the transitivity axiom. The rules work very well for intuitionistic
logic, associating types to logical formulas and proofs to programs.
The application rule of combinators corresponds to the modus ponens
rule, and the deduction theorem corresponds to lambda abstraction. It
leads thus to a non-trivial and unexpected isomorphism between
programming and proving.
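
(The correspondence can be checked mechanically in any typed
functional language - a minimal editorial sketch with invented names;
GHC infers exactly these formulas as the types.)

  -- The typed combinators are the two axioms:
  k :: a -> (b -> a)                               -- "a fortiori"
  k x _ = x

  s :: (a -> (b -> c)) -> ((a -> b) -> (a -> c))   -- transitivity
  s x y z = x z (y z)

  -- The program SKK is a proof of the tautology a -> a:
  i :: a -> a
  i = s k k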
For a long time this isomorphism was thought to apply only to
intuitionistic logic, but today we know it extends to the whole of
classical logic and to classical theories like set theory. A typical
classical rule like Peirce's axiom, ((a -> b) -> a) -> a, gives a
try/catch error-handling procedure, and Krivine seems to think that
Gödel's theorem leads to a decompiler (but this I do not yet
understand!). This gives a constructive or partially constructive
interpretation of logical formulas and theorems, and this is a rather
amazing fact.





 In the end, I don't really see any *use* for the
 Curry-Howard isomorphism!

For a long time I thought it could be used only in program
specification and verification analysis in conventional programming,
but the fact that the CH isomorphism extends to classical logic forces
me to revise my opinion. Krivine seems to believe it transforms the
field of logic into a study of the brain as a decompiler or
disassembler of ...

Re: logic mailing list

2009-05-19 Thread Bruno Marchal

Hi John,


On 18 May 2009, at 21:00, John Mikes wrote:

 Bruno:

 could you tell in one sentence YOUR identification for logic?

That is a difficult question, and to be honest, I am still searching.  
As a platonist (that is: classical logician) I am OK with the idea  
that logic is the abstract (domain independent) study of the laws of  
thought, although I would add probability theory to it (like Boole).



 (I can read the dictionaries, Wiki, etc.)
 I always say: common sense, but what I am referring to is
-- --  M Y -- -- common sense,
 distorted - OK, interpreted - according to my genetic build, my
 experience (sum of memories), instinctive/emotional traits and all
 the rest about what we have no idea of today yet.

I can agree with this, although the aim is to suppress the distortion
as far as possible.



 I never studied 'formal' logic, because I wanted to start on my own  
 (online mostly) and ALL started using signs not even reproducible on  
 keyboards

That is why Knuth invented TeX :)



 and not explained what they are standing for. As I guessed: the  
 'professors' issued notes at the beginning of the college-courses  
 (($$s?)) and THERE the students could learn the 'vocabulary' of  
 those signs.
 You also use some of them.

I think you are pointing the finger at the real difficulty of logic
for beginners. You are supposed not to attribute meaning to those
signs, because what the logician is searching for are rules of
reasoning which do not depend on the meaning. The hardness of logic is
in understanding what you have to NOT understand to proceed. Logicians
take the signs as just that: signs, without meaning. Then they develop
rules of transformation of those signs, in such a way that machines
can play with them, and mathematical rules of meaning, and they are
happy when they succeed in finding nice correspondences between rules
and meaning. It makes the subject both very concrete and abstract at
the same time. I used to think that logic is the most difficult
branch of math. Somehow, computer science makes it easier. It
motivates the point of not trying to put meaning where none is
supposed to be.





 I was looking at a dozen books as well and did not find those signs
 explained, not in footnotes, not in appendices, not as intro- or
 post- chapters. They were just applied from page 1.
 So I gave up.

I can understand. It is hard to study logic alone. Yet there are good
books, but it takes some effort to understand where you have to take
those signs literally. I would suggest reading the little
Penguin book by Hodges, "Logic", which is perhaps clear for an
introduction.
Logic is also hard to explain to the layman, because it concerns
objects which look like formal things, and it takes time to understand
that we study those objects without interpreting them. Beginners take
time to understand the use of saying that, for example, we will say
that "A & B" is true when A is true and B is true. They can believe
that they learn nothing here, but they are wrong, because the "&" is
formal, and the "and" is informal. After all, you do learn something
if I say that "A et B" is true when A is true and B is true. In this
case you learn French, which is at the same level of informality, but
in logic you attach rules and meaning to explicitly formal things
which you reason *about*.
To ask a logician the meaning of the signs is a bit like asking a
biologist the meaning of ATTAGTTCAATCCCT, or of "DNA". It is like
asking the logician what logic is, and no two logicians can agree on
the possible answer to that question. When a student asks such a
question here, it is not rare that the answer he gets is just "we are
not doing philosophy here". The object of study is far more concrete
than beginners can imagine, and that is why the notion of machine and
computer science can help a lot for many parts of logic.

Bruno


http://iridia.ulb.ac.be/~marchal/







Re: logic mailing list

2009-05-19 Thread John Mikes
As always, thanks, Bruno for taking the time to educate this bum.
Starting at the bottom:
"To ask a logician the meaning of the signs, (...) is like asking
the logician what is logic, and no two logicians can agree on the
possible answer to that question."
This is why I asked  --  YOUR  -- version.
*
"Logic is also hard to explain to the layman,..."
I had a didactically gifted boss (1951) who said 'if you understand
something to a sufficient depth, you can explain it to any average educated
person'.
And here comes my
counter-example to your A&B parable: condition: I have $100 in my purse.
'A' means I take out $55 from my purse - and it is true.
'B' means: I take out $65 from my purse - and this is also true.
A&B is untrue (unless we forget about the meaning of & or "and"). In any
language.
*
"I think you are pointing the finger at the real difficulty of logic for
beginners"
How else do I begin than as a beginner? To learn signs without meaning, then
later on develop the rules to make a meaning? My innate common sense refuses
to learn anything without meaning. Rules, or not rules. I am just that kind
of a revolutionist.
*
Finally, (to begin with)
"...study of the laws of thought, although I would add probability theory
to it ..."???
I discard probability as a count - consideration inside a limited (cut)
model, 'count'
- also callable: statistics, strictly limited to the given model-content of
the counting -
with a notion (developed in same model) what, or how many the next similar
items MAY be - for which there is no anticipation in the stated
circumstances. To anticipate a probability one needs a lot of additional
knowledge (and its critique) and it is still applicable only within the said
limited model-content.
Change the boundaries of the model, the content, the statistics and
probability will change as well. Even the causality circumstances (so
elusive in my views).

Regards
John



Re: logic mailing list

2009-05-18 Thread Bruno Marchal

Hi Abram,


On 24 Apr 2009, at 18:55, Abram Demski wrote:


 I'm starting a mailing list for logic, and I figured some people from
 here might be interested.

 http://groups.google.com/group/one-logic

Interesting! Thanks for the link. But logic is full of mathematical  
mermaids and I am personally more problem-driven. I may post some day
an argument for logical pluralism (even a classical logical argument  
for logical pluralism!), though. Ah! but you can easily guess the  
nature of the argument ...




 I've looked around for a high-quality group that discusses these
 things, but I haven't really found one. The logic-oriented mailing
 lists I've seen are either closed to the public (being only for
 professional logicians, or only for a specific university), or
 abandoned, filled with spam, et cetera.



But it is a very large domain, and a highly technical subject. It is
not taught in all the universities. It is not a well-known subject.
Unlike quantum mechanics and theoretical computer science, the
difficulty is in grasping what the subject is about.
It takes time to understand the difference between formal implication
and deduction. I have problems explaining the difference between
computation and description of computation ...




 So, I figured, why not try to
 start my own?


Why not?  Actually I have many questions in logic, but all are  
technical and long to explain. Some have been solved by Eric, who then  
raised new interesting questions.

Have you heard about the Curry Howard isomorphism? I have sent posts
on this list on the combinators, and one of the reasons for that is
that combinators can be used for explaining that CH correspondence,
which relates logic and computer science in an amazing way.

Do you know Jean-Louis Krivine? A French logician who tries to extend
the CH (Curry Howard) isomorphism to classical logic and set theory. I
am not entirely convinced by the details, but I suspect something quite
fundamental and important for the future of computer science and logic.
You can take a look; some of his papers are in English.
http://www.pps.jussieu.fr/~krivine/
Jean-Louis Krivine also wrote my favorite book on set theory.
The CH correspondence of the (classical) Peirce law has a comp look!

Don't hesitate to send us links to anything relating computer science
and logic (like the Curry-Howard isomorphism), because, although I
doubt it can be used easily in our framework, in a direct way, it
could have some impact in the future. Category theory is a very nice
subject too, but it is a bit technically demanding at the start. Yet it
makes it possible to link knot theory, quantum computation, number
theory, gravity, ...
Not yet consciousness, though. Intensional-free mathematics still
resists ...




 In fact, I originally joined this list hoping for a logic-oriented
 mailing list. I haven't been entirely disappointed there,

You are kind!


 but at the
 same time that isn't what this list is really intended for.

Logic is a very interesting field. Too bad it is not so well known to
the general public. The everything list is more theory-of-everything
oriented. Logic has a big role to play (assuming comp), but physics,
cognitive science and even theology can hardly be avoided in a truly
unifying quest ... And we try to be as untechnical as possible, which
is for me very hard, ... oscillating between UDA and AUDA.

Best,

Bruno



http://iridia.ulb.ac.be/~marchal/







Re: logic mailing list

2009-05-18 Thread John Mikes
Bruno:

could you tell in one sentence YOUR identification for logic?
(I can read the dictionaries, Wiki, etc.)
I always say: common sense, but what I am referring to is
   -- --  M Y -- -- common sense,
distorted - OK, interpreted - according to my genetic build, my experience
(sum of memories), instinctive/emotional traits and all the rest about what
we have no idea of today yet.

I never studied 'formal' logic, because I wanted to start on my own (online
mostly) and ALL started using signs not even reproducible on keyboards, and
without explaining what they stand for. As I guessed: the 'professors'
issued notes at the beginning of the college-courses (($$s?)) and THERE
the students could learn the 'vocabulary' of those signs.
You also use some of them.

I was looking at a dozen books as well and did not find those signs
explained, not in footnotes, not in appendices, not as intro- or post-
chapters. They were just applied from page 1.
So I gave up.

John M


Re: logic mailing list

2009-05-18 Thread Abram Demski

 I was looking at a dozen books as well and did not find those signs
 explained, not in footnotes, not in appendices, not as intro- or post-
 chapters. They were just applied from page 1.
 So I gave up.

That's funny. I never had that experience. There *are* a great many
signs to learn, but somehow I read all the books in the right order, so
that I knew the simpler signs that the more complex signs were being
explained with. :)

--Abram


Re: logic mailing list

2009-05-18 Thread Abram Demski

Bruno,

I know just a little about the Curry-Howard isomorphism... I looked
into it somewhat, because I was thinking about the possibility of
representing programs as proof methods (so that a single run of the
program would correspond to a proof about the relationship between the
input and the output). But, it seems that the Curry-Howard
relationship between programs and proofs is much different from what I
was thinking of. In the end, I don't really see any *use* for the
Curry-Howard isomorphism! Sure, the correspondence is interesting, but
what can we do with it? Perhaps you can answer this.

--Abram


-- 
Abram Demski
http://dragonlogic-ai.blogspot.com/
