Re: Consciousness is information?
Hi Bruno,

On May 19, 7:37 pm, Bruno Marchal marc...@ulb.ac.be wrote:

> UDA is an argument showing that the current paradigmatic chain MATTER → CONSCIOUSNESS → NUMBER is reversed: with comp I can explain to you in detail (it is long) that the chain should be NUMBER → CONSCIOUSNESS → MATTER. Some agree already that it could be NUMBER → MATTER → CONSCIOUSNESS, and this indeed is more locally obvious, yet I claim that comp eventually forces the complete reversal.

Do you have any reference where this is developed? I try to stay as close to the facts as possible, and the most plausible explanation for me, through natural selection, is that consciousness is a processing device made by natural selection as an adaptation to the physical environment, the social environment included. So I support matter → consciousness. Dualism is the result of my subjective experience, and my subjective experience is the most objective fact that I can reach. I cannot support this Kantian notion of consciousness → matter.

The final word that I can say about the hard problem of consciousness is that any conversation with a robot, with the self-module that I described in the previous post, will give answers about qualia indistinguishable from the answers of any of you. He would indeed doubt whether you are in fact the robots and he the only conscious being on Earth, just as any of you may think. His self-module would not say "I perceive the green as green" because he has this as a standard answer, like a fake Turing-test program, but because he can zoom in on the details of every leaf, blade of grass, etc., and verify that the range of light frequencies is in the range of frequencies that a computer programmer assigned to green and that a trainer later told him to call green. He may even have his own philosophical theories about qualia, the self, etc. He may even ask himself about the origins of morality and self-determination, and all of this may even force him to believe in God.
So we must conclude that he has his own qualia and all the attributes of consciousness, in no less a degree than I could believe in yours.

You received this message because you are subscribed to the Google Groups "Everything List" group. To post to this group, send email to everything-list@googlegroups.com. To unsubscribe from this group, send email to everything-list+unsubscr...@googlegroups.com. For more options, visit this group at http://groups.google.com/group/everything-list?hl=en
Re: logic mailing list
On 20 May 2009, at 00:01, John Mikes wrote:

> As always, thanks, Bruno, for taking the time to educate this bum. Starting at the bottom:
>
> > To ask a logician the meaning of the signs, (...) is like asking the logician what is logic, and no two logicians can agree on the possible answer to that question.
>
> This is why I asked -- YOUR -- version.
>
> > Logic is also hard to explain to the layman, ...
>
> I had a didactically gifted boss (1951) who said 'if you understand something to a sufficient depth, you can explain it to any average educated person'. And here comes my counter-example to your A∧B parable: condition: I have $100 in my purse. 'A' means I take out $55 from my purse, and it is true. 'B' means I take out $65 from my purse, and this is also true. A∧B is untrue (unless we forget about the meaning of 'and'). In any language.

As I said, you are a beginner. And you confirm my theory that a beginner can be a great genius! You have just discovered here the field of linear logic. Unfortunately the discovery has already been made by Jean-Yves Girard, a French logician. Your money example is often used by Jean-Yves Girard himself to motivate linear logic. Actually my other motivation for explaining the combinators, besides exploiting the Curry-Howard isomorphism, was to have a very finely grained notion of deduction, so as to provide a simple introduction to linear logic. In linear logic the rules of deduction are such that the proposition A and the proposition A∧A are not equivalent. Intuitionistic logic can be regained by adding a modal operator, noted ! and read "of course A", where !A means A∧A∧A∧... Now, a presentation of a logic can be long and boring, and I will not do it now because it is a bit off topic. After all, I was trying to explain to Abram why we try to avoid logic as much as possible on this list. But yes, in classical logic you can use the rule which says that if you have proved A then you can deduce A∧A.
For example you can deduce, from 1+1=2, the proposition 1+1=2 ∧ 1+1=2. And indeed such rules are not among the rules of linear logic. Linear logic is a wonderful, quickly expanding field with many applications in computer science (for quasi-obvious reasons), but also in knot theory, category theory, etc. The fact that you invoke a counterexample shows that you have an idea of what (classical) logic is. But it is not a counterexample; you are just pointing to the fact that there are many different logics, and indeed there are many different logics. Now, just to reason about those logics, it is nice to choose one logic, and the most common one is classical logic. Logicians are scientists, and they always give the precise axioms and rules of the logic they are using or talking about. A difficulty comes from the fact that we can study a logic with that same logic, and this can easily introduce confusion of levels.

> > I think you are pointing the finger at the real difficulty of logic for beginners: to learn signs without meaning, then later on develop the rules to make a meaning.
>
> How else do I begin than as a beginner? My innate common sense refuses to learn anything without meaning. Rules or no rules. I am just that kind of a revolutionist.

I think everybody agrees, but in logic the notion of meaning is also studied, and so you have to abstract from the intuitive meaning to study the mathematical meaning. Again this needs training.

> > Finally, (to begin with) ... study of the laws of thought, although I would add probability theory to it ...
>
> ??? I discard probability as a count: a consideration inside a limited (cut) model. 'Count' -- also callable statistics -- is strictly limited to the given model-content of the counting, with a notion (developed in the same model) of what, or how many, the next similar items MAY be, for which there is no anticipation in the stated circumstances.
> To anticipate a probability one needs a lot of additional knowledge (and its critique), and it is still applicable only within the said limited model-content. Change the boundaries of the model, the content, and the statistics and probability will change as well. Even the causality circumstances (so elusive in my views).

I am afraid you are confirming my other theory, according to which a great genius can tell great stupidities (with all my respect of course, grin). Come on, John, there are enough real difficulties in what I try to convey; coming back to a critique of the notion of probability is a bit far-fetched. Einstein discovered the atoms with Brownian motion by using Boltzmann's classical physical statistics. I have heard that Boltzmann killed himself due to the incomprehension of his contemporaries in front of that fundamental idea (judged obvious today). But today there is no more conceptual problem with most uses of statistics (except when used by politicians!).
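John's purse example from earlier in the thread can be sketched in code as resource-sensitive deduction, the idea behind linear logic: a hypothesis is a resource, and using it consumes it, so having proved A you cannot also conclude A∧A. This is my own toy illustration (the class and names are invented for it, not from the posts):

```python
# A toy model of John's purse as a linear context: each hypothesis is a
# resource, and using it consumes it, so proving A does not also
# license proving "A and A".
from collections import Counter

class LinearContext:
    def __init__(self, resources):
        self.resources = Counter(resources)

    def use(self, r, n=1):
        """Consume n copies of resource r; fail if not enough remain."""
        if self.resources[r] < n:
            raise ValueError(f"not enough {r!r} left")
        self.resources[r] -= n

# Condition: $100 in the purse.
purse = LinearContext({"dollar": 100})
purse.use("dollar", 55)          # A: take out $55 -- succeeds
try:
    purse.use("dollar", 65)      # B: take out $65 -- only $45 remain
    a_and_b = True
except ValueError:
    a_and_b = False              # the linear conjunction of A and B fails
```

Each of A and B is derivable from the full purse on its own; it is only their conjunction that the resource accounting rejects, which is exactly the distinction classical logic's contraction rule erases.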
Re: Consciousness is information?
Hi Alberto,

On 20 May 2009, at 13:08, Alberto G. Corona wrote:

> On May 19, 7:37 pm, Bruno Marchal marc...@ulb.ac.be wrote:
> > UDA is an argument showing that the current paradigmatic chain MATTER → CONSCIOUSNESS → NUMBER is reversed: with comp I can explain to you in detail (it is long) that the chain should be NUMBER → CONSCIOUSNESS → MATTER. Some agree already that it could be NUMBER → MATTER → CONSCIOUSNESS, and this indeed is more locally obvious, yet I claim that comp eventually forces the complete reversal.
>
> Do you have any reference where this is developed?

I have often explained UDA on this list. There is an older version in 15 steps, and a more recent one in 8 steps. You could search in the archive of this list, or look at my SANE04 paper: http://iridia.ulb.ac.be/~marchal/publications/SANE2004MARCHALAbstract.html You can print the slides. I now often refer to UDA-i, with i from 0 to 8, which are the main steps of the reasoning. UDA is for Universal Dovetailer Argument. The UD provides a concrete base for a reasoning in line with the everything, or many-worlder, open-minded philosophy common on this list, especially for the relativist one (where probabilities are always conditional). UDA is provably available to Universal machines (in the theoretical computer science sense of Post, Turing, Kleene, Church, ...), which leads to a machine version of UDA: AUDA (the Arithmetical UDA). UDA is mainly an argument showing that, assuming comp, the mind-body problem reduces to the body problem. And AUDA shows a natural path to extract the solution of the body problem by an interview of the universal machine. Much older versions are in French (my PhD actually, and older papers). See my URL.

> I try to stay as close to the facts as possible, and the most plausible explanation for me, through natural selection, is that consciousness is a processing device made by natural selection as an adaptation to the physical environment, the social environment included.
This is plausible for most of the human and animal part of consciousness. It is a reasonable local description. But globally a dual version of this has the advantage of explaining how nature itself evolves, from a sort of competition and selection among pieces of machine dreams, which are easy to define in arithmetic (assuming comp ...). It is normal that comp depends on the many non-trivial results in computer science. A universal machine is itself a rather non-obvious notion.

> So I support matter → consciousness.

I could explain why it has to look locally that way, but it cannot work in the big picture, unless you make both matter and mind not just infinite but very highly infinite ... (just read UDA; I think I have made progress through those explanations on the list).

> Dualism is the result of my subjective experience,

I doubt this can be. I would say it is a result of your experience together with a bet (instinctive and/or rational) on an independent reality. You cannot experience the independent reality. You can experience only the dependent reality, but not as a dependent one; for this you need to bet on the independent one. What makes this difficult is that we make that bet instinctively since birth and beyond.

> and my subjective experience is the most objective fact that I can reach.

I see what you mean, but the subjective experience, although real and true and undoubtable, is subjective. It exists even though you cannot prove to another that it exists. To communicate you have to bet on tools and on others, and on many other doubtable (yet plausible) mind constructions.

> I cannot support this Kantian notion of consciousness → matter.

The problem is that if you are ready to attribute consciousness to a device, by virtue of its digitally simulating a conscious brain at some correct level of description, you will be forced to attribute that consciousness to an infinity of computations already defined by the additive and multiplicative structure of the numbers (by UDA).
A quasi-direct consequence is that if a machine looks at herself below her substitution level, she will build indirect evidence of a flux of many (a continuum of) computational histories (a typical quantum feature, I mean for QM without wave collapse). But comp forces the structure of those many realities (or dreams) to be determined by specifiable number-theoretical relations. Those relations are either extensional relations (like in number theory) or intensional relations (like in computer science, where numbers can also point toward other numbers, and effective sets of numbers). It makes computationalism testable. The general shape of QM confirms it, but cosmogenesis remains troubling ...

> The final word that I can say about the hard problem of consciousness is that any conversation with a robot, with the self-module that I described in the previous
Re: logic mailing list
Bruno,

I knew already about combinators, and the basic correspondence between arrow-types and material conditionals. If I recall, pairs correspond to ∧, right? I do not yet understand about adding quantifiers and negation. Still, I do not really see the usefulness of this. It is occasionally invoked to justify the motto "programs are proofs", but it doesn't seem like it does any such thing.

--Abram

On Tue, May 19, 2009 at 11:25 AM, Bruno Marchal marc...@ulb.ac.be wrote:

Hi Abram,

On 18 May 2009, at 21:53, Abram Demski wrote:

> Bruno, I know just a little about the Curry-Howard isomorphism... I looked into it somewhat, because I was thinking about the possibility of representing programs as proof methods (so that a single run of the program would correspond to a proof about the relationship between the input and the output). But it seems that the Curry-Howard relationship between programs and proofs is much different from what I was thinking of.

Let me give the shorter and simpler example. Do you know the combinators? I have explained some time ago on this list that you can code all programs in the SK combinator language. The alphabet of the language is (, ), K, S. Variables and the equality sign = are used at the metalevel and do not appear in the program language. The syntax is given by the (meta)definition:

K is a combinator
S is a combinator
if x and y are combinators then (x y) is a combinator.

The idea is that all combinators represent a function of one argument, from the set of all combinators into the set of all combinators, and (x y) represents the result of applying x to y. To increase readability the left parentheses are not written, so ab(cd)e is put for (((a b) (c d)) e). So examples of combinators are: K, S, KK, KS, SK, SS, KKK, K(KK), etc. Remember that KKK is ((KK)K).
The (meta)axioms (or the scheme of axioms, with x, y, z being any combinators) are

Kxy = x
Sxyz = xz(yz)

If you do not give the right number of arguments, the combinator gives back the starting expression (automated currying): so SK gives SK, for example. But KKK gives K, and SKSK gives KK(SK), which gives K. OK?

The inference rules of the system are simply the equality rules: from x = y you can deduce y = x, and from x = y and y = z you can deduce x = z, together with: from x = y you can deduce xz = yz, and from x = y you can deduce zx = zy. This gives already a very powerful theory, in which you can prove all Sigma_1 sentences (or equivalent). It defines a universal dovetailer, and adding some induction axioms gives a theory at least as powerful as Peano Arithmetic. See my Elsevier paper "Theoretical computer science and the natural sciences" for a bit more, or see http://www.mail-archive.com/everything-list@googlegroups.com/msg05920.html and around.

The Curry-Howard isomorphism arrives when you introduce types on the combinators. Imagine that x is of type a and y is of type b, so that a combinator which would transform x into y would be of type a → b. What is the type of K? (assuming again that x is of type a and y is of type b). You see that Kx applied to y gives x, so K takes an object of type a, (x), and transforms it into an object (Kx) which transforms y into x; so K takes an object of type a and transforms it into an object of type (b → a), so K is of type

a → (b → a)

And you recognize the well-known a fortiori axiom of classical (and intuitionistic) logic. If you proceed in the same way for S, you will see that S is of type

(a → (b → c)) → ((a → b) → (a → c))

And you recognize the arrow transitivity axiom, again a classical-logic tautology (and a well-accepted intuitionistic formula). So you see that typing combinators gives propositional formulas. But something else happens if you take a combinator, for example the simplest one, I, which computes the identity function, Ix = x.
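The two reduction rules Kxy = x and Sxyz = xz(yz), together with automated currying, can be turned into a tiny interpreter. This is my own sketch (representation and function names are invented for it): a term is "K", "S", an atom like "a", or a nested pair (f, x) standing for the application (f x).

```python
# Sketch of an SK-combinator reducer illustrating the two (meta)axioms.

def step(t):
    """Perform one leftmost reduction step; return None if t is in normal form."""
    if not isinstance(t, tuple):
        return None
    f, z = t
    # Rule K: (K x) z  ->  x
    if isinstance(f, tuple) and f[0] == "K":
        return f[1]
    # Rule S: ((S x) y) z  ->  (x z) (y z)
    if isinstance(f, tuple) and isinstance(f[0], tuple) and f[0][0] == "S":
        x, y = f[0][1], f[1]
        return ((x, z), (y, z))
    # No top-level redex: try to reduce inside the subterms.
    r = step(f)
    if r is not None:
        return (r, z)
    r = step(z)
    if r is not None:
        return (f, r)
    return None

def reduce_term(t):
    """Reduce until no redex remains (the examples here all terminate)."""
    while (r := step(t)) is not None:
        t = r
    return t

# The examples from the text: KKK gives K, and SKSK gives KK(SK), then K.
KKK = (("K", "K"), "K")
SKSK = ((("S", "K"), "S"), "K")
```

Note that a term with too few arguments, such as ("S", "K"), has no redex, so `reduce_term` returns it unchanged, which is the "automated currying" behaviour described above.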
It is not too difficult to program I with S and K: you will find SKK (SKKx = Kx(Kx) = x). Now the steps which lead to the program SKK, when typed, will give the (usual) proof of the tautology a → a from the a fortiori axiom and the transitivity axiom. The rules work very well for intuitionistic logic, associating types to logical formulas, and proofs to programs. The application rule of combinators corresponds to the modus ponens rule, and the deduction theorem corresponds to lambda abstraction. It leads thus to a non-trivial and unexpected isomorphism between programming and proving. For a long time this isomorphism was thought to apply only to intuitionistic logic, but today we know it extends to the whole of classical logic and to classical theories like set theory. A typical classical rule like Peirce's axiom, ((a → b) → a) → a, gives the try-catch error-handling procedure, and Krivine seems to think that Gödel's theorem leads to a decompiler (but this I do not yet understand!). This gives a constructive or partially constructive interpretation of logical formulas and theorems, and this is a rather amazing
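The correspondence just described can be made concrete in a short sketch of my own (not from the post): written as curried Python functions, K and S have exactly the arrow-types given above, function application plays the role of modus ponens, and S(K)(K) computes the identity, whose type a → a is the tautology the typed derivation proves.

```python
# Curry-Howard sketch: the types of K and S are the two axioms.

def K(x):
    # K : a -> (b -> a)        (the "a fortiori" axiom)
    return lambda y: x

def S(x):
    # S : (a -> (b -> c)) -> ((a -> b) -> (a -> c))   (arrow transitivity)
    return lambda y: lambda z: x(z)(y(z))

# SKK is the identity combinator I; building it by two applications of S
# to K mirrors proving a -> a by two uses of modus ponens on the axioms.
I = S(K)(K)
```

Running I on any value returns it unchanged, just as SKKx = Kx(Kx) = x in the reduction above.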
Re: logic mailing list
Bruno,

I cheerfully accept both of your notions about a genius. Everybody is one; just some boast about it, others are ashamed. I just accept. I feel that what you call classical logic is my 'common sense' (restricted to the ways the average person thinks). Linear logic (sorry, Jean-Yves Girard, I never heard your name) is not my beef: in my expanded totality view nothing can be linear. We 'think' in a reductionist way, in models, i.e. in limited topical cuts from the totality, because our mental capabilities disallow more; I think pretty linearly. I just try to attempt a wider way of consideration (I did not say: successfully). In such, the real 'everything' is present, in unlimited relations into/with all we think of, without us noticing or even knowing about it. (Some we don't even know about.) We just follow the given axioms (see below) of the in-model content and stay within those limits.

When Gerolamo Cardano screwed up the term *probability*, as the first one applying a scientific calculability to it in his De Ludo Aleae, he poisoned the minds with the concept of a mathematically applicable, homogeneous-distribution-based probability (later: *random*, the reason why the contemporaries of Boltzmann could not understand him, before Einstein). Alas, distributions are not homogeneous, and random does not exist in our deterministic (ordered) world (only ignorance about the 'how's). *Statistical*, as well, are the 'given' distributional counts within the chosen model-domain. *Math (applied)* was seeking the calculable, so it was restricted to the ordered disorder. If something is fundamentally impredicative (like the final value of pi), I am thinking of a 'fundamental' ignorance about the conditions of the description (cf. the 2-slit phenomenon).

*AXIOMS, however, are products of a reversed logic:* they are devised in order to make our theories applicable, and not vice versa.
My point: with a different logic, different axioms may be devised and our explanations of the world may be quite different. E.g. 2+2 is NOT 4. You may call it 'bad' logic; allowed. What I won't allow is *illogical*, unless you have checked ALL (possible and impossible) logical systems.

Reading your enlightening remarks (thank you) I see that I don't need those 'signs' to NOT understand: you did not apply them, and I did not understand your explanatory, lettered-and-numbered paragraph. (Why are 'idem per idem' *not* identical (as A = A∧A), when, naming 1+1=2 as A, the formula 1+1=2 ∧ 1+1=2 is deducible from 1+1=2? Of course I don't know the meaning of 'deducible'.) You also sneaked in the word 'modal' operator, for which I am too much of a beginner. That much said, I ask your patience concerning my ignorance in my questions/remarks on what I think I sort of understood. I may be 'on the other side'.

Best regards,
John

On Wed, May 20, 2009 at 10:43 AM, Bruno Marchal marc...@ulb.ac.be wrote:

On 20 May 2009, at 00:01, John Mikes wrote:

> As always, thanks, Bruno, for taking the time to educate this bum. Starting at the bottom:
>
> > To ask a logician the meaning of the signs, (...) is like asking the logician what is logic, and no two logicians can agree on the possible answer to that question.
>
> This is why I asked -- YOUR -- version.
>
> > Logic is also hard to explain to the layman, ...
>
> I had a didactically gifted boss (1951) who said 'if you understand something to a sufficient depth, you can explain it to any average educated person'. And here comes my counter-example to your A∧B parable: condition: I have $100 in my purse. 'A' means I take out $55 from my purse, and it is true. 'B' means I take out $65 from my purse, and this is also true. A∧B is untrue (unless we forget about the meaning of 'and'). In any language.

As I said, you are a beginner. And you confirm my theory that a beginner can be a great genius! You have just discovered here the field of linear logic.
Unfortunately the discovery has already been made by Jean-Yves Girard, a French logician. Your money example is often used by Jean-Yves Girard himself to motivate linear logic. Actually my other motivation for explaining the combinators, besides exploiting the Curry-Howard isomorphism, was to have a very finely grained notion of deduction, so as to provide a simple introduction to linear logic. In linear logic the rules of deduction are such that the proposition A and the proposition A∧A are not equivalent. Intuitionistic logic can be regained by adding a modal operator, noted ! and read "of course A", where !A means A∧A∧A∧... Now, a presentation of a logic can be long and boring, and I will not do it now because it is a bit off topic. After all, I was trying to explain to Abram why we try to avoid logic as much as possible on this list. But yes, in classical logic you can use the rule which says that if you have proved A then you can deduce A∧A. For example you can deduce, from 1+1=2, the proposition 1+1=2 ∧ 1+1=2. And indeed such rules are