RE: Belief Statements
On 28 Jan 2005 Hal Finney wrote:

> Here's how I look at the question of whether a bit string, if accidentally implemented as part of another program, would be conscious. . . . I would approach this from the Schmidhuber perspective that all programs exist and run, in a Platonic sense, and this creates all computable universes. Some programs create universes like ours, which have conscious entities. Other programs create random universes, which may, through sheer outlandish luck, instantiate patterns which match those of conscious entities. All consciousnesses exist in this model, and as Bruno emphasizes, from the inside there is no way to know which program instantiated you. In fact this may not even be a meaningful question. But what it is meaningful to ask, in the Schmidhuber sense, are two things. First, what is the measure of your consciousness: how likely are you to exist? And second, among all of the instantiations of your consciousness in all the universes, how much of your measure does each one contribute?

All well so far.

> I suggest that the answer is that accidental instantiations only contribute an infinitesimal amount, compared to the contributions of universes like ours. Our universe appears to have extremely simple physical laws and initial conditions. Yet it formed complex matter and chemistry which allowed life to evolve and consciousness to develop. Maybe we got some lucky breaks; the universe doesn't seem particularly fecund as far as we can tell, but conscious life did happen. The odds against it were not, as in the case of accidental instantiation, an exponential of an astronomical number. This means that the contribution to a consciousness from a lawful universe like the one we observe is almost infinitely greater than the contribution from accidental instantiations.

I don't understand this conclusion.
A lengthy piece of code (whether it represents a moment of consciousness or anything else) is certainly less likely to be accidentally implemented on some random computer than on the computer running the original software. But surely the opposite is the case if you allow that all possible computer programs run simply by virtue of their existence as mathematical objects. For every program running on a biological or electronic computer, there must be infinitely many exact analogues and every minor and major variation thereof running out there in Platonia.

--Stathis Papaioannou
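[Hal Finney's phrase "an exponential of an astronomical number" can be made concrete under the Schmidhuber-style weighting he invokes, where a program of length L bits carries weight 2^(-L). A minimal sketch in Python; the particular sizes (a ~1000-bit lawful-universe program versus ~10^9 bits to hard-code an observer-moment directly) are illustrative assumptions of mine, not figures from the posts:]

```python
from math import log10

def log2_measure(length_bits):
    """Log2 of the weight 2**(-length_bits) a program gets under a
    Schmidhuber/Solomonoff-style length-based prior."""
    return -length_bits

# Illustrative, assumed sizes (not taken from the original posts):
lawful = 1_000        # a short program encoding simple physical laws
accidental = 10**9    # bits needed to specify one conscious pattern outright

# How many doublings the lawful program's weight exceeds the accidental one's:
ratio_log2 = log2_measure(lawful) - log2_measure(accidental)
print(f"lawful program outweighs accidental one by a factor of 2**{ratio_log2}")
print(f"roughly 10**{int(ratio_log2 * log10(2))}")
```

[On these assumed numbers the ratio is 2^(10^9 - 1000) — the "almost infinitely greater" contribution Finney describes.]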
RE: Belief Statements
I recently posted that I seem to have two theories about how my multiverse might work. These are:

1) Nothing - Something = to completion.

2) {Nothing#(n) + All[(n-1) = evolving Somethings]} - {Nothing#(n+1) + All[n = evolving Somethings]} : repeat...

Here:

- is a spontaneous decay of a Nothing into a Something, because of the inherent logical incompleteness of the Nothing.
= is a random path.
= is a path where each new step is inconsistent with prior steps.

In (1), choice within the Something is a necessary component of the =. In (2), choice is precluded, to avoid accumulation of net information.

My issue is that it seems one would like to base an explanation of how worlds evolve on the presence of choice. However, since the [Nothing, All] is a definitional pair, how does one justify selecting (1) over (2)? In my opinion choice demands a non-quantized time - that is, a continuous flow in a = - and yet there must be steps in a =.

Hal Ruhl
RE: Belief Statements
On 28 Jan 2005 Hal Finney wrote:

> I suggest that the answer is that accidental instantiations only contribute an infinitesimal amount, compared to the contributions of universes like ours.

Stathis Papaioannou replied:

> I don't understand this conclusion. A lengthy piece of code (whether it represents a moment of consciousness or anything else) is certainly less likely to be accidentally implemented on some random computer than on the computer running the original software. But surely the opposite is the case if you allow that all possible computer programs run simply by virtue of their existence as mathematical objects. For every program running on a biological or electronic computer, there must be infinitely many exact analogues and every minor and major variation thereof running out there in Platonia.

I'm afraid I don't understand your argument here. I am using the Schmidhuber concept that the measure of a program is related to its size and/or information complexity: that shorter (and simpler) programs have greater measure than longer ones. Do you agree with that, or are you challenging that view?

My point was then that we can imagine a short program that can naturally evolve consciousness, whereas to create consciousness artificially or arbitrarily, without a course of natural evolution, requires a huge number of bits to specify the conscious entity in its entirety.

You mention infinity; are you saying that there is no meaningful difference between the measure of programs, because each one has an infinite number of analogs? Could you explain that concept in more detail?

Hal Finney
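[One way to see why infinitely many analogues need not equalize measures, under the 2^(-length) weighting Finney describes: even granting one exact analogue of a program at every longer length, the weights form a geometric series whose sum is only twice the original program's weight. A minimal sketch; the weighting scheme is the Schmidhuber-style one under discussion, and the lengths are hypothetical:]

```python
from fractions import Fraction

def total_measure(base_len, extra_variants):
    """Total weight of a program of length base_len plus one padded
    analogue at each of the lengths base_len+1 .. base_len+extra_variants,
    under the 2**(-length) weighting."""
    return sum(Fraction(1, 2**(base_len + k)) for k in range(extra_variants + 1))

# The totals, in units of the original 2**(-base_len) weight, approach 2
# and never exceed it -- infinitely many analogues, finite extra measure:
for n in (10, 100, 1000):
    print(n, float(total_measure(20, n) * 2**20))
```

[So a short program plus all its padded analogues still carries exponentially more total weight than a long program plus all of its analogues — the infinity of variants does not wash out the length difference.]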
RE: Belief Statements
I meant to define the symbol = as: = is a path over kernels where each new step is inconsistent with prior steps. Hal Ruhl
Re: Belief Statements
Dear Hal,

What you're defining seems to me to be a NOT map, or else a mere random map. There is no consistent definition of an inconsistent map otherwise, IMHO. Please explain how I am wrong. ;-)

Why not a map that is a path where the information associated with each step is consistent to some degree /delta with the information available about the prior steps?

Stephen

----- Original Message ----- From: Hal Ruhl To: everything-list@eskimo.com Sent: Saturday, January 29, 2005 3:43 PM Subject: RE: Belief Statements
Re: Belief Statements
At 06:29 PM 1/29/2005, you wrote:

> Dear Hal, What you're defining seems to me to be a NOT map or else it is a mere random map. There is no consistent definition of an inconsistent map otherwise, IMHO. Please explain how I am wrong. ;-)

I wanted to have a sequence that does not accumulate net information or have a rule that is itself net information. A random sequence has to be checked to see if its pattern fits some test for randomness. A path wherein each step is inconsistent with the past sequence seems to meet the requirements I desired.

> Why not a map that is a path where the information associated with each step is consistent to some degree /delta with the information available about the prior steps?

In my opinion any such rule is net information.

Hal Ruhl
Re: Belief Statements
Dear Hal,

What do you propose as a means to explain the memory and processing required to be sure of inconsistency, as opposed to consistency? Both options, it seems to me, require checking of some kind! All that is left is randomness, and there is no such thing as a true test for randomness that is finitely implementable! If we accept that option then we have to explain the apparent continuity that occurs in the 1st person aspect of the path.

Stephen

----- Original Message ----- From: Hal Ruhl To: everything-list@eskimo.com Sent: Saturday, January 29, 2005 7:17 PM Subject: Re: Belief Statements
Re: Belief Statements
Hi Stephen:

At 10:49 PM 1/29/2005, you wrote:

> Dear Hal, What do you propose as a means to explain the memory and processing required to be sure of inconsistency as opposed to consistency?

It is not a logical inconsistency. What I am trying to convey is that each step in the sequence pays no attention to the prior sequence. That is a maximal inconsistency of progression within the sequence. "Random" and "independent" to me convey a testable behavior, and I want to point to an untestable progression.

> Both options, it seems to me, require checking of some kind! All that is left is randomness, there is no such a thing as a true test for randomness that is finitely implementable!

The embedding system component - the All - is already infinite, so an infinite test is containable therein.

> If we accept that option then we have to explain the apparent continuity that occurs in the 1st person aspect of the path.

Such a path will link arbitrarily long strings of kernels that give the appearance of 1st person continuity, and this appearance can hold even if many other kinds of kernels intervene - the 1st person could not detect this.

Hal Ruhl
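[Stephen's objection can be illustrated concretely. Reading "inconsistent with prior steps" as "never repeating a prior step" — my own concrete stand-in for the posts' looser notion, with a hypothetical finite kernel set — enforcing that relation requires remembering and checking the whole history, exactly as enforcing consistency would; only a step that ignores history entirely needs no memory. A minimal sketch:]

```python
import random

def independent_path(steps, kernels, seed=0):
    """Each step pays no attention to the prior sequence: no memory needed."""
    rng = random.Random(seed)
    return [rng.choice(kernels) for _ in range(steps)]

def inconsistent_path(steps, kernels, seed=0):
    """Each step must differ from every prior step: this requires
    remembering and checking the whole history -- Stephen's objection."""
    rng = random.Random(seed)
    path, seen = [], set()
    for _ in range(steps):
        candidates = [k for k in kernels if k not in seen]  # the checking step
        if not candidates:
            break  # a finite kernel set is exhausted
        k = rng.choice(candidates)
        path.append(k)
        seen.add(k)
    return path

kernels = list(range(8))   # hypothetical finite set of kernels
print(independent_path(12, kernels))   # may repeat; consults no history
print(inconsistent_path(12, kernels))  # never repeats; halts once exhausted
```

[The first function matches Hal's "pays no attention to the prior sequence"; the second shows that any step-by-step guarantee about the relation to prior steps, whether of consistency or of inconsistency, carries the memory and processing cost Stephen raises.]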