----- Original Message -----
From: "Frank" <[EMAIL PROTECTED]>
> A few comments on your post.
> If I interpret correctly, you are basically distinguishing "dualistic"
> interpretations from "materialistic" ones.
> When we talk of a materialistic viewpoint, what *are* we talking about?
> Is it our vague conception that everything is made of atoms that constitutes
> a materialistic view of the universe?
> As we all know, not even the deepest theoretical physicists know what the
> hell they are talking about, in a fundamental sense, when they talk about
> matter, energy, quarks, gravity, etc. They only describe the results of
> measurements, and build abstract mental models that somehow accommodate or
> "shadow" those results.
> Where does this leave dualism? If the "material" world is just a mental
> construct of man, created to accommodate our sensorial input, there is
> suddenly no more "dual" in "dualism", only experience, or whatever we want
> to call our sensorial life.
> So, one can hypothesize that there is no need to define a mysterious
> "material basis" for what is just sensorial experience.
Sounds like Schopenhauer...
Even though I like that view to some extent, I am sure that the
great majority of professional physicists do work with the idea
that matter - the stuff that is measured and detected in their
experiments - is all that 'is', and that although we have no idea how
to describe this phenomenon of consciousness, it must
in the end be a byproduct of a material organ - the brain.
But before this discussion turns to "what do physicists think", let
me stress that what I want to argue is that a materialistic view is
inconsistent with QTI - never mind who actually defends a
materialistic position. Further, if we do not assume a materialistic
framework, there is still less reason to suppose QTI.
> If windows 98 were considered conscious AI, would a version
> of windows 98
> running on two different computers be one entity or two?
I would say they would be two entities. One is not aware of the
other's "sensorial inputs", even if they would act the same given the
same inputs. Awareness is what matters here.
> Why would the universe create two souls, when one will suffice?
I didn't understand the question... You are considering the factual
existence of a soul, and associating it with the AI running win98?
If that is the case, I think I made the point that the AIs are two
entities; therefore each should have a soul of its own.
> If we consider ourselves to be just a sequence of states in a mathematical
> universe (a fairly modest hypothesis), the condition for having a sense of
> identity from one state to the next is not necessarily that the states
> pertain to the same "material" substrate (which may not even exist), but
> that the two states be related by some continuity, or memory.
> After all, my personal "viewpoint" always prefers to stick with me instead
> of switching back and forth with my dog, since his states are not a
> continuation of mine, memory-wise.
> If this interpretation is correct, it can be argued that we'll never "be"
> the null state of death, because death is not a state which will remember
> any previous "me".
> So, it follows that if there exists a plausible state or configuration that
> is a valid continuation, memory-wise, of my current state, then my
> "viewpoint" will prefer this path over the "death" state.
> ergo, immortality!
> PS: definition of "viewpoint": an artificial construct to help visualize the
> succession of states that constitutes my identity.
Let me put forth some arguments:
I have seen a movie (I can't remember the name, but if my
memory isn't wrong, I think the governor of California was in it).
In this movie they made perfect clones of people and
copied their memories, so that they could perfectly reproduce one's
mental state, at the instant of the copying, in the clone.
When the clone was "awakened", he/she felt just like that person who
died. For an individual who was seeing that clone, and even to the
clone himself, there was no way to know if that was or was not a
clone (except for a detail in the ear, or something, that was put to
make a certain story possible, but never mind) so if continuity of
memory is all that matters, you would say that the clone WAS the
original person. But in the movie a clone of the governor is made
BEFORE he is dead. So both have the true belief that they are the
'real' one. Even though it could sound fun to say that they both are,
it certainly does not make sense, since they are not aware of each
other's experiences, no matter how similar they look. If one dies,
he should find no comfort in knowing that his clone will survive.
Let us run that slowly to make sure: if the clone is made before one
dies, then one will not experience the clone's life. For all practical
purposes, he is NOT the clone. If the clone is made at the exact
moment of his death (though simultaneity is a hard thing to define),
there should still be no reason to suppose that he will be aware of
the clone's mental states. Not even if the clone is made after his
death. So I think we can safely say that *we are not our clones*.
Now let us remember that what I am trying to find is the practical
answer to the question: "will we survive a severe car accident due
to QTI?" Any answer to that should tell us the probability that
' I ' experience a posterior state. Not that the multiverse contains
some entity which looks like me and will experience it. Just like I
am not interested in knowing if there will be a clone of me with my
perfect memory reinstalled. I won't be aware of what it sees!
So in the multiverse clones are created all the time. At every moment
there are multitudes of clones of me doing different things. But I am
not aware of any of them. So, in the definition that ' I ' is what I
experience, when we talk about the multiverse, it is still true that
*I am not my clones*. I am not happy with the fact that many clones
are winning the lottery right now!
So the same with death. If there is nothing external to the material brain,
then death is just another state of that organ. If we die, it simply means
that that organ will have no more sensorial or mental experiences. No
reason for us to jump to the closest clone in order to have some
experience. If we are no more than the matter that the brain is composed
of, we should not leave that brain. Even though the brain is decomposing,
it is still there. It is just not aware of anything anymore.
To say that we MUST have an experience is a very strong
assumption, one which the materialist has no grounds to justify. It is no
surprise to be unconscious. People are unconscious when they are
anesthetized, as someone here pointed out. Death is then just