Stathis:
you see, we go on with newer assumptions.
You ask:
"...why did we not evolve to be zombie animals?"
Some of us did.

I believe the other animals are very much of the same build as this one
(Homo), with a rather quantitative difference (which, of course, turns into
a qualitative one - V.I. Lenin) due to the interactive connectivity of
different orders of magnitude of neurons.  "Zombie" is one of those
discussion-promoters I pointed to as 'assumed nonsense'.
Is digital vs. biological computing so different? Maybe the digital is
simpler. I think you are implying that the digital is the programmed,
input-restricted, embryonic level of our PC etc., while the biological
refers to unlimited connectivity to totality, including (beyond-model)
input in toto.
I find such a comparison unfair. (My opinion.)
Every animal has 'experience' and 'memory' (whatever these terms mean)
according to the level of its functional complexity. Even a hydra learns.
"WE" cannot explain, with all our sophistication, how simpler organisms
work, including the allegedly puzzling 'collective consciousness' of
social insects.

John M



----- Original Message -----
From: "Stathis Papaioannou" <[EMAIL PROTECTED]>
To: <everything-list@googlegroups.com>
Sent: Saturday, October 07, 2006 7:50 PM
Subject: RE: Maudlin's Demon (Argument)



John Mikes writes:

> Stathis, your post is 'logical', 'professional', 'smart', - good.
> It shows why we have so many posts on this list and why we get nowhere.
> You handle an assumption (robot) - its qualia, characteristics, make up a
> "thought-situation" and ASK about its annexed details. Now, your style is
> such that one cannot just disregard the irrelevance. So someone (many, me
> included <G>) responds with similar mindtwists and it goes on and on.
> Have you ever ps-analyzed a robot? Professionally, I mean.
> If it is a simple digital computer, it certainly has a memory, the one
> fixed into chips as this PC I am using. Your and MY memory is quite
> different, I wish somebody could tell me acceptably, HOW???, but it is
> plastic, approximate, mixed with emotional changes, short and in cases
> false. I would throw out a robot with such memory.

I did put in parentheses "this of course assumes a robot can have
experiences". We can't know that this is so, but it seems a reasonable
assumption to me. If we had evolution with digital processors rather than
biological processors, do you think it would have been possible for animals
with similar behaviours to those with which we are familiar to have
developed? If so, do you think these animals would not really have
"experiences" despite behaving as if they did? Since evolution can only
work on behaviour, if zombie animals were possible why did we not evolve
to be zombie animals?

Stathis Papaioannou

> John,
>
> I should have been more precise with the terms "copy" and "emulate".
> What I was asking is whether a robot which experiences something while
> it is shovelling coal (this of course assumes that a robot can have
> experiences) would experience the same thing if it were fed input to all
> its sensors exactly the same as if it were doing its job normally, such
> that it was not aware the inputs were in fact a sham. It seems to me
> that if the answer is "no" the robot would need to have some mysterious
> extra-computational knowledge of the world, which I find very difficult
> to conceptualise if we are talking about a standard digital computer.
> It is easier to conceptualise that such non-computational effects may be
> at play in a biological brain, which would then be an argument against
> computationalism.
>
> Stathis Papaioannou
>
> > Stathis:
> > let me skip the quoted texts and ask a particular question.
> > ----- Original Message -----
> > From: "Stathis Papaioannou" <[EMAIL PROTECTED]>
> > Sent: Wednesday, October 04, 2006 11:41 PM
> > Subject: RE: Maudlin's Demon (Argument)
> > You wrote:
> > Do you believe it is possible to copy a particular consciousness by
> > emulating it, along with sham inputs (i.e. in virtual reality), on a
> > general purpose computer? Or do you believe a coal-shovelling robot
> > could only have the coal-shovelling experience by actually shovelling
> > coal?
> >
> > Stathis Papaioannou
> > ---------------------------------
> > My question is about 'copy' and 'emulate'.
> >
> > Are we considering 'copying' the model and its content (in which case
> > the coal-shovelling robot's last sentence applies) or do we include
> > the interconnections unlimited in "experience", beyond the particular
> > model we talk about?
> > If we go "all the way" and include all input from the unlimited
> > totality that may 'format' or 'complete' the model-experience, then we
> > re-create the 'real thing' and it is not a copy. If we restrict our
> > copying to the aspect in question (model) then we copy only that
> > aspect and should not draw conclusions on the total.
> >
> > Can we 'emulate' totality? I don't think so. Can we copy the total,
> > unlimited wholeness? I don't think so.
> > What I feel is a restriction to "think" within a model and draw
> > conclusions from it towards beyond it.
> > Which looks to me like a category-mistake.
> >
> > John Mikes


--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at 
http://groups.google.com/group/everything-list
-~----------~----~----~----~------~----~------~--~---