Peter Jones writes:
> Stathis Papaioannou wrote:
> > Pete Carlton writes:
> >
> > > On Dec 8, 2006, at 7:48 AM, Bruno Marchal wrote:
> > >
> > > > Then I am not sure if this is really related with Quentin Anciaux's
> > > > idea that he feels located in his head.
> > > The idea that we are in our head ... is in our head!
Stathis Papaioannou wrote:
> Pete Carlton writes:
>
> > On Dec 8, 2006, at 7:48 AM, Bruno Marchal wrote:
> >
> > > Then I am not sure if this is really related with Quentin Anciaux's
> > > idea that he feels located in his head.
> > > The idea that we are in our head ... is in our head!
> > >
> >
Pete Carlton writes:
> On Dec 8, 2006, at 7:48 AM, Bruno Marchal wrote:
>
> > Then I am not sure if this is really related with Quentin Anciaux's
> > idea that he feels located in his head.
> > The idea that we are in our head ... is in our head!
> >
>
> Another way of saying that, is that we
On Dec 8, 2006, at 7:48 AM, Bruno Marchal wrote:
> This is indeed an excellent text (it is also in the book "Mind's I").
> Definitive? I doubt it. Dennett miss there the first person
> indeterminacy, although he get close ...
>
You're right of course, I should have used a different adjective,
Le 08-déc.-06, à 02:33, Pete Carlton a écrit :
>
> A definitive treatment of this problem is Daniel Dennett's story
> "Where am I?"
> http://www.newbanner.com/SecHumSCM/WhereAmI.html
This is indeed an excellent text (it is also in the book "Mind's I").
Definitive? I doubt it. Dennett miss there the first person indeterminacy, although he get close ...
A definitive treatment of this problem is Daniel Dennett's story
"Where am I?"
http://www.newbanner.com/SecHumSCM/WhereAmI.html
On Dec 6, 2006, at 4:06 PM, Brent Meeker wrote:
>
> Quentin Anciaux wrote:
>> Le Mercredi 6 Décembre 2006 19:35, Brent Meeker a écrit :
>>> Quentin Anciaux wrote:
>>>
Stathis Papaioannou wrote:
>
> Brent Meeker writes:
>
>>> You're implying that the default assumption should be that
>>> consciousness correlates more closely with external behaviour
>>> than with internal activity generating the behaviour: the tape
>>> recorder should reason that as the CD pla
Brent Meeker writes:
> > You're implying that the default assumption should be that
> > consciousness correlates more closely with external behaviour than
> > with internal activity generating the behaviour: the tape recorder
> > should reason that as the CD player produces the same audio output
Quentin Anciaux wrote:
> Le Mercredi 6 Décembre 2006 19:35, Brent Meeker a écrit :
>> Quentin Anciaux wrote:
>> ...
>>
>>> Another thing that puzzles me is that consciousness should be generated
>>> by physical (and chemicals which is also "physical") activities of the
>>> brain, yet I feel my con
On Wed, Dec 06, 2006 at 11:38:32PM +0100, Quentin Anciaux wrote:
>
> Le Mercredi 6 Décembre 2006 19:35, Brent Meeker a écrit :
> > Quentin Anciaux wrote:
> > ...
> >
> > > Another thing that puzzles me is that consciousness should be generated
> > > by physical (and chemicals which is also "physi
Le Mercredi 6 Décembre 2006 19:35, Brent Meeker a écrit :
> Quentin Anciaux wrote:
> ...
>
> > Another thing that puzzles me is that consciousness should be generated
> > by physical (and chemicals which is also "physical") activities of the
> > brain, yet I feel my consciousness (in fact me) is l
Hi Quentin,
>
> Hi Stathis,
>
> Le Mercredi 6 Décembre 2006 10:23, Stathis Papaioannou a écrit :
> > Brent meeker writes:
> > > Stathis Papaioannou wrote:
> > > "Fair" is a vague term. That they are the same would be my default
> > > assumption, absent any other information. Of course knowing
Stathis Papaioannou wrote:
>
> Brent meeker writes:
>
>> Stathis Papaioannou wrote:
>>> Brent Meeker writes:
>>>
> I assume that there is some copy of me possible which
> preserves my 1st person experience. After all, physical
> copying literally occurs in the course of normal life
Quentin Anciaux wrote:
...
> Another thing that puzzles me is that consciousness should be generated by
> physical (and chemicals which is also "physical") activities of the brain,
> yet I feel my consciousness (in fact me) is located in the upper front of my
> skull... Why then neurons located
Hi Stathis,
Le Mercredi 6 Décembre 2006 10:23, Stathis Papaioannou a écrit :
> Brent meeker writes:
> > Stathis Papaioannou wrote:
> > "Fair" is a vague term. That they are the same would be my default
> > assumption, absent any other information. Of course knowing that one is
> > analog and th
Brent meeker writes:
> Stathis Papaioannou wrote:
> >
> > Brent Meeker writes:
> >
> >>> I assume that there is some copy of me possible which preserves
> >>> my 1st person experience. After all, physical copying literally
> >>> occurs in the course of normal life and I still feel myself to be
Stathis Papaioannou wrote:
>
> Brent Meeker writes:
>
>>> I assume that there is some copy of me possible which preserves
>>> my 1st person experience. After all, physical copying literally
>>> occurs in the course of normal life and I still feel myself to be
>>> the same person. But suppose I a
Brent Meeker writes:
> > I assume that there is some copy of me possible which preserves my
> > 1st person experience. After all, physical copying literally occurs
> > in the course of normal life and I still feel myself to be the same
> > person. But suppose I am offered some artificial means o
Stathis Papaioannou wrote:
...
> I assume that there is some copy of me possible which preserves my
> 1st person experience. After all, physical copying literally occurs
> in the course of normal life and I still feel myself to be the same
> person. But suppose I am offered some artificial means
Bruno Marchal writes:
> >> Well, in the case comp will be refuted (for example by predicting that
> >> electrons weigh one ton, or by predicting non eliminable white
> >> rabbits)
> >> , then everyone will be able to guess that those people were
> >> committing
> >> suicide. The problem is tha
Le 05-déc.-06, à 00:31, Stathis Papaioannou a écrit :
>> Well, in the case comp will be refuted (for example by predicting that
>> electrons weigh one ton, or by predicting non eliminable white
>> rabbits)
>> , then everyone will be able to guess that those people were
>> committing
>> suicid
Bruno Marchal writes:
> Le 02-déc.-06, à 06:11, Stathis Papaioannou a écrit :
>
> > In addition to spectrum reversal type situations, where no change is
> > noted from
> > either 3rd or 1st person perspective (and therefore it doesn't really
> > matter to anyone:
> > as you say, it may be occ
Le 02-déc.-06, à 06:11, Stathis Papaioannou a écrit :
> In addition to spectrum reversal type situations, where no change is
> noted from
> either 3rd or 1st person perspective (and therefore it doesn't really
> matter to anyone:
> as you say, it may be occurring all the time anyway and we wou
Le 01-déc.-06, à 20:05, Brent Meeker a écrit :
>
> Bruno Marchal wrote:
>>
>> Le 01-déc.-06, à 10:24, Stathis Papaioannou a écrit :
>>
>>>
>>> Bruno Marchal writes:
>>>
> We can assume that the structural difference makes a difference to
> consciousness but
> not external
Stathis Papaioannou wrote:
>
> Brent meeker writes:
>
>>> I don't doubt that there is some substitution level that preserves 3rd
>>> person
>>> behaviour and 1st person experience, even if this turns out to mean copying
>>> a person to the same engineering tolerances as nature has specified
Brent meeker writes:
> > I don't doubt that there is some substitution level that preserves 3rd
> > person
> > behaviour and 1st person experience, even if this turns out to mean copying
> > a person to the same engineering tolerances as nature has specified for
> > ordinary
> > day to day
ns of people might agree
to have these
replacement brains and no-one will ever know that they are committing suicide.
Stathis Papaioannou
> From: [EMAIL PROTECTED]
> Subject: Re: UDA revisited and then some
> Date: Fri, 1 Dec 2006 12:27:37 +0100
Bruno Marchal wrote:
>
> Le 01-déc.-06, à 10:24, Stathis Papaioannou a écrit :
>
>>
>> Bruno Marchal writes:
>>
>>>
>>>
We can assume that the structural difference makes a difference to
consciousness but
not external behaviour. For example, it may cause spectrum reversal.
>>>
>>
Stathis Papaioannou wrote:
>
> Bruno Marchal writes:
>
>>
>>
>>> We can assume that the structural difference makes a difference to
>>> consciousness but
>>> not external behaviour. For example, it may cause spectrum reversal.
>>
>> Let us suppose you are right. This would mean that there is
Le 01-déc.-06, à 10:24, Stathis Papaioannou a écrit :
>
>
> Bruno Marchal writes:
>
>>
>>
>>> We can assume that the structural difference makes a difference to
>>> consciousness but
>>> not external behaviour. For example, it may cause spectrum reversal.
>>
>>
>> Let us suppose you are right.
Bruno Marchal writes:
>
>
> > We can assume that the structural difference makes a difference to
> > consciousness but
> > not external behaviour. For example, it may cause spectrum reversal.
>
>
> Let us suppose you are right. This would mean that there is
> substitution level such that t
Le 29-nov.-06, à 06:33, Stathis Papaioannou wrote:
> We can assume that the structural difference makes a difference to
> consciousness but
> not external behaviour. For example, it may cause spectrum reversal.
Let us suppose you are right. This would mean that there is
substitution level
Stathis Papaioannou wrote:
>
> David Nyman writes:
>
>> You're right - it's muddled, but as you imply there is the glimmer of
>> an idea trying to break through. What I'm saying is that the
>> 'functional' - i.e. 3-person description - not only of the PZ, but of
>> *anything* - fails to capture
David Nyman writes:
> You're right - it's muddled, but as you imply there is the glimmer of
> an idea trying to break through. What I'm saying is that the
> 'functional' - i.e. 3-person description - not only of the PZ, but of
> *anything* - fails to capture the information necessary for PC. Now
Stathis Papaioannou wrote:
>
> Colin Hales writes:
>
>>> I think it is logically possible to have functional equivalence but
>>> structural
>>> difference with consequently difference in conscious state even though
>>> external behaviour is the same.
>>>
>>> Stathis Papaioannou
>> Remember Dave
Quentin Anciaux writes:
> Le Mardi 28 Novembre 2006 00:00, Stathis Papaioannou a écrit :
> > Quentin Anciaux writes:
> > > But the point is to assume this "nonsense" to take a "conclusion", to see
> > > where it leads. Why imagine a "possible" zombie which is functionally
> > > identical if the
Colin Hales writes:
> > I think it is logically possible to have functional equivalence but
> > structural
> > difference with consequently difference in conscious state even though
> > external behaviour is the same.
> >
> > Stathis Papaioannou
>
> Remember Dave Chalmers with his 'silicon repl
I was using David Chalmers's terminology. The science, however advanced it
might become, is the "easy problem". Suppose alien scientists discover that
human consciousness is caused by angels that reside in tiny black holes inside
every neuron. They study these angels so closely that they come t
Colin Geoffrey Hales wrote:
> > (And "analogue" physics might turn out to be digital)
> >
>
> Digital is a conceptual representation metaphor only.
Not necessarily.
http://en.wikipedia.org/wiki/Digital_physics
http://www.mtnmath.com/digital.html
Colin Geoffrey Hales wrote:
>
> Hi Brent,
> Please see the post/replies to Quentin/LZ.
> I am trying to understand the context in which I can be wrong and how
> other people view the proposition.
David Nyman wrote:
> 1Z wrote:
>
> > But PC isn't *extra* information. It is a re-presentation of
> > what is coming in through the senses by 3rd person mechanisms.
>
> How can you be confident of that?
Because phenomenal perception wouldn't be perception otherwise.
Non-phenomenal sense data (pu
Le 27-nov.-06, à 02:31, David Nyman a écrit :
>
>
> On Nov 26, 11:50 pm, "1Z" <[EMAIL PROTECTED]> wrote:
>
>> Why use the word if you don't like the concept?
>
> I've been away for a bit and I can't pretend to have absorbed all the
> nuances of this thread but I have some observations.
>
> 1. To
1Z wrote:
> But PC isn't *extra* information. It is a re-presentation of
> what is coming in through the senses by 3rd person mechanisms.
How can you be confident of that? We can see that transactional
information arrives in the brain and is processed in a 3-person
describable manner. We don't h
David Nyman wrote:
> For this to be what is producing PC, the instantiating, or
> constitutive, level must be providing whatever information is necessary
> to 'animate' 3-person transactional 'data' in phenomenal form, and in
> addition whatever processes are contingent on phenomenally-animated
On Nov 28, 10:17 am, Stathis Papaioannou
<[EMAIL PROTECTED]> wrote:
> This seems to me a bit muddled (though in a good way: ideas breaking surface
> at
> the limits of what can be expressed). If the duplicate is a "functional" one
> then
> there can't be any difference to its possible behavio
David Nyman writes:
> 1. To coherently conceive that a PZ which is a *functional* (not
> physical) duplicate can nonetheless lack PC - and for this to make any
> necessary difference to its possible behaviour - we must believe that
> the PZ thereby lacks some crucial information.
> 2. Such missi
>> the basic assumption of BIV I would see as flawed. It assumes that all
>> there is to the scene generation is what there is at the boundary where
>> the sense measurement occurs.
>>
>> Virtual reality works, I think, because in the end, actual photons fly
>> at
>> you from outside. Actual phono
Colin Geoffrey Hales wrote:
> >
> >
> > Quentin Anciaux writes:
> >
> >> But the point is to assume this "nonsense" to take a "conclusion", to
> >> see
> >> where it leads. Why imagine a "possible" zombie which is functionally
> >> identical if there weren't any dualistic view in the first place
Colin Hales writes:
> > Well, of course, we have a phenomenal view. But there is no information
> > in the phenomenal display that was not first in the pre-phenomenal
> > sensory data.
>
> Yes there is. Mountains of it. It's just that the mechanism and the need
> for it is not obvious to you. Som
Colin Hales writes:
> >> To bench test "a human" I could not merely
> >> replicate sensory feeds. I'd have to replicate the factory!
> >
> > As in brain-in-vat scenarios. Do you have a way of showing
> > that BIV would be able to detect its status?
>
> I think the BIV is another oxymoron like
>
> Do you mean you can have exact human external behavior replica without
> consciousness ? or with a different consciousness (than a human) ?
>
> If 1st case then if you can't find any difference between a real human and
> the
> replica lacking consciousness how could you tell the replica is lac
Hi,
Le Mardi 28 Novembre 2006 00:00, Stathis Papaioannou a écrit :
> Quentin Anciaux writes:
> > But the point is to assume this "nonsense" to take a "conclusion", to see
> > where it leads. Why imagine a "possible" zombie which is functionally
> > identical if there weren't any dualistic view i
>
>
> Quentin Anciaux writes:
>
>> But the point is to assume this "nonsense" to take a "conclusion", to
>> see
>> where it leads. Why imagine a "possible" zombie which is functionally
>> identical if there weren't any dualistic view in the first place ! Only
>> in
>> dualistic framework it is po
Quentin Anciaux writes:
> But the point is to assume this "nonsense" to take a "conclusion", to see
> where it leads. Why imagine a "possible" zombie which is functionally
> identical if there weren't any dualistic view in the first place ! Only in
> dualistic framework it is possible to ima
>>
>> "If the mind is what the brain does, then what exactly is a coffee cup
>> doing?"
>
> It's not mind-ing.
>
>> For that question is just as valid and has just as complex an
>> answer...
>
> Of course not.
>
>> .yet we do not ask it. Every object in the universe is like this.
>> This is the m
Le 26-nov.-06, à 07:09, Colin Geoffrey Hales a écrit :
> I know your work is mathematics, not philosophy. Thank goodness! I can
> see
> how your formalism can tell you 'about' a universe. I can see how
> inspection of the mathematics tells a story about the view from within
> and
> without.
Colin Geoffrey Hales wrote:
> >
> >
> > Colin Hales writes:
> >
> >> The very fact that the laws of physics, derived and validated using
> >> phenomenality, cannot predict or explain how appearances are generated
> >> is
> >> proof that the appearance generator is made of something else and that
> The hard problem is not that we haven't discovered the physics that
> explains
> consciousness, it is that no such explanation is possible. Whatever
> Physics X
> is, it is still possible to ask, "Yes, but how can a blind man who
> understands
> Physics X use it to know what it is like to see?"
>
>
> Colin Hales writes:
>
>> The very fact that the laws of physics, derived and validated using
>> phenomenality, cannot predict or explain how appearances are generated
>> is
>> proof that the appearance generator is made of something else and that
>> something else else is the reality involve
Colin Hales writes:
> OK. There is a proven mystery called the hard problem. Documented to death
> and beyond. Call it Physics X. It is the physics that _predicts_ (NOT
> DESCRIBES) phenomenal consciousness (PC). We have, through all my fiddling
> about with scientists, conclusive scientific evi
Colin Hales writes:
> The very fact that the laws of physics, derived and validated using
> phenomenality, cannot predict or explain how appearances are generated is
> proof that the appearance generator is made of something else and that
> something else else is the reality involved, which is N
>
> Of course they are analogue devices, but their analogue nature makes no
> difference to the computation. If the ripple in the power supply of a TTL
> circuit were >4 volts then the computer's true analogue nature would
> intrude and it would malfunction.
>
> Stathis Papaioannou
Of course you
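The TTL point above can be made concrete: the digital abstraction is just a pair of voltage thresholds applied to an analogue signal, and large enough ripple pushes a nominal logic level across the wrong threshold. A minimal sketch (the threshold values are illustrative, roughly TTL-like; they are not taken from the thread):

```python
# Sketch: the "digital" reading of an analogue voltage, TTL-style.
# Approximate TTL input thresholds, for illustration only:
V_IL = 0.8   # at or below this, the input reads as logic 0
V_IH = 2.0   # at or above this, the input reads as logic 1

def read_bit(voltage):
    """Interpret an analogue voltage as a digital logic level."""
    if voltage <= V_IL:
        return 0
    if voltage >= V_IH:
        return 1
    return None  # undefined region: the digital abstraction breaks down

# A clean logic-high of 3.3 V reads as 1:
assert read_bit(3.3) == 1

# Superimpose 4 V of negative ripple and the same nominal "high"
# dips into the 0 region -- the analogue substrate intrudes:
assert read_bit(3.3 - 4.0) == 0
```

The `None` branch is the point of the quoted remark: between the thresholds there is no digital fact of the matter, only an analogue voltage that the timing of the circuit is designed to keep us from ever sampling.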
On Nov 26, 11:50 pm, "1Z" <[EMAIL PROTECTED]> wrote:
> Why use the word if you don't like the concept?
I've been away for a bit and I can't pretend to have absorbed all the
nuances of this thread but I have some observations.
1. To coherently conceive that a PZ which is a *functional* (not
phy
The discussion has run its course. It has taught me a lot about the sorts
of issues and mindsets involved.
It has also given me the idea for the methodological-zombie-room, which I
will now write up. Maybe it will depict the circumstances and role of
phenomenality better than I have thus far.
Me
Colin Geoffrey Hales wrote:
> >
> > Le Dimanche 26 Novembre 2006 22:54, Colin Geoffrey Hales a écrit :
> >
> >> What point is there in bothering with it. The philosophical zombie is
> >> ASSUMED to be equivalent! This is failure before you even start! It's
> >> wrong and it's proven wrong becaus
That's it. Half the laws of physics are going neglected merely because
we
won't accept phenomenal consciousness ITSELF as evidence of anything.
>>> We accept it as evidence of extremely complex neural activity - can you
>>> demonstrate it is not?
>>
>> You have missed the point agai
Colin Geoffrey Hales wrote:
> >>
> >> You are a zombie. What is it about sensory data that suggests an
> >> external world?
> >
> > What is it about sensory data that suggests an external world to
> > human?
>
> Nothing. That's the point. That's why we incorporate the usage of natural
> world pro
Everything in this we've been through already. All my answers are already in.
>
>
> Colin Geoffrey Hales wrote:
>> >> Colin
>> >> I'm not talking about invisibility of within a perceptual field. That
>> is
>> >> an invisibility humans can deal with to some extent using
>> instruments.
>> >> We
>>
>
> Le Dimanche 26 Novembre 2006 22:54, Colin Geoffrey Hales a écrit :
>
>> What point is there in bothering with it. The philosophical zombie is
>> ASSUMED to be equivalent! This is failure before you even start! It's
>> wrong and it's proven wrong because there is a conclusively logically
>> an
Colin Geoffrey Hales wrote:
>> Colin Geoffrey Hales wrote:
But you have no way to know whether phenomenal scenes are created by a
particular computer/robot/program or not because it's just mystery
property defined as whatever creates phenomenal scenes. You're going
around in c
>
> Colin Geoffrey Hales wrote:
>> <>
No confusion at all. The zombie is behaving. 'Wide awake'
in the sense that it is fully functional.
>>> Well, adaptive behaviour -- dealing with novelty --- is functioning.
>>
>> Yes - but I'm not talking about merely functioning. I am talking about
>
>
> Colin Geoffrey Hales wrote:
>> >> Scientific behaviour demanded of the zombie condition is a clearly
>> >> identifiable behavioural benchmark where we can definitely claim that
>> >> phenomenality is necessary...see below...
>> >
>> > It is all too easy to consider scientific behaviour withou
>
> What the zombie argument says (and I repeat it again) is that you SHOULD
> (if you are an honest rational person) accept ONE (and only
> one as they are contradictory proposition) of the following propositions:
>
> 1) Consciousness is not tied to a given behavior nor to a given physical
> attr
>
> Colin Geoffrey Hales wrote:
>>> But you have no way to know whether phenomenal scenes are created by a
>>> particular computer/robot/program or not because it's just mystery
>>> property defined as whatever creates phenomenal scenes. You're going
>>> around in circles. At some point you need
>>
>> You are a zombie. What is it about sensory data that suggests an
>> external world?
>
> What is it about sensory data that suggests an external world to
> human?
Nothing. That's the point. That's why we incorporate the usage of natural
world properties to contextualise it in the external wo
>>
>> Absolutely! But the humans have phenomenal consciousness in lieu of ESP,
>> which the zombies do not.
>
> PC doesn't magically solve the problem. It just involves a more
> sophisticated form of guesswork. It can be fooled.
We've been here before and I'll say it again if I have to
Yes! It c
Le Dimanche 26 Novembre 2006 22:54, Colin Geoffrey Hales a écrit :
> What point is there in bothering with it. The philosophical zombie is
> ASSUMED to be equivalent! This is failure before you even start! It's
> wrong and it's proven wrong because there is a conclusively logically and
> empirica
Colin Geoffrey Hales wrote:
> <>
>>> No confusion at all. The zombie is behaving. 'Wide awake'
>>> in the sense that it is fully functional.
>> Well, adaptive behaviour -- dealing with novelty --- is functioning.
>
> Yes - but I'm not talking about merely functioning. I am talking about the
> spe
>>
>> Except that in time, as people realise what I just said above, the
>> hypothesis has some emprical support: If the universe were made of
>> appearances when we opened up a cranium we'd see them. We don't.
>
> Or appearances don't appear to be appearances to a third party.
>
Precisely. Now a
<>
>> No confusion at all. The zombie is behaving. 'Wide awake'
>> in the sense that it is fully functional.
>
> Well, adaptive behaviour -- dealing with novelty --- is functioning.
Yes - but I'm not talking about merely functioning. I am talking about the
specialised function called scientific b
Colin Geoffrey Hales wrote:
> >> Colin
> >> I'm not talking about invisibility of within a perceptual field. That is
> >> an invisibility humans can deal with to some extent using instruments.
> >> We
> >> inherit the limits of that process, but at least we have something
> >> presented to us fro
Colin Geoffrey Hales wrote:
> > a) Darwinian evolution b) genetic learning algorithm.
>
> None of which have any innate capacity to launch or generate phenomenal
> consciousness and BOTH of which have to be installed by humans a-priori.
The actual real process of evolution does have the capacity
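The "genetic learning algorithm" named in (b) above can be sketched minimally. This is a toy illustration only; the bit-string target, the parameters, and the `evolve` helper are all invented for this example and appear nowhere in the thread:

```python
import random

def evolve(target, pop_size=50, generations=200, mutation_rate=0.05):
    """Toy genetic algorithm: evolve bit-strings toward a target string."""
    length = len(target)
    fitness = lambda ind: sum(a == b for a, b in zip(ind, target))
    # Random initial population of bit-strings.
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == length:
            break
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)     # single-point crossover
            child = a[:cut] + b[cut:]
            # Flip each bit with probability mutation_rate.
            child = [bit ^ (random.random() < mutation_rate) for bit in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve([1, 0, 1, 1, 0, 1, 0, 0])
```

The relevance to the quoted exchange is exactly Colin's caveat: the fitness function, representation, and operators are all installed by the programmer a priori, which is what distinguishes this from the open-ended evolutionary process Brent appeals to.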
Quentin Anciaux wrote:
> Hi,
> Le Dimanche 26 Novembre 2006 12:43, Colin Geoffrey Hales a écrit :
> > Note: Scientists, by definition:
> > a) are doing science on the world external to them
> > b) inhabit a universe of exquisite novelty
> >...or there'd be no need for them!
> Please note: Zom
Colin Geoffrey Hales wrote:
> >> Scientific behaviour demanded of the zombie condition is a clearly
> >> identifiable behavioural benchmark where we can definitely claim that
> >> phenomenality is necessary...see below...
> >
> > It is all too easy to consider scientific behaviour without
> > phen
1Z wrote:
>
> Brent Meeker wrote:
>
>> No, I think Colin has a point there. Your phenomenal view adds a lot of
>> assumptions to the sensory data in constructing an internal model of what
>> you see. These assumptions are hard-wired by evolution. It is situations
>> in which these assumption
Hi,
Le Dimanche 26 Novembre 2006 12:43, Colin Geoffrey Hales a écrit :
> Note: Scientists, by definition:
> a) are doing science on the world external to them
> b) inhabit a universe of exquisite novelty
>...or there'd be no need for them!
Please note: Zombies by definition:
a) are functionnal
Brent Meeker wrote:
> No, I think Colin has a point there. Your phenomenal view adds a lot of
> assumptions to the sensory data in constructing an internal model of what you
> see. These assumptions are hard-wired by evolution. It is situations in
> which these assumptions are false that pro
1Z wrote:
>
> Colin Geoffrey Hales wrote:
>> Stathis,
...
Whatever 'reality' is, it is regular/persistent,
repeatable/stable enough to do science on it via
our phenomenality and come
up with laws that seem to characterise how it will appear
to us in our phenomenality.
>>>
Colin Geoffrey Hales wrote:
>> But you have no way to know whether phenomenal scenes are created by a
>> particular computer/robot/program or not because it's just mystery
>> property defined as whatever creates phenomenal scenes. You're going
>> around in circles. At some point you need to anch
Colin Geoffrey Hales wrote:
> >>
> >> so yes the zombie can 'behave'. What I am claiming is they
> >> cannot do _science_ i.e. they cannot behave scientifically.
> >> This is a very specific claim, not a general claim.
> >
> > You're being unfair to the poor zombie robots. How could they
> > p
Colin Geoffrey Hales wrote:
> Stathis,
> I am answering all the mail in time order. I can see below you are making
> some progress! This is cool.
>
> > Colin Hales writes:
> >> >> So, I have my zombie scientist and my human scientist and
> >> >> I ask them to do science on exquisite novelty. What
Colin Geoffrey Hales wrote:
> You do NOT interpret sense data! In conscious activity you interpret the
> phenomenal scene generated using the sense data.
But that is itself an interpretation for reasons you yourself have
spelt out. Sensory pulse-trains don't have any meaning in themselves.
>
Colin Geoffrey Hales wrote:
> >
> >
> > Colin Hales writes:
> >
> >> You are a zombie. What is it about sensory data that suggests an
> >> external world? The science you can do is the science of
> >> zombie sense data, not an external world. Your hypotheses
> >> about an external world would be
> > Colin Geoffrey Hales wrote:
> >
> >> BTW there's no such thing as a truly digital computer. They are all
> >> actually analogue. We just ignore the analogue parts of the state
> >> transitions and time it all so it makes sense.
> >
> > And if the analogue part intrudes, the computer has malf
Stathis:
>
> See my previous post, I'm also answering them in the order that I read
> them
> (otherwise I'll never get back to them).
>
> If your model is adequate, then it should allow you to implement a replica
> of what
> it is that you're modelling such that the replica behaves the same as the
>
> You seem to be implying that there is some special physics
> involved in living processes: isn't that skimming a little
> close to vitalism?. All I see is the chemistry
> of large organic molecules, the fundamentals of which are
> well understood, even if the level of complexity is beyond
> wh
>
> But you have no way to know whether phenomenal scenes are created by a
> particular computer/robot/program or not because it's just mystery
> property defined as whatever creates phenomenal scenes. You're going
> around in circles. At some point you need to anchor your theory to an
> operati
>
>
> Colin Hales writes:
>
>> You are a zombie. What is it about sensory data that suggests an
>> external world? The science you can do is the science of
>> zombie sense data, not an external world. Your hypotheses
>> about an external world would be treated
>> as wild metaphysics by your zombie
>
>
> Colin Hales writes:
>
>> > You're being unfair to the poor zombie robots. How could they
>> > possibly tell if they were in the factory or on the benchtop
>> > when the benchtop (presumably) exactly replicates the sensory
>> > feeds they would receive in the factory?
>> > Neither humans nor
See my previous post, I'm also answering them in the order that I read them
(otherwise I'll never get back to them).
If your model is adequate, then it should allow you to implement a replica of
what
it is that you're modelling such that the replica behaves the same as the
original, or
clo