Re: Zombies

2012-01-02 Thread Craig Weinberg
I agree completely. My view is that awareness/sense/detection extends
to all physical phenomena. It would not be possible for someone's arm
to rise unless something were aware of it; I'm just pointing out that
it is not necessary for 'us' to be conscious in order to function as
an organism as if we were.

Craig

On Dec 31 2011, 6:47 pm, Pierz  wrote:

> Stage hypnosis is one thing, but as a former psychotherapist who has
> used hypnotherapy, I can say that it is a great oversimplification to
> say that a hypnotic subject raises their hand without awareness. What
> actually occurs is dissociation, in which awareness is split, not
> absent. This has been experimentally demonstrated in cases of pain
> suppression. It was shown that the subjects did perceive the pain,
> although they did not do so consciously. Where such sub- or
> unconscious awareness remains, one can't really speak of a philosophical
> zombie; rather, one needs to acknowledge that consciousness is not the
> undifferentiated unity it is sometimes represented as being in
> philosophical discussion. It is layered, deeply structured and
> complex, and there are many grey areas.
>
> On Dec 29 2011, 12:59 am, Craig Weinberg 
> wrote:
>
> > On Dec 28, 1:22 am, meekerdb  wrote:
>
> > > On 12/27/2011 6:53 AM, Craig Weinberg wrote:
>
> > > >http://www.youtube.com/watch?v=ViJH5nHpn_c
>
> > > > We don't need awareness to behave like we are aware.
>
> > > How are you interpreting this?  That the people were not aware of Brown's 
> > > message, or that
> > > they weren't aware of raising their hands?  Or just that they were not 
> > > conscious of why
> > > they raised their hand...they had no narrative explanation?
>
> > I removed this post actually, because I wasn't familiar with Brown and
> > assumed that this was an honest demonstration of mass hypnosis. After
> > learning a bit more about him, I'm not sure what it actually is. He may
> > well be using shills who raise their hands first, demonstrating the
> > power of conformity rather than hypnotic suggestion. If this was an
> > honest demonstration of suggestion, then yes, they had no awareness of
> > initiating their hand-raising behavior. If not, then it just shows how
> > one kind of awareness (visual perception) can drive behavior without
> > logical motive (another kind of awareness).

-- 
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-list@googlegroups.com.
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en.



Re: Zombies

2011-12-31 Thread Pierz
Stage hypnosis is one thing, but as a former psychotherapist who has
used hypnotherapy, I can say that it is a great oversimplification to
say that a hypnotic subject raises their hand without awareness. What
actually occurs is dissociation, in which awareness is split, not
absent. This has been experimentally demonstrated in cases of pain
suppression. It was shown that the subjects did perceive the pain,
although they did not do so consciously. Where such sub- or
unconscious awareness remains, one can't really speak of a philosophical
zombie; rather, one needs to acknowledge that consciousness is not the
undifferentiated unity it is sometimes represented as being in
philosophical discussion. It is layered, deeply structured and
complex, and there are many grey areas.

On Dec 29 2011, 12:59 am, Craig Weinberg 
wrote:
> On Dec 28, 1:22 am, meekerdb  wrote:
>
> > On 12/27/2011 6:53 AM, Craig Weinberg wrote:
>
> > >http://www.youtube.com/watch?v=ViJH5nHpn_c
>
> > > We don't need awareness to behave like we are aware.
>
> > How are you interpreting this?  That the people were not aware of Brown's 
> > message, or that
> > they weren't aware of raising their hands?  Or just that they were not 
> > conscious of why
> > they raised their hand...they had no narrative explanation?
>
> I removed this post actually, because I wasn't familiar with Brown and
> assumed that this was an honest demonstration of mass hypnosis. After
> learning a bit more about him, I'm not sure what it actually is. He may
> well be using shills who raise their hands first, demonstrating the
> power of conformity rather than hypnotic suggestion. If this was an
> honest demonstration of suggestion, then yes, they had no awareness of
> initiating their hand-raising behavior. If not, then it just shows how
> one kind of awareness (visual perception) can drive behavior without
> logical motive (another kind of awareness).




Re: Zombies

2011-12-28 Thread Craig Weinberg

On Dec 28, 1:22 am, meekerdb  wrote:
> On 12/27/2011 6:53 AM, Craig Weinberg wrote:
>
> >http://www.youtube.com/watch?v=ViJH5nHpn_c
>
> > We don't need awareness to behave like we are aware.
>
> How are you interpreting this?  That the people were not aware of Brown's 
> message, or that
> they weren't aware of raising their hands?  Or just that they were not 
> conscious of why
> they raised their hand...they had no narrative explanation?

I removed this post actually, because I wasn't familiar with Brown and
assumed that this was an honest demonstration of mass hypnosis. After
learning a bit more about him, I'm not sure what it actually is. He may
well be using shills who raise their hands first, demonstrating the
power of conformity rather than hypnotic suggestion. If this was an
honest demonstration of suggestion, then yes, they had no awareness of
initiating their hand-raising behavior. If not, then it just shows how
one kind of awareness (visual perception) can drive behavior without
logical motive (another kind of awareness).




Re: Zombies

2011-12-27 Thread meekerdb

On 12/27/2011 6:53 AM, Craig Weinberg wrote:

http://www.youtube.com/watch?v=ViJH5nHpn_c

We don't need awareness to behave like we are aware.



How are you interpreting this?  That the people were not aware of Brown's message, or that 
they weren't aware of raising their hands?  Or just that they were not conscious of why 
they raised their hand...they had no narrative explanation?


Brent




Re: Zombies (was: Jack's partial brain paper)

2010-03-17 Thread Bruno Marchal


On 17 Mar 2010, at 20:32, Stephen P. King wrote:


Hi Bruno and Fellow Listers,


   As I have been following this conversation, a question
occurred to me: how is a Zombie (as defined by Chalmers et al.) any
different functionally from the notion of other persons (dogs, etc.)
that a Solipsist might have? They seem equivalent, both behaving
exactly as a “real person would” yet having no consciousness or 1-p
reality of their own. What am I missing here?



Strictly speaking, for a solipsist, all others are zombies. Indeed.

But you don't need to be a solipsist to believe in zombies, or to believe
the notion makes sense.


With comp, a perfect zombie, that is, one handling all counterfactuals,
makes no sense at all, given that consciousness is associated with the
abstract mathematical computation, and not with any particular relative
implementation/incarnation.


A particular zombie can exist, like a fake policeman on a road, or a
fake Rogerian psychoanalyst.


The others, as *you* see them, are zombies if, like some doctors, you
identify the first person with their body. Of course, you can
associate the person with its own first-person subjectivity (on which
you can bet), and its body as just *one* of its vehicles in the most
probable (most common in UD*) computations. Cf. the measure problem.


In this setting it is useful to conceive of a body as a word or program,
written in a "natural" (chemical, electro-chromo-dynamical,
...comp-physical) language. We are divine or natural hypotheses.


Bruno




Onward!

Stephen P. King



From: everything-list@googlegroups.com
[mailto:everything-list@googlegroups.com] On Behalf Of Bruno Marchal

Sent: Wednesday, March 17, 2010 1:45 AM
To: everything-list@googlegroups.com
Subject: Re: Jack's partial brain paper


On 16 Mar 2010, at 19:29, Brent Meeker wrote:


On 3/16/2010 6:03 AM, Stathis Papaioannou wrote:
On 16 March 2010 20:29, Russell Standish wrote:


I've been following the thread on Jack's partial brains paper,
although I've been too busy to comment. I did get a moment to read the
paper this evening, and I was abruptly stopped by a comment on page 2:

"On the second hypothesis [Sudden Disappearing Qualia], the
replacement of a single neuron could be responsible for the vanishing
of an entire field of conscious experience. This seems antecedently
implausible, if not entirely bizarre."

Why? Why isn't it like the straw that broke the camel's back? When
pulling apart a network, link by link, there will be a link removed
that causes the network to go from being almost fully connected to
being disconnected. It need not be the same link each time, it will
depend on the order in which the links are removed.
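
[Editor's aside: the camel's-back point above can be sketched concretely. The toy graph below is hypothetical, not anything from the thread; it just shows that when links are removed one at a time from a connected network, some single removal flips it from connected to disconnected.]

```python
# Minimal sketch: remove links from a small connected network one at a
# time and find the single removal that disconnects it. The graph here
# is an arbitrary illustrative example.

def connected(nodes, edges):
    """BFS from an arbitrary node; True if every node is reachable."""
    if not nodes:
        return True
    seen, frontier = set(), [next(iter(nodes))]
    while frontier:
        n = frontier.pop()
        if n in seen:
            continue
        seen.add(n)
        # Follow every edge incident to n.
        frontier.extend(b if a == n else a for (a, b) in edges if n in (a, b))
    return seen == nodes

nodes = {0, 1, 2, 3}
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # small connected graph

for e in list(edges):
    edges.remove(e)
    if not connected(nodes, edges):
        print("disconnected after removing", e)
        break
```

Which edge is the "last straw" depends entirely on the removal order, which is exactly the point: the transition is abrupt, but the critical link is not fixed in advance.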

I made a similar criticism against Derek Parfit's Napoleon thought
experiment a couple of years ago on this list - I understand that
fading qualia is a popular intuition, but it just seems wrong to
me. Can anyone give me a convincing reason why the suddenly
disappearing qualia notion is absurd?

Fading qualia would result in a partial zombie, and that concept is
self-contradictory. It means I could be a partial zombie now,
completely blind since waking up this morning, but behaving normally
and unaware that anything unusual had happened. The implication of
this is that zombie vision is just as good as normal vision in every
objective and subjective way, so we may as well say that it is the
same as normal vision. In other words, the qualia can't fade and leave
the behaviour of the brain unchanged.


I think this is a dubious argument based on our lack of  
understanding of qualia.  Presumably one has many thoughts that do  
not result in any overt action.  So if I lost a few neurons (which I  
do continuously) it might mean that there are some thoughts I don't  
have or some associations I don't make, so eventually I may "fade"  
to the level of consciousness of my dog.  Is my dog a "partial  
zombie"?


A priori the dog is not a zombie at all. It may be like us after
taking some strong psychoactive substance, disabling it
intellectually. If enough neurons are disabled, it may lose
Löbianity, but not yet necessarily consciousness. If even more
neurons are disabled, it will lose the ability to manifest its
consciousness relative to you, and it will be senseless to
attribute consciousness to it, but from its own perspective it will be
"another dog" or "another universal machine" in Platonia.





I think the question of whether there could be a philosophical  
zombie is ill posed because we don't know what is responsible for  
qualia.  I speculate that they are tags of importance or value that  
get attached to perceptions so that they are stored in short term  
memory.  Then, because evolution cannot redesign things, the same  
tags are used for internal thoughts that seem important enough to  
put in memory.  If this is the case then it might be possible to
design a robot which used a different method of evaluating
experience for storage and it would not have qualia like humans - but
would it have some other kind of qualia?  Since we don't know what
qualia are in a third person sense there seems to be no way to answer
that.

Re: Zombies (was: Jack's partial brain paper)

2010-03-17 Thread Stathis Papaioannou
On 18 March 2010 06:32, Stephen P. King  wrote:

>    As I have been following this conversation, a question
> occurred to me: how is a Zombie (as defined by Chalmers et al.) any
> different functionally from the notion of other persons (dogs, etc.) that a
> Solipsist might have? They seem equivalent, both behaving exactly as a “real
> person would” yet having no consciousness or 1-p reality of their own. What
> am I missing here?

The problem of zombies is a version of the problem of other minds.


-- 
Stathis Papaioannou




RE: Zombies (was: Jack's partial brain paper)

2010-03-17 Thread Stephen P. King
Hi Bruno and Fellow Listers,

   As I have been following this conversation, a question
occurred to me: how is a Zombie (as defined by Chalmers et al.) any
different functionally from the notion of other persons (dogs, etc.) that a
Solipsist might have? They seem equivalent, both behaving exactly as a “real
person would” yet having no consciousness or 1-p reality of their own. What
am I missing here?

Onward!

Stephen P. King

From: everything-list@googlegroups.com
[mailto:everything-l...@googlegroups.com] On Behalf Of Bruno Marchal
Sent: Wednesday, March 17, 2010 1:45 AM
To: everything-list@googlegroups.com
Subject: Re: Jack's partial brain paper

On 16 Mar 2010, at 19:29, Brent Meeker wrote:

On 3/16/2010 6:03 AM, Stathis Papaioannou wrote: 

On 16 March 2010 20:29, Russell Standish wrote:

I've been following the thread on Jack's partial brains paper,
although I've been too busy to comment. I did get a moment to read the
paper this evening, and I was abruptly stopped by a comment on page 2:
 
"On the second hypothesis [Sudden Disappearing Qualia], the
replacement of a single neuron could be responsible for the vanishing
of an entire field of conscious experience. This seems antecedently
implausible, if not entirely bizarre."
 
Why? Why isn't it like the straw that broke the camel's back? When
pulling apart a network, link by link, there will be a link removed
that causes the network to go from being almost fully connected to
being disconnected. It need not be the same link each time, it will
depend on the order in which the links are removed.
 
I made a similar criticism against Derek Parfit's Napoleon thought
experiment a couple of years ago on this list - I understand that
fading qualia is a popular intuition, but it just seems wrong to
me. Can anyone give me a convincing reason why the suddenly
disappearing qualia notion is absurd?


Fading qualia would result in a partial zombie, and that concept is
self-contradictory. It means I could be a partial zombie now,
completely blind since waking up this morning, but behaving normally
and unaware that anything unusual had happened. The implication of
this is that zombie vision is just as good as normal vision in every
objective and subjective way, so we may as well say that it is the
same as normal vision. In other words, the qualia can't fade and leave
the behaviour of the brain unchanged.
  


I think this is a dubious argument based on our lack of understanding of
qualia.  Presumably one has many thoughts that do not result in any overt
action.  So if I lost a few neurons (which I do continuously) it might mean
that there are some thoughts I don't have or some associations I don't make,
so eventually I may "fade" to the level of consciousness of my dog.  Is my
dog a "partial zombie"? 

A priori the dog is not a zombie at all. It may be like us after taking some
strong psychoactive substance, disabling it intellectually. If enough
neurons are disabled, it may lose Löbianity, but not yet necessarily
consciousness. If even more neurons are disabled, it will lose the ability
to manifest its consciousness relative to you, and it will be senseless to
attribute consciousness to it, but from its own perspective it will be
"another dog" or "another universal machine" in Platonia.

I think the question of whether there could be a philosophical zombie is ill
posed because we don't know what is responsible for qualia.  I speculate
that they are tags of importance or value that get attached to perceptions
so that they are stored in short term memory.  Then, because evolution
cannot redesign things, the same tags are used for internal thoughts that
seem important enough to put in memory.  If this is the case then it might
be possible to design a robot which used a different method of evaluating
experience for storage and it would not have qualia like humans - but would
it have some other kind of qualia?  Since we don't know what qualia are in a
third person sense there seems to be no way to answer that.

If the robot can reason logically and believes in the induction axioms, it
will be Löbian, and the 8 arithmetical hypostases will necessarily apply. In
that case, if you find Theaetetus' theory of knowledge plausible, then it is
plausible that the robot has personhood, and that its qualia are described
by S4Grz1, X1* and Z1*, whatever means of storage are used.

Bruno

http://iridia.ulb.ac.be/~marchal/
