Matt, 

With regard to your first point, I largely agree with you.  I would,
however, qualify it by noting that many of us find it hard not to
sympathize with people or animals, such as a dog, when we directly sense
outward signs that they are experiencing terrible pain, unless we harbor
enough hatred toward them to override our natural tendency toward sympathy.
Some people attribute this to mirror neurons and to the fact that we
evolved as tribal social animals.

With regard to the second point, your statement does not refute my point,
although my point is admittedly based on belief that is far from certain.
Our understanding of the physical (such as neural) correlates of
consciousness is currently too limited to let us say much about the
consciousness, or lack thereof, of the systems you describe, even if those
systems are completely understood in every respect other than the
knowledge of the physical correlates of consciousness that we currently
lack but may well have within fifty years.

But from what little we do understand about the neural correlates of
consciousness, it does not appear that either system you describe would
have anything approaching a human consciousness, and thus a human
experience of pain, since both lack the type of computation normally
associated with human reports of conscious experience.
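For concreteness, a toy system of the kind at issue -- in the spirit of the
autobliss program you cite below, though this sketch is my own hypothetical
illustration and not that program -- can be written in a few lines.  Its
mechanics are fully transparent, which is exactly what makes the question of
whether it "feels" the punishment signal a philosophical rather than an
empirical one:

```python
# Hypothetical sketch: a trivial agent that learns a 2-input logic
# function from scalar "pleasure"/"pain" signals.  Not Mahoney's
# actual autobliss code -- just an illustration of the same idea.
import random

random.seed(0)  # deterministic for the sake of the example

class ToyAgent:
    def __init__(self):
        # One weight per input pattern; output 1 when the weight is positive.
        self.weights = {(a, b): 0.0 for a in (0, 1) for b in (0, 1)}

    def act(self, a, b):
        w = self.weights[(a, b)]
        if w == 0.0:
            return random.randint(0, 1)  # no preference yet: guess
        return 1 if w > 0 else 0

    def reinforce(self, a, b, action, signal):
        # signal > 0 is "pleasure", signal < 0 is "pain".
        # Reward shifts the weight toward the action just taken;
        # punishment shifts it away.
        self.weights[(a, b)] += signal if action == 1 else -signal

# Train it to compute OR by punishing wrong answers.
agent = ToyAgent()
for _ in range(100):
    a, b = random.randint(0, 1), random.randint(0, 1)
    out = agent.act(a, b)
    correct = a | b
    agent.reinforce(a, b, out, 1.0 if out == correct else -1.0)

assert all(agent.act(a, b) == (a | b) for a in (0, 1) for b in (0, 1))
```

After a hundred trials the agent reliably avoids the punished responses, yet
every step of its operation is completely understood, so nothing in its
behavior settles whether the punishment is experienced as anything.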

Ed Porter

-----Original Message-----
From: Matt Mahoney [mailto:[EMAIL PROTECTED] 
Sent: Monday, November 17, 2008 4:45 PM
To: [email protected]
Subject: RE: FW: [agi] A paper that actually does solve the problem of
consciousness--correction

--- On Mon, 11/17/08, Ed Porter <[EMAIL PROTECTED]> wrote:
>First, it is not clear "people
>are free to decide what makes pain "real"," at least
>subjectively real.

I mean that people are free to decide if others feel pain. For example, a
scientist may decide that a mouse does not feel pain when it is stuck in the
eye with a needle (the standard way to draw blood) even though it squirms
just like a human would. It is surprisingly easy to modify one's ethics in
this way, as the Milgram experiments and the Nazi war crime trials
demonstrated.

>If we have anything close to the advances in brain scanning and brain
>science that Kurzweil predicts, we should come to understand the
>correlates of consciousness quite well

No. I cited autobliss (
http://www.mattmahoney.net/autobliss.txt ) and the roundworm C. elegans as
examples of simple systems whose functions are completely understood, yet
the question of whether such systems experience pain remains a
philosophical one that cannot be answered by experiment.

-- Matt Mahoney, [EMAIL PROTECTED]


-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription:
https://www.listbox.com/member/?&;
Powered by Listbox: http://www.listbox.com


