--- Lukasz Stafiniak <[EMAIL PROTECTED]> wrote:

> On 6/14/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> >
> > I would avoid deleting all the files on my hard disk, but it has nothing
> to do
> > with pain or empathy.
> >
> > Let us separate the questions of pain and ethics.  There are two
> independent
> > questions.
> >
> > 1. What mental or computational states correspond to pain?
> > 2. When is it ethical to cause a state of pain?
> >
> There is a gradation:
> - pain as negative reinforcement
> - pain as an emotion
> - pain as a feeling
> 
> When you ask if something "feels pain", then you don't ask if "pain"
> is an adequate description of some aspect of that thing or person X, but
> whether X can be attributed with feeling. And this is related to the
> complexity of X, and this complexity is related to ethics.

I don't believe this addresses the issue of machine pain.  Ethics is a complex
function which evolves to increase the reproductive success of a society, for
example, by banning sexual practices that don't lead to reproduction.  Ethics
also evolves to ban harm to other members of the group, but not to non-members
(e.g. war is allowed), and not to other species (hunting is allowed), except
to the extent that such actions would harm the group.

Instances of the ethics function vary in implementation from one person to
another.  Some people would extend the ethical convention against causing harm
to people of all religions and ethnic groups, while others would not.  Some
would also extend the convention to certain higher animals.  The result is a complex
set of laws that try to satisfy everyone.  For example, animal cruelty laws
are not written to protect animals from pain, but to protect humans from
displays of cruelty.  Thus, we ban cockfighting but not packing chickens into
tiny cages from birth to slaughter.  Similarly, acts of brutality against
humans, whether by criminals, police, or soldiers, will result in a much
greater public outcry and more severe consequences when the act is videotaped
and widely viewed, even if the other facts are equal and well established.

There is no precedent for ethics with regard to machines.  We protect machines
only to the extent that harming them harms the owner.  Nevertheless, I think
your argument about pain being related to complexity relates to the more
general principle of protecting that which resembles a human, even if that
resemblance is superficial or based on emotion.  I was reminded of this when I
was playing Grand Theft Auto III.  Besides carjacking, murder, and assorted
mayhem, the game allows you to pick up prostitutes.  Afterwards, the game
gives you the option of getting your money back by beating her to death, but I
declined.  I felt empathy for a video game character.



-- Matt Mahoney, [EMAIL PROTECTED]

-----
This list is sponsored by AGIRI: http://www.agiri.org/email