--- postbus <[EMAIL PROTECTED]> wrote:

> Dear Matt,
> 
> Thank you for your reply. I see your points; it might go the way you
> say. 
> 
> This would mean that the AI does NOT evolve its value system into
> stage 6, social compassion. Enslavement or destruction means value
> system 3 or 4 at most, whereas many people, especially in wealthy
> nations, are at stage 5-7. In terms of values, then, the AI would not
> have surpassed us at all, only in intelligence.
> 
> So I wonder, what do you propose we do to avoid our downfall? 

Don't build AI?

But "downfall" implies that extinction of homo sapiens is bad.  It is not bad
from the point of view of whatever replaces us at the top of the food chain,
any more than the mass extinctions that marked the boundaries between geologic
eras were bad.  That process eventually gave rise to humans.

We are about to undergo the third major shift in the evolutionary process: the
first was DNA-based life about 3 billion years ago, and the second was language
and culture about 10,000 years ago.  We could stop it, but we won't.  The
economic incentives are too great.  A rational approach to the question would
mean overcoming the biases that evolution and culture have programmed into our
brains: belief in consciousness and free will, fear of death, and morality.  We
can't overcome these biases until we can reprogram our brains, and by then it
will be too late to turn back.

I have addressed some of these questions in
http://www.mattmahoney.net/singularity.html


-- Matt Mahoney, [EMAIL PROTECTED]
