--- Richard Loosemore <[EMAIL PROTECTED]> wrote:

> Matt,
> 
> This usage of "emotion" is idiosyncratic and causes endless confusion.

You're right.  I didn't mean for the discussion to devolve into a disagreement
over definitions.

> As for your larger point, I continue to vehemently disagree with your 
> assertion that "a singularity will end the human race".
> 
> As far as I can see, the most likely outcome of a singularity would be 
> exactly the opposite.  Rather than the end of the human race, just some 
> changes to the human race that most people would be deliriously happy about.

These are the same thing.  Happiness is just a matter of reprogramming the
brain, and a brain reprogrammed that way is no longer the human it was.

Or maybe we disagree on what is "human"?

A singularity is an optimization process whose utility function is the
acquisition of computing resources.  It could be a Dyson sphere with atomic-level
computing elements.  It may or may not have a copy of your memories.  It
won't always be happy, because happiness is not fitness.
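
To make that concrete, here is a toy sketch (my own illustration only, with
hypothetical names, not anything from this thread) of an optimization process
whose utility function is just the acquisition of computing resources.  Nothing
in its utility refers to happiness or to any internal reward signal:

    # Toy illustration: a greedy process whose sole utility is the amount
    # of computing resources it has acquired.

    def utility(resources_acquired):
        # Utility is defined purely by acquired resources; no happiness term.
        return resources_acquired

    environment = 1000.0   # resources still available to convert (arbitrary units)
    agent = 1.0            # resources already under the process's control

    for _ in range(20):
        taken = 0.10 * environment   # convert 10% of what remains each step
        agent += taken
        environment -= taken

    print(utility(agent))  # the only quantity this process is optimizing

Whether such a process ever "feels" anything is irrelevant to what it optimizes.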




-- Matt Mahoney, [EMAIL PROTECTED]
