Matt Mahoney wrote:
--- Samantha Atkins <[EMAIL PROTECTED]> wrote:
In http://www.mattmahoney.net/singularity.html I discuss how a singularity will end the human race, but without judgment whether this is good or bad. Any such judgment is based on emotion.
Really? I can think of arguments for why this would be a bad thing without even referencing the fact that I am human and do not wish to die. That wish is not equivalent to an emotion if you consider it, as you appear to have done above, as one of your deepest goals. Goals per se do not equate to emotions.
I was equating emotion with those goals which are programmed into your brain, as opposed to learned subgoals. For example, hunger is an emotion, but the desire for money to buy food is not. In that context, you cannot distinguish between good and bad without reference to hardcoded goals, such as fear of death.
Matt,
This usage of "emotion" is idiosyncratic and causes endless confusion.
Hunger is not an emotion but a motivation. It is certainly true that
there is a grey area between the two, but in the case that you are
discussing here, it is clear that you are talking about motivations or
drives.
As for your larger point, I continue to vehemently disagree with your
assertion that "a singularity will end the human race".
As far as I can see, the most likely outcome of a singularity would be exactly the opposite: rather than the end of the human race, just some changes to the human race that most people would be deliriously happy about.
Richard Loosemore
-----
This list is sponsored by AGIRI: http://www.agiri.org/email