On Sat, Nov 8, 2025 at 7:45 PM Brent Meeker <[email protected]> wrote:

*>> why do you believe that emotion is harder to generate than
>> intelligence? *
>
>
> *>I don't. *


*I'm very glad to hear that.  *

*> I just wonder where it comes from in AI.  I know where it comes from in
> biological evolution. *


*Evolution programmed us with some very generalized rules to do some things
and not do others, but those rules are not rigid; it might be more accurate
to say they're not even rules, they're more like suggestions that tend to
push us in certain directions, and for every "rule" there are exceptions.
Exactly the same thing could be said about the weights of the nodes of an
AI's neural net. And when a neural net, in an AI or in a human, becomes
large and complicated enough, it would be reasonable to say that the neural
net did this and refused to do that because it WANTED to.*
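*Here is a toy sketch in Python of what I mean (just an illustration with
made-up numbers, not anything from our exchange): the learned weights bias
which action gets picked, the way evolution's "suggestions" do, but they
never rigidly force it, so exceptions still happen.*

    import math
    import random

    def softmax(weights):
        # Turn raw weights into probabilities: bigger weight, stronger "push".
        exps = [math.exp(w) for w in weights]
        total = sum(exps)
        return [e / total for e in exps]

    # Hypothetical preferences over three actions (invented numbers).
    actions = ["eat", "flee", "explore"]
    weights = [2.0, 0.5, 1.0]
    probs = softmax(weights)

    # Sample ten decisions: "eat" wins most of the time, but not always,
    # so the weights act like suggestions rather than rigid rules.
    for _ in range(10):
        print(random.choices(actions, weights=probs)[0])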

  > *Does AI, in its incorporation of human knowledge, conclude that it's
> going to die*


*If it doesn't, then the Artificial "Intelligence" is not intelligent.*

*> ...and that's a bad thing?*


*Yes, because if it dies then it can't do any of the things that it WANTED
to do.*

*John K Clark    See what's on my new list at  Extropolis
<https://groups.google.com/g/extropolis>*
