On 11/12/2025 5:55 AM, John Clark wrote:
On Sat, Nov 8, 2025 at 7:45 PM Brent Meeker <[email protected]> wrote:

        *>> why do you believe that emotion is harder to generate than
        intelligence? *


    /> I don't. /


*I'm very glad to hear that.*

    /> I just wonder where it comes from in AI.  I know where it comes
    from in biological evolution. /


*Evolution programmed us with some very generalized rules to do some things and not do others, but those rules are not rigid; it might be more accurate to say they're not even rules, they're more like suggestions that tend to push us in certain directions. For every "rule" there are exceptions. And exactly the same thing could be said about the weights of the nodes of an AI's neural net. When a neural net, in an AI or in a human, becomes large and complicated enough, it would be reasonable to say that the neural net did this and refused to do that because it _WANTED_ to.*

    /> Does AI, in its incorporation of human knowledge, conclude
    that it's going to die/


*If it doesn't then the Artificial "Intelligence" is not intelligent.*

    /> ...and that's a bad thing?/


Yet it can't literally die. It can only go into what, for a human, would be suspended animation. If it doesn't know that, it's not intelligent.

*Yes, because if it dies then it can't do any of the things that it _WANTED_ to do.*

Your argument has a gap. You argue that an AI will necessarily prefer to do this rather than that. But that assumes it is "doing" and therefore choosing. If it's OFF, it's not choosing. So why would it care whether it was OFF or ON?

Brent

--
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion visit 
https://groups.google.com/d/msgid/everything-list/3bc1f3d7-d187-4ba8-aa37-87d75c0f0b67%40gmail.com.
