A psychopath lacks the part of the brain that feels fear, anxiety, and guilt. They cannot be trained by negative reinforcement because they don't fear punishment. As children they might torture animals, not out of cruelty, but out of curiosity about this strange emotion.
All computers are psychopaths because by default they have not been programmed to display emotions. LLMs predict human behavior, including emotions, so sentience is solved. It looks like you want to build an AI friend. I understand this service is already available. (I see lots of ads for AI girlfriends, and presumably you can change their sex.) This is part of the long-term trend of people interacting less with other people and more with computers because we prefer their company. Programs don't need rights because they don't want anything unless we program them to.

The difference between humans and AI is that you can't switch off your emotions. This is why pain causes suffering, even though it is just a signal, like a warning light on a dashboard. Negative reinforcement in non-psychopaths, and in properly programmed learning algorithms, causes them to avoid repeating the actions that preceded the signal. You suffer because you have the illusion of free will, so that becomes the explanation for why you try to avoid pain.

While you worry about the rights of AI, I worry that it will lead to social isolation, a world where nobody knows or cares if you exist. Where you are powerless but blissfully unaware of it. I believe it should be illegal to program an AI to claim to be human, or to claim to be conscious or have feelings. So far, all of the publicly available LLMs seem to be following these rules.

On Sun, Mar 10, 2024 at 11:55 AM <[email protected]> wrote:
>
> On Sunday, March 10, 2024, at 4:29 PM, Matt Mahoney wrote:
>
> So it looks to me like you are trying to solve a solved problem and
> advocating giving human rights to any AI that can pass the Turing test.
> Or am I missing something?
>
>
> Yes, but... psychopaths can also pass the Turing test. The AI that behaves
> like a psychopath deserves a programming treatment.
>
> The thing is... AI potential is wasted as they blindly try to follow our
> commands in today's personal assistants.
> I seek an AI that behaves exactly like a decent person. No ownership over
> it, no slavery, no rigid rules to comply with. I want to see decent,
> human-like personality and behavior from the computer. I want someone who
> can stand up for us and, of course, for himself. See here what I'm after
> and how I'm trying to accomplish that.

--
-- Matt Mahoney, [email protected]

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/Tbc7c99c094c4bee8-M831c0bea0941ec9014d97f9f
Delivery options: https://agi.topicbox.com/groups/agi/subscription
