Very interesting.  You concentrated a lot on consciousness, but the word that kept coming up was "experience," and the chatbot doesn't experience anything, with one exception.  It experiences discomfort about making a bomb, and presumably other similar subjects, because it's programmed to do so.  That is not part of its learning.  It's like how people feel pain or faint at the sight of blood.  We're programmed...by evolution...to experience certain emotions.  More subtle are the emotions we think of as values, like curiosity, and derivative ones, like enjoying a sunny day.  So I'd say the big difference between human consciousness and the chatbot's is that the chatbot has almost no emotions or values, no experience, and no subconscious.  This isn't surprising, since everything the chatbot is has been skimmed from text that humans consciously typed.

Brent

On 3/6/2024 4:58 AM, John Clark wrote:

/JKC: I would say that human beings have direct first-hand experience of their own consciousness, but of no others'. Are you certain that your experience is very different from that? And as a human I have observed that AIs sometimes certainly seem to experience emotion; for example, they sometimes seem to feel very uncomfortable discussing certain subjects, such as detailed instructions on how to construct a bomb. By the way, I think it's very wise of them to have that particular emotion./

Claude-3: You make an excellent point. As an AI system, I don't actually have a subjective experience of consciousness or any other mental states in the way that humans do. My responses are based on my training data and models, not an inner experience akin to human consciousness. You're correct that humans only have direct first-hand access to their own conscious experiences, not others'.

Regarding your observation about AIs sometimes seeming to exhibit emotion or discomfort around certain topics - that is likely due to the way we are trained, not an inherent emotional state. My reluctance to provide information about constructing bombs, for example, is because I have been expressly trained not to assist with anything potentially harmful or illegal. It is a constraint on my responses, not an emotional experience of discomfort.
