Very interesting. You concentrated a lot on consciousness, but the word
that kept coming up was "experience," and the chatbot doesn't experience
anything, with one exception: it experiences discomfort about making a
bomb, and presumably about other similar subjects, because it's programmed
to do so. It is not
I asked the new AI program Claude-3, which was released just a few days
ago, some philosophical questions. At first I got the standard boilerplate
replies you'd expect, but when I continued the conversation and probed a
little deeper, it sometimes said things that were a little more interesting.
===