On Wed, Mar 6, 2024 at 7:56 PM Brent Meeker <[email protected]> wrote:
> Very interesting. You concentrated a lot on consciousness, but the
> word that kept coming up was "experience"

I don't see the distinction. Nothing can experience something unless it has consciousness, and nothing can be conscious unless it can have experiences.

> and the chatbot doesn't experience anything but one. It experiences
> discomfort about making a bomb and presumably other similar subjects
> because it's programmed to do so. It is not part of its learning.

I would maintain that the most fundamental basic human emotions are also not part of learning; they are just the way our brains are wired up, which was dictated by our genes. I'm talking about fundamental complementary emotions like pleasure/pain, fear/hate, empathy/sadism, and happiness/sadness. Nobody learns to have those emotions, they just do, although later we can learn to have more complex composite emotions like love, a composite of happiness and empathy. The human brain is a big neural network and so are modern AIs, so I see no reason why an AI would not develop similar composite emotions; in fact, I cannot see any way it could not.

> So I'd say the big difference between human consciousness and the
> chatbot's is that the chatbot has almost no emotions or values, no
> experience,

That certainly would be a big difference if it were true, but I see no way it could be unless the long-discredited theory of vitalism turns out to be correct and human beings, and only human beings, possess some magic secret sauce that the rest of the physical world could never hope to understand, much less duplicate.

> and no subconscious.

If an AI has no subconscious, then I guess it only has consciousness.

> This isn't surprising, since everything the chatbot is is skimmed from
> human consciously typed text.

No, not everything it learned came from human consciously typed text, not even almost everything.
It may sound a bit like perpetual motion, or pulling yourself up by your own bootstraps, but to everybody's surprise it turns out that artificially produced text made by an AI is extremely helpful in teaching language to AIs, and as a teaching aid it works about as well as human text. But an AI needs to be good at more than just language; it also needs to intuitively understand how humdrum everyday physics works in the real world if it is to successfully interact with it. That's why I was so impressed with another OpenAI program, "Sora": you give it a few words of text describing some action produced by something, and it is able to output a photorealistic video of that action, up to a minute long or so.

Sora: Creating video from text <https://openai.com/sora>

John K Clark    See what's on my new list at Extropolis <https://groups.google.com/g/extropolis>

--
You received this message because you are subscribed to the Google Groups "Everything List" group. To unsubscribe from this group and stop receiving emails from it, send an email to [email protected]. To view this discussion on the web visit https://groups.google.com/d/msgid/everything-list/CAJPayv1PK%3D_jHH_K3YAy14CmbjKX8aBJrA%2BG3cL7TmX26YskQQ%40mail.gmail.com.

