Almost: it reflected things that the user wrote, but not always the most recent
ones. That's part of what fooled people.
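That reflection trick, including falling back on something said earlier, fits in a few lines. Here is a minimal, hypothetical Python sketch; the patterns, names, and canned phrases are mine for illustration, not Weizenbaum's actual DOCTOR script:

```python
import re

# Pronoun swaps used to turn the user's words back on them.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

def reflect(text):
    """Swap first- and second-person words, ELIZA-style."""
    return " ".join(REFLECTIONS.get(w, w) for w in text.lower().split())

class Eliza:
    def __init__(self):
        self.memory = []  # earlier statements, replayed later

    def respond(self, text):
        text = text.strip().rstrip(".!?")
        m = re.match(r"(?i)i am (.*)", text)
        if m:
            self.memory.append(m.group(1))
            return f"Why are you {reflect(m.group(1))}?"
        m = re.match(r"(?i)i feel (.*)", text)
        if m:
            return f"Do you often feel {reflect(m.group(1))}?"
        # No keyword matched: fall back on something said earlier --
        # which is why replies are not always about the latest input.
        if self.memory:
            old = self.memory.pop(0)
            return f"Earlier you said you were {reflect(old)}. Tell me more."
        return "Please go on."
```

So "I am angry at my mother" comes back as "Why are you angry at your mother?", and a later unmatched input can dredge up that same phrase from memory, giving the illusion that the program remembered and understood.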


--
Shmuel (Seymour J.) Metz
http://mason.gmu.edu/~smetz3

________________________________________
From: IBM Mainframe Discussion List [[email protected]] on behalf of Bob 
Bridges [[email protected]]
Sent: Monday, April 10, 2023 5:56 PM
To: [email protected]
Subject: Re: AI wipes out humanity?

Is that the one that imitated a therapist, asking questions that were
merely reflections of the last thing the user wrote?  "I hear you saying
you're angry at your mother.  Can you say why?"  "How did that make you
feel?"  Like that.  I encountered it in college and found it surprisingly
convincing.

---
Bob Bridges, [email protected], cell 336 382-7313

/* If you can meet with Triumph and Disaster / And treat those two impostors
just the same...
  -from _If_ by Rudyard Kipling */

-----Original Message-----
From: IBM Mainframe Discussion List <[email protected]> On Behalf Of
Seymour J Metz
Sent: Monday, April 10, 2023 17:33

...what is the threshold that we must cross before "AI" becomes a reality
rather than an aspiration? In the early days, we were 5 years away; in
retrospect, that was unduly optimistic.

On the flip side, does anybody remember the shockingly simple ELIZA
<https://en.wikipedia.org/wiki/ELIZA>, which fooled many people?

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN

