On 7/12/2024 3:15 AM, John Clark wrote:


On Thu, Jul 11, 2024 at 8:09 PM Brent Meeker <[email protected]> wrote:

    > /In case you've forgotten, the Turing test was based on text only
    communication between an interlocutor asked to distinguish between
    a computer pretending to be a human and a man or woman pretending
    to be a woman or man./


Yes, but that is an unimportant detail. The essence of the Turing Test is that whatever method you use to determine the consciousness, or lack of it, in one of your fellow human beings, you should use that same method when judging the consciousness of a computer.

    /> It's already been passed by some LLM's by dumbing-down their
    response/.


Don't you find that fact compelling? An AI needs to play dumb in order to fool a human into thinking it is human.

No, it only passed because the human interlocutor didn't ask the right questions, such as "Where are you?" and "Is it raining outside?"

Now, I think an LLM could be trained to imagine a consistent model of itself as a human being, i.e. having a location, friends, motives, a history... which would fool everyone who didn't actually check reality.

Brent

--
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/everything-list/2725c000-1985-4649-a1a2-f76e9ea64414%40gmail.com.
