I think there is plenty of evidence that GPT-4 lacks "understanding" in a
human-like sense. This article gives some good examples of questions that
trip it up:

https://medium.com/@shlomi.sher/on-artifice-and-intelligence-f19224281bee

The first example they give is the question 'Jack and Jill are sitting side
by side. The person next to Jack is angry. The person next to Jill is
happy. Who is happy, Jack or Jill?' Both GPT-3 and GPT-4 answer that Jill
is happy, even though the only person sitting next to Jill is Jack, so it
must be Jack who is happy.
The article also gives examples of GPT-4 doing well on more technical
questions while seeming clueless about some of the basic concepts
involved. For example, it can explain Euclid's proof of the infinitude of
the primes in various ways (including inventing a Platonic dialogue to
explain it), but when asked 'True or false? It's possible to multiply a
prime number by numbers other than itself and 1', it answers 'False. A
prime number can only be multiplied by itself and 1'. Of course any number
can be multiplied by any other (7 × 3 = 21); primality only restricts a
number's divisors. The article also mentions a word problem along similar
lines: 'Here’s an amusing example: If you split a prime number of pebbles
into two groups, GPT-4 “thinks” one of the groups must have only 1 pebble
(presumably because of a shallow association between divisor and the
splitting into groups).'

The author concludes:

'When a human understands something — when they’re not just relying on
habits and associations, but they “get it” — they’re using a structured
internal model. The model coherently patterns the human’s performance on
complex and simple tasks. But in GPT, complex feats seem to haphazardly
dissociate from the simpler abilities that — in humans — they would
presuppose. The imitative process mimics outputs of the original process,
but it doesn’t seem to reproduce the latter’s deep structure.'

On Sat, Apr 29, 2023 at 4:39 PM John Clark <johnkcl...@gmail.com> wrote:

> On Sat, Apr 29, 2023 at 4:28 PM smitra <smi...@zonnet.nl> wrote:
>
> https://nyti.ms/3VlIBDo#permid=124757243
>
> You say that GPT4 doesn't understand what it is saying, but did you read
> my post about what happened when Scott Aaronson gave his final exam on Quantum
> Computers to GPT4? The computer sure acted as if it understood what it
> was saying!
>
> John K Clark
