On Sun, Sep 12, 2021 at 7:37 AM Mike Archbold <[email protected]> wrote:

> ...
> The reality is that nobody claims their machine is conscious  -- but
> regularly people claim their machine understands, but they don't say
> what that means


Got any examples of people saying their machine understands, Mike? I don't
doubt you're right, but I'm curious about concrete examples.

In more ambitious discussion groups like this one, it may be common.

For state-of-the-art vision, I would guess people use the word "recognize"
more often.

Maybe the makers of some smart-speaker systems say their system "understands".

In the deep learning context it wouldn't be hard to trace such a claim of
"understanding" back to the assumption that "understanding" means assigning a
category, or mapping input to simple operations, such as applications of
rules. For instance, when you say "OK Google, remind me..."

Just because it is easy to guess that this is what is meant does not mean the
question you are asking is not a good one. It makes the assumption explicit,
and leads us to ask whether "a mapping to fixed forms or rules" is the only
possible assumption.

But there may be other examples of people claiming their system
"understands". Any more concrete examples?

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T2ee04a3eb9a964b5-Mf982c04f9eb9faae061467a0
Delivery options: https://agi.topicbox.com/groups/agi/subscription
