Thanks for the question Mike.

My opinion on understanding, which is still evolving:

Understanding is one aspect of a conscious being, closest to knowing, but
it also involves thinking, feeling, intuition, and emotion.  Some claim
these are inseparable, or that separating them limits the types of
knowledge that can be represented, and I tend to gravitate in that
direction.

We humans can know things we can't articulate, and we can be articulate
about things that are not really what we believe internally.
"Understanding" as a test for AGI might be a gentleman's agreement that a
certain level of articulation has been reached for the idea at hand.  We
see the first inklings of this in the XAI endeavors.  However, some if not
all understanding comes before the language needed to express it, and I
think this applies to both humans and AGI; therefore a more rudimentary
framing of understanding could be knowledge which gives the capability to
inform an action.  Still, my opinion is that understanding, or the
realization of knowing, is deeply intermingled with feeling.
Understanding may be inseparable from other types of cognition.

In any case, not taking advantage of that capability to inform an action
can cause non-ideal situations for the individual, as when I knew it would
rain today but did not think to roll up my car windows.  It can also be
evident in pathologies where we say things we don't believe, or believe
things we wouldn't say: both situations give a sense that feelings,
possibly repressed, may be involved.

In our AGI system, the most fundamental type of understanding is identical
to knowledge that can be traced back to the learning of patterns by a type
of Hebbian function.  In common parlance, we think of understanding as a
type of knowledge orders of magnitude higher than the level of sensory
spacetime co-occurrence; however, the Hebbian learning analogues continue
at recursively higher levels, on the pattern matchers themselves.  This is
still under development.  One of the many challenges we're facing here is
the level at which we allow higher-level patterns to be brought into the
simulation space for manipulation (the system thinking about its own
understanding of a pattern).  The lower levels of pattern recognition are
not of the same type as the higher-level patterns, and the difference
between them delineates the conscious from the subconscious.  We also have
not incorporated the concept of hedonic feeling; however, there are some
simple proxies for intuition based on familiarity with a pattern.
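
As a toy sketch of that lowest level, here is roughly what Hebbian-style
co-occurrence learning with a familiarity-based intuition proxy could look
like in Python.  To be clear, this is only an illustration of the general
idea, not our actual implementation; the class, names, and parameters are
made up for the example.

import itertools
from collections import defaultdict

class HebbianPatternStore:
    def __init__(self, learning_rate=0.1, decay=0.01):
        self.learning_rate = learning_rate  # how much co-activation strengthens a link
        self.decay = decay                  # gentle forgetting applied at every observation
        self.weights = defaultdict(float)   # (pattern_a, pattern_b) -> link strength
        self.exposure = defaultdict(int)    # pattern -> times encountered

    def observe(self, active_patterns):
        """Patterns active in the same spacetime window get their links strengthened."""
        for p in active_patterns:
            self.exposure[p] += 1
        for a, b in itertools.combinations(sorted(active_patterns), 2):
            self.weights[(a, b)] += self.learning_rate
        for key in self.weights:            # every link decays a little each step
            self.weights[key] *= (1.0 - self.decay)

    def familiarity(self, pattern):
        """Crude intuition proxy: raw familiarity with a pattern."""
        return self.exposure[pattern]

# Repeated co-occurrence of "dark_clouds" and "rain" builds a link that could
# later inform an action, e.g. rolling up the car windows.
store = HebbianPatternStore()
for _ in range(20):
    store.observe({"dark_clouds", "rain"})
print(store.weights[("dark_clouds", "rain")], store.familiarity("rain"))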

As an analogy, commonly seen with optical illusions: a layperson cannot
understand or explain why we see black dots in a grid illusion.  Certain
pattern matchers that were triggered in the optical processing region
cannot be brought into question or simulation.  There are areas of our own
being which we are helplessly unable to understand via articulation.  By
this I argue that an AGI could (should?) be average-human-level in
understanding without needing to "understand" its full being.

In a 1977 paper, "Thinking, Feeling, Knowing" (Proceedings of the
Aristotelian Society 77, doi:10.1093/aristotelian/77.1.165), Louis Arnaud
Reid gives a (IMO) great argument for how "feeling" (not necessarily
feelings marked by noticeable hedonic tones) is an inseparable part of our
"knowing" something, both before and after we proclaim to know that
something.




On Fri, Jul 9, 2021 at 2:03 PM Mike Archbold <jazzbo...@gmail.com> wrote:

> You've got an opinion. We all do!
> 
> I'm doing a survey of opinions about "understanding" for the meetup
> -->
> https://www.meetup.com/Northwest-Artificial-General-Intelligence-Meetup-Group/
> 
> 2 events are envisioned this summer:
> 
> 1)  Survey -- discuss tribal opinions in the AGI community  as well as
> published works about what "understanding" means for a machine,
> 
> 2) Critiques and Conclusions -- compare, generalize, hopefully reach
> some conclusions.
> 
> So what is your definition of "understanding"? I have collected about
> a dozen so far and will publish along with the events.
> 
> We are also having an in person event this month for those around
> western Washington:
> 
> https://www.meetup.com/Northwest-Artificial-General-Intelligence-Meetup-Group/events/279258207/
> 
> Thanks Mike Archbold


-- 
Daniel Jue
Cognami LLC
240-515-7802
www.cognami.ai
