On Tue, Jun 28, 2022 at 6:25 AM John Rose <johnr...@polyplexic.com> wrote:

> ...
> On Saturday, June 25, 2022, at 6:58 AM, Rob Freeman wrote:
>
> If all the above is true, the key question should be: what method could
> directly group hierarchies of elements in language which share predictions?
>
>
> Is this just intuition?
>

If you're asking whether it is just intuition that the grammar learned
might be different for each sentence, I would say it's more of a
hypothesis. I think there is evidence that formal grammars for natural
language cannot be complete. So for me that motivates the hypothesis beyond
the point where I would call it intuition.

As support for the hypothesis I cite the history of linguistics going back
to:

1) Chomsky's rejection of learning procedures and his assertion that any
formal system for natural language grammar must be innate, not least
because what is observed is contradictory:

"Part of the discussion of phonology in 'LBLT' is directed towards showing
that the conditions that were supposed to define a phonemic representation
... were inconsistent or incoherent in some cases and led to (or at least
allowed) absurd analyses in others." Frederick J. Newmeyer, Generative
Linguistics: A Historical Perspective, Routledge, 1996.

2) Sydney Lamb's counter that an alternative explanation was that the
problem was "the criterion of linearity".

3) More recent work, including that of OpenCog's own Linas Vepstas:

Vepstas, “Mereology”, 2020: "In the remaining chapters, the sheaf
construction will be used as a tool to create A(G)I representations of
reality. Whether the constructed network is an accurate representation of
reality is undecidable, and this is true even in a narrow, formal, sense."

4) Bob Coecke's work on "togetherness", and a "quantum" quality to grammar:

Coecke: "we argue for a paradigmatic shift from `reductionism' to
`togetherness'. In particular, we show how interaction between systems in
quantum theory naturally carries over to modelling how word meanings
interact in natural language."

And more simply, my own experience that when learning grammar
(by distributional analysis), there is typically more than one,
contradictory, way to do it (the candidate groupings grow like a power
set, so most must be excluded rather than justified.)
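To make that concrete, here is a toy sketch in Python (the corpus and
vocabulary are my own invented example, not taken from any of the cited
work) of how even a minimal distributional analysis yields overlapping,
contradictory word classes:

```python
from collections import defaultdict

# Invented toy corpus: each sentence is a list of tokens.
corpus = [
    ["the", "dog", "runs"],
    ["the", "cat", "runs"],
    ["a", "dog", "barks"],
    ["a", "bird", "sings"],
]

# Distributional grouping: class together the words that share a
# context -- here, simply the immediately preceding word.
by_left_context = defaultdict(set)
for sent in corpus:
    for left, word in zip(sent, sent[1:]):
        by_left_context[left].add(word)

print(dict(by_left_context))
# Each shared context justifies a different grouping:
#   after "the": {"dog", "cat"}
#   after "a":   {"dog", "bird"}
# "dog" belongs to both classes, but "cat" and "bird" never share a
# context, so the two groupings cannot be merged into one class.
# As contexts multiply, the candidate groupings grow like the power
# set of the vocabulary, and most must be excluded, not justified.
```

The point is not this particular grouping criterion, only that any such
criterion produces multiple classes for the same word, and the classes
need not be consistent with one another.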

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T5d6fde768988cb74-M385b2f935978c6bbd0e8d353
Delivery options: https://agi.topicbox.com/groups/agi/subscription
