On Tuesday, June 28, 2022, at 12:31 AM, Rob Freeman wrote:
> If you're asking whether it is just intuition that the grammar learned might
> be different for each sentence, I would say it's more of a hypothesis. I
> think there is evidence formal grammars for natural language cannot be
> complete. So for me that motivates the hypothesis beyond the point I would
> call it intuition.
Sorry, I meant that what you describe sounds like an “intuition” mechanism: if you were to describe a synthetic intuition technically, it would be something that groups hierarchies of language elements which share predictions. Not that this would be fully definitive, since there may be many forms of intuition, some simpler, some more elaborate. Still, I suppose it could be standardized, so that “intuition” itself might be reduced to a representational model: a set of grammars, or sets of grammars. There is also the question of how to represent a “hypothesis”, that is, going from “intuition” => “hypothesis” and then testing.

I had read some of Vepstas's “Mereology”; very interesting, and I had done research in that direction. Bob Coecke's spider diagrams and “togetherness” also fit how I think about these things. The spider notation is itself a simplification, a visual dimension reduction for symbolic communication, coincidentally like a re-grammaring of representation. But I like it a lot; it's great, the ZX-calculus and so on.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T5d6fde768988cb74-M61c66b5c0819ea0cc5048223
Delivery options: https://agi.topicbox.com/groups/agi/subscription
