On Thu, May 9, 2024 at 2:15 AM Rob Freeman <[email protected]>
wrote:

> On Thu, May 9, 2024 at 6:15 AM James Bowery <[email protected]> wrote:
> ...
> Criticisms are welcome. But just saying, oh, but hey look at my idea
> instead...
>

I may have confused you by conflating two levels of abstraction -- only one
of which is "my idea" (which isn't my idea at all but merely an idea that
has been around forever without garnering the attention it deserves):

1) Abstract grammar as a prior.
2) The proper structure for incorporating priors, whatever they may be.

Forget about #1.  That was just an example -- a conjecture if you will --
that I found appealing as an under-appreciated prior, but it distracted from
the much more important point of #2, which is about priors in general.

#2 is exemplified by the link I provided to physics-informed machine learning
<https://www.youtube.com/playlist?list=PLMrJAkhIeNNQ0BaKuBKY43k4xMo6NSbBa>,
which is appropriate to bring up in the context of this particular post about
the ir/relevance of physics.  The point is not "physics".  Physics is merely
one knowledge domain among many; it is useful here because it is "hard", so
the technique of incorporating its priors into machine learning is exemplary.
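To make the idea concrete, here is a minimal toy sketch of that technique --
not code from the linked lectures, just my own illustration.  It fits a cubic
polynomial to a few noisy samples of y(t) = exp(-t), while adding the physics
prior that y obeys the ODE y' + y = 0 as a penalty at unlabeled collocation
points.  The choice of ODE, basis, and weighting are all assumptions made for
the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy observations of y(t) = exp(-t), which satisfies the ODE y' + y = 0
t_data = np.linspace(0.0, 2.0, 8)
y_data = np.exp(-t_data) + 0.02 * rng.standard_normal(t_data.size)

# Collocation points where the ODE prior is enforced (no labels needed here)
t_phys = np.linspace(0.0, 2.0, 50)

def features(t):
    """Cubic polynomial basis [1, t, t^2, t^3]."""
    return np.stack([np.ones_like(t), t, t**2, t**3], axis=1)

def dfeatures(t):
    """Derivative of the basis with respect to t."""
    return np.stack([np.zeros_like(t), np.ones_like(t), 2 * t, 3 * t**2], axis=1)

# Objective: mean (model(t_i) - y_i)^2  +  lam * mean (model'(t_j) + model(t_j))^2
# Both terms are linear in the coefficients c, so the minimizer is the
# solution of the combined normal equations.
lam = 1.0
A_d = features(t_data)
A_p = dfeatures(t_phys) + features(t_phys)   # residual operator for y' + y
lhs = A_d.T @ A_d / len(t_data) + lam * A_p.T @ A_p / len(t_phys)
rhs = A_d.T @ y_data / len(t_data)
c = np.linalg.solve(lhs, rhs)

y_hat_at_1 = (features(np.array([1.0])) @ c)[0]   # prediction at t = 1
```

The data term alone would happily overfit the noise; the physics term
constrains the fit toward functions that respect the known law, even where no
data exists.  That division of labor -- data where you have it, priors
everywhere -- is the general structure #2 is about.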

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Teaac2c1a9c4f4ce3-M4a92b688c0804deb6a6a12a1