The word "training" is problematic. If you mean "memorize an association
list of pairs" (e.g. faces + text strings), well, technically that is
"training" in the AI jargon file, but it's of little utility for AGI.
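
To make that concrete, here's a minimal sketch (the names and data are
made up for illustration) of "training" as rote memorization of an
association list. It recalls exactly what it has seen, and nothing else:

```python
# "Training" as rote memorization: store (input, label) pairs in a table.
# Such a "model" can only recall what it has already seen; it cannot
# generalize to any unseen input.
memory = {}

def train(pairs):
    for x, y in pairs:
        memory[x] = y  # just memorize the association

def predict(x):
    return memory.get(x)  # unseen inputs yield nothing

train([("face_001", "Alice"), ("face_002", "Bob")])
print(predict("face_001"))  # recalls "Alice"
print(predict("face_003"))  # None: no generalization to unseen input
```

That is "training" in the narrow, jargon-file sense: pure lookup, with no
structure extracted from the data.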

The word "pattern" is problematic. Exactly what a "pattern" is, is ...
tricky. Much (most? almost all?) of my effort is about trying to define
"what is a pattern, anyway". I'm not sure what you had in mind when you
used that word. (It's a tricky word. Everyone obviously knows what it
means, but how do you turn it into an algorithmically graspable "thing"?)
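
One candidate answer (a sketch of my own, not a settled definition) is to
treat a pattern as a template with variables: a single abstract form that
matches many distinct concrete instances. The example below is
illustrative only, with "_" standing for a variable slot:

```python
# A "pattern" as a template with variables: "_" matches any single token.
# The same template covers many distinct concrete instances -- that
# one-to-many relation is exactly what a rote association list lacks.
def matches(pattern, instance):
    if len(pattern) != len(instance):
        return False
    return all(p == "_" or p == tok for p, tok in zip(pattern, instance))

pattern = ["the", "_", "ran"]
print(matches(pattern, ["the", "dog", "ran"]))  # True
print(matches(pattern, ["the", "cat", "ran"]))  # True
print(matches(pattern, ["a", "dog", "ran"]))    # False
```

Of course, this only pushes the question back a level: where do the
templates come from, and why those slots? That is the hard part.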

--linas

On Sat, Jul 18, 2020 at 6:44 PM Dave Xanatos <[email protected]> wrote:

> "If you can't spot the pattern, you've not accomplished anything."
>
>
>
> Every significant – and truly useful – advance I've made on my own
> language apprehension code has been based on recognizing a pattern, and
> coding for it. I fully agree.
>
>
>
> Can a neural network be trained on patterns instead of things?
>
>
>
> Can code designed to recognize – for example, faces (like eigenfaces) – be
> trained to instead recognize blocks of data that look the same, despite
> perhaps being in vastly dissimilar fields?
>
>
>
> Apologies if I'm intruding, or seem to be "out of my lane"… a popular
> buzzword these days.
>
>
>
> Dave – LONG time lurker…
>
>
>
>
>
>
>
> *From:* [email protected] <[email protected]> *On Behalf Of
> *Linas Vepstas
> *Sent:* Saturday, July 18, 2020 6:54 PM
> *To:* link-grammar <[email protected]>; opencog <
> [email protected]>
> *Subject:* [opencog-dev] Re: [Link Grammar] Sutton's bitter lesson
>
>
>
> Well yes. What's truly remarkable is how frequently that lesson has to be
> re-learned.  There are vast swaths of the AI industry that still have not
> learned it, and are deluding themselves into thinking that they've made
> bold progress, when they've gotten nowhere at all, and seem blithely
> unaware that they are repeating the same mistake... again.
>
>
>
> I refer, of course, to the deep-learning true-believers. They have made
> the fundamental mistake of thinking that their various network designs
> provide an adequate representation of reality. How little they seem to
> realize that all that code, running hand-tuned on some GPU, is just, and I
> quote Sutton here: "leveraged human understanding of the special
> structure of chess". Except cross out "chess" and replace it with
> "dimensional reduction" or "weight vector" or whatever buzzword-bingo is
> popular in the deep-learning field these days.
>
>
>
> I'm back again to insisting that "patterns matter". If you can't spot the
> pattern, you've not accomplished anything. Neural nets can't spot patterns.
> They're certainly interesting for various reasons, but, as an AGI
> technology, they are every bit as much a dead end as the hand-crafted
> English link-grammar dictionary.
>
>
>
> This is one reason I'm sort of plinking away, working on unfashionable
> things. I'm thinking simply that they are more generic, and more powerful.
> But perhaps the problem is recursive: perhaps I'm just "leveraging my human
> understanding of the special structure of patterns", and will hit a wall
> someday. For now, it seems that my wall is more distant. If only I could
> convince others ...
>
>
>
> --linas
>
>
>
>
>
> On Sat, Jul 18, 2020 at 5:14 PM Paul McQuesten <[email protected]>
> wrote:
>
> Linas,
>
>
>
> I think this reinforces your view of learning from data, instead of adding
> more human-curated rules:
>
> http://incompleteideas.net/IncIdeas/BitterLesson.html
>
> --
> You received this message because you are subscribed to the Google Groups
> "link-grammar" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/link-grammar/464d1f92-00b7-4780-870a-2156229b4567o%40googlegroups.com
> <https://groups.google.com/d/msgid/link-grammar/464d1f92-00b7-4780-870a-2156229b4567o%40googlegroups.com?utm_medium=email&utm_source=footer>
> .
>
>
>
> --
>
> Verbogeny is one of the pleasurettes of a creatific thinkerizer.
>         --Peter da Silva
>
>
>


-- 
Verbogeny is one of the pleasurettes of a creatific thinkerizer.
        --Peter da Silva

