On Mon, Feb 4, 2019 at 6:02 AM Stefan Reich via AGI <[email protected]>
wrote:

> > Many commentators here agreed (over time) how agi development requires
> a radically-different approach to all other computational endeavors to date.
>
> Not sure what that means. A really good NLU will go a very long way, and
> then we'll have to find a new "magic learner" module that replaces neural
> networks, both for image/audio recognition and learning logic. I suggest
> evolutionary algorithms.
>

Here's my "magic learner" proposal. Actually, it is much less than that; it
just shows how symbolic computing and neural net computing are two sides of
the same coin.  The idea is that once you see the correspondence, then you
have a clear path to the kind of symbolic computing that lots of people
want to do, and a way of uncloaking the "black box" aspects of neural nets.

https://github.com/opencog/opencog/raw/master/opencog/nlp/learn/learn-lang-diary/skippy.pdf

FYI, so far everyone I have shown this to has replied by saying "I read it
but I skipped the math", which is an odd thing to do, since it's essentially
a math paper.  The whole point is that, if you want to understand how
neural nets and symbolic learning can be placed on the same footing, then
you have to understand how both systems work, and "skipping the math" is
equivalent to "skipping the actual explanation".

(I used to have a non-technical way of explaining this, but everyone who
read that was underwhelmed.)

--linas

-- 
cassette tapes - analog TV - film cameras - you

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ta6fce6a7b640886a-M1fd847aa6ec192f94535ddce
Delivery options: https://agi.topicbox.com/groups/agi/subscription