On Fri, Jan 29, 2021, 1:51 AM YKY (Yan King Yin, 甄景贤) <
[email protected]> wrote:

>
> My proposed model has some important properties:
> 1.  uses deep learning to learn logic formulas


Do you have any experiments confirming this? It wasn't clear from your
paper how to achieve this.

The lack of an efficient learning algorithm has been the single most
important obstacle to symbolic AI in spite of decades of research. I realize
that humans are able to manually encode knowledge into structured formats
like first-order logic and its extensions like CycL and probabilistic logic.
But this ability can't be central to learning in humans because it comes
after the knowledge is learned. You don't have to know the difference
between a noun and a verb to form grammatically correct sentences.

The 3 major obstacles to AGI, in my opinion:

1. Hardware. After decades of research, the best known solutions to vision,
language, and robotics use neural networks. A human-brain-sized network
with 6 x 10^14 synapses and a 20 ms activation time requires 30 petaflops
and a petabyte of memory. Reducing power consumption from a few megawatts to
20 watts (what the brain uses) can't be done by shrinking transistors.
Feature sizes are already on the order of the spacing between dopant atoms
in silicon, 11 nm at a doping level of 1 part per million. To reduce power
further you need a whole new computing technology based on moving atoms or
molecules instead of electrons.
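
For concreteness, here is that arithmetic as a rough Python sketch. The
synapse count and activation time are the figures above; the bytes per
synapse is my assumption, chosen to land near a petabyte:

  # Brain-scale hardware estimate using the figures above.
  synapses = 6e14         # synapses in a human brain
  step_time = 0.02        # 20 ms per synaptic update
  bytes_per_synapse = 2   # assumed storage per synapse weight

  flops = synapses / step_time           # 3e16 ops/s = 30 petaflops
  memory = synapses * bytes_per_synapse  # ~1.2e15 bytes, about a petabyte
  print(f"compute: {flops:.1e} FLOPS, memory: {memory:.1e} bytes")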

2. Software. The human body has an information content of about 5 x 10^9
bits, equivalent to 300 million lines of code (based on my compression tests
of the human genome and large software projects). That's doable at a cost of
USD 30 billion. There isn't a practical alternative to writing the code:
human evolution cost 10^46 DNA copy operations on 10^37 bits of DNA over 3
billion years.
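
As a sanity check, the cost figure follows from the compression numbers. In
this sketch the bits-per-line and dollars-per-line values are implied by the
figures above (5 x 10^9 bits / 3 x 10^8 lines, $30B / 3 x 10^8 lines), not
independent data:

  # Software cost estimate from the genome-compression argument above.
  genome_bits = 5e9     # compressed information content of the human genome
  bits_per_line = 16.7  # bits per line of code, implied by the figures above
  cost_per_line = 100   # USD per line, implied by $30B / 300M lines

  lines = genome_bits / bits_per_line  # ~3e8 lines
  print(f"{lines:.1e} lines of code, about ${lines * cost_per_line:.1e}")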

3. Knowledge collection. Human long-term memory holds only about 10^9 bits,
according to Landauer's recall tests of pictures and words. About 99% of
this is shared or online (based on the US Labor Department estimate that it
costs 1% of lifetime earnings to replace an employee). The remaining 10^7
bits take several months to collect through speech and writing, assuming you
are willing to make your life memories public so you don't have to answer
the same questions over and over. Repeating this for 8 billion people
(assuming the obvious application of automating labor worldwide) will cost
several months of global GDP, or about $50 trillion, assuming unrestricted
data sharing.
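
The arithmetic behind these figures, again as a sketch. Landauer's estimate
and the 1% fraction are from the argument above; the information rate,
waking hours, and world GDP figure are my assumptions:

  # Knowledge-collection estimate from the Landauer figure above.
  ltm_bits = 1e9              # human long-term memory (Landauer)
  unshared = 0.01 * ltm_bits  # 1e7 bits per person not already shared

  bits_per_second = 2   # assumed rate of novel information in speech/writing
  waking_hours = 16     # assumed hours per day available for collection
  days = unshared / bits_per_second / (waking_hours * 3600)
  print(f"{days / 30:.1f} months per person")  # ~2.9 months

  world_gdp = 85e12     # rough 2021 world GDP in USD per year (assumption)
  print(f"$50T is about {50e12 / world_gdp * 12:.0f} months of world GDP")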

An efficient unsupervised learning algorithm for structured knowledge could
greatly reduce at least the hardware cost. But given the long history of
failure dating back to the 1950s, I'm skeptical.
