I derived most of the numbers in my paper on the cost of AI. In particular, I
compared the compressed size of the human genome with the compressed size
of source code; the genome's information content works out to about 300
million lines of code.
http://mattmahoney.net/costofai.pdf

The size of the genome is comparable to Landauer's estimate of human long-term
memory for words, pictures, and sounds: about 10^9 bits. That's pretty
consistent with twin studies of personality traits and intelligence, which
suggest our knowledge is half innate, half learned.

There are 10^37 DNA bases in the biosphere, mostly in 10^30 bacteria. I
estimate the replication rate to be every 10^6 seconds (days or weeks) for
10^17 seconds (3 billion years). In addition, there have been 10^50 RNA and
amino acid transcription operations.
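A quick sanity check of that arithmetic, as a Python sketch using the same round numbers:

```python
# Back-of-the-envelope check of the biosphere's DNA copy operations.
dna_bases = 10**37          # DNA bases in the biosphere
replication_period = 10**6  # seconds per replication (days to weeks)
lifetime = 10**17           # seconds (~3 billion years of life)

replications = lifetime // replication_period  # ~10^11 replications per base
copy_ops = dna_bases * replications            # total base copy operations
print(f"{copy_ops:.0e}")                       # 1e+48
```

That is the 10^48 base copy operations it cost evolution to write our source code.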

Evolution is slow because each generation of N offspring adds at most log2(N)
bits to the genome. Writing code is faster: about 10 lines, or 160 bits, per
day.
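To make the comparison concrete (the brood size N below is illustrative, not a measured figure):

```python
import math

# Selecting the fittest of N offspring conveys at most log2(N) bits
# into the genome per generation.
N = 1000                                  # illustrative brood size
bits_per_generation = math.log2(N)        # ~10 bits
generation_time_days = 12                 # ~10^6 seconds

evolution_rate = bits_per_generation / generation_time_days  # bits per day

# A programmer: ~10 lines per day at ~16 bits of information per line.
coding_rate = 10 * 16                     # 160 bits per day

print(f"evolution: {evolution_rate:.1f} bits/day, coding: {coding_rate} bits/day")
```

Even with generous assumptions, evolution transmits under a bit per day while a programmer manages a couple of hundred.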

Some archaea evolved rotating propellers (the archaeal flagellum), but no
living thing evolved wheels. Evolution is not a universal learning algorithm
because its search operations on DNA are limited to point mutations (SNPs),
cuts, and pastes.

Hutter proved that AIXI is optimal for reinforcement learning but not
computable. Legg extended Gödel's incompleteness theorem to show that above
a certain level of complexity, a predictor cannot be proven to work.
https://arxiv.org/abs/cs/0606070

The simple proof that there is no universal learner is as follows: suppose
your program can learn any computable bit sequence by some easy criterion
you choose, say anything less than a 100% error rate. Then I can compute a
sequence that your program cannot predict: my program runs your program and
outputs the opposite bit.
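The diagonalization is short enough to write out. This sketch assumes the learner is any deterministic function from the bits seen so far to a predicted next bit:

```python
# Diagonalization: an adversary that defeats any proposed predictor.
# 'learner' is any program that, given the bits so far, predicts the next bit.

def adversary(learner, n):
    """Compute an n-bit sequence that the given learner always mispredicts."""
    sequence = []
    for _ in range(n):
        prediction = learner(sequence)   # run the learner on the history
        sequence.append(1 - prediction)  # output the opposite bit
    return sequence

# Example victim: a learner that predicts the majority bit seen so far.
def majority_learner(history):
    return 1 if sum(history) * 2 >= len(history) else 0

seq = adversary(majority_learner, 8)
errors = sum(majority_learner(seq[:i]) != seq[i] for i in range(len(seq)))
print(errors, len(seq))  # 8 8 -- a 100% error rate on its own diagonal sequence
```

The same construction works against any computable learner you substitute for `majority_learner`, which is the whole point: the "easy criterion" of beating a 100% error rate is already unachievable universally.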

AGI is a really hard problem, but most people don't realize just how
incredibly hard it is. It's going to take decades of global effort, whole
new technologies like molecular computing, and a willingness to give up
privacy to produce computational models of your mind. If your life's work
produces no AGI, it's only because you underestimated the cost by a factor
of a billion.



On Wed, Jun 19, 2019, 1:14 PM martin biehl <[email protected]> wrote:

> Hi Matt,
>
> I am always intrigued by those numbers, do you have a paper on this or
> another source? I may have missed it at some point in the past. Also,
> didn't evolution evolve wheels? Why and where would you draw a line?
>
> Best,
> Martin
>
>
> On Thu, Jun 20, 2019 at 12:02 AM Matt Mahoney <[email protected]>
> wrote:
>
>> Not impressed. The paper lacks an experimental results section.
>>
>> The paper proposes learning how to learn AI algorithms. Since Legg and
>> Hutter proved that there is no such thing as a simple, universal learning
>> algorithm, something more than someone's idea is needed.
>>
>> Half of human knowledge is learned and half is inherited (10^9 bits
>> each). The fastest way to code the inherited half is to write on the order
>> of 100 million lines of code at a cost of $100 per line. The
>> alternative, evolution, is often cited as a simple, universal learner but
>> it is not universal (we did not evolve wheels), nor is it computationally
>> feasible. It cost 10^48 DNA base copy operations to write our own source
>> code.
>>
>> On Wed, Jun 19, 2019, 2:51 AM Junyan Xu <[email protected]> wrote:
>>
>>> Jeff Clune: https://twitter.com/jeffclune/status/1128327656401850369
>>>

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf33072618c7254e4-M04542b10d6ac77ac7424180a
Delivery options: https://agi.topicbox.com/groups/agi/subscription