On Fri, Sep 26, 2014 at 7:45 AM, Tim Tyler via AGI <[email protected]> wrote:
> On 25/09/2014 22:22, Matt Mahoney via AGI wrote:
>
>> 1. Intelligence depends on knowledge and computing power. A program
>> that rewrites itself cannot gain Kolmogorov complexity. Therefore,
>> self improvement will come from acquiring hardware and learning from
>> the environment.
>
> One problem with this is that short programs can clearly
> produce high Kolmogorov complexity if given enough runtime.
> Consider a simple counter. Run it for long enough and it
> will count from 1 to:
>
> 154998464670145002987498798679541316549415641078895144891513.
>
> This is a sequence with considerable Kolmogorov complexity.

The growth rate is only log(t): the counter's output after t steps is
fully specified by the fixed counter program plus the number t, which
takes about log2(t) bits. And if you have to specify how long it will
run by giving t as an input, then there is no growth at all: the
output is never more complex than the input plus a constant.
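
A quick sketch of that bound in plain Python (the counter program
itself is left abstract; only the size of its input t matters):

```python
# The output of a counter run until it reaches t is specified by the fixed
# counter program plus t itself, so its Kolmogorov complexity grows only
# as log2(t) bits, even though the count sequence itself is enormous.
t = 154998464670145002987498798679541316549415641078895144891513

bits_to_specify_t = t.bit_length()  # ceil(log2(t)) bits name the stopping time
print(bits_to_specify_t)  # 197 bits suffice to describe a 60-digit count
```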

> We have compact specifications of intelligence - such as AIXI.
> Expanding these short descriptions into practical agents is
> possible, but time consuming - due to the need to search a
> large search space in order to find them.

To be precise, intelligence = log(knowledge) + log(computing power),
where intelligence is measured in dollars per hour, knowledge in bits,
and computing power in operations per second times bits of memory. As
supporting evidence, I offer the roughly linear rate of economic
growth over the last few centuries (relative to the price of food),
compared to the exponential growth of both knowledge and computing
power, which double every few years, just as the log formula predicts.
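
The shape of that claim can be sketched numerically (the starting
values below are arbitrary, chosen only for illustration):

```python
import math

# intelligence = log(knowledge) + log(computing power), per the formula above.
def intelligence(knowledge_bits, computing_power):
    return math.log2(knowledge_bits) + math.log2(computing_power)

# Double knowledge and computing power each period: intelligence then grows
# by a constant 2 per period, i.e. exponential inputs give linear output.
vals = [intelligence(1e12 * 2**n, 1e15 * 2**n) for n in range(5)]
diffs = [b - a for a, b in zip(vals, vals[1:])]
print([round(d, 6) for d in diffs])  # [2.0, 2.0, 2.0, 2.0]
```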

Powerful theoretical agents like AIXI on a Turing machine have
infinite computing power and therefore have infinite intelligence with
no need for prior knowledge. But since the universe has finite
computing capacity, we don't build AI this way. Instead we face a
three-way trade-off among processing speed, memory, and program
complexity.

For example, evolution is at the extreme end of low knowledge. It
produced human civilization using a simple fitness function, but it
required 10^48 DNA base operations and 10^50 amino acid operations on
10^37 bits of DNA memory, consuming 10^16 watts of solar power for 3.5
billion years at close to the thermodynamic limit of energy
efficiency. We could speed this up by a factor of 10^9 by building a
Dyson sphere out to 100 AU (where cooling to 30K would reduce power by
90%). Anything much beyond this requires either interstellar travel or
speeding up the Sun's rate of fusion.
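
A back-of-envelope check of the 10^9 figure, taking the standard solar
luminosity together with the 10^16 W budget and 90% cooling overhead
quoted above:

```python
# Rough check of the ~10^9 speedup figure. The solar luminosity is the
# standard value; the 10^16 W and the 90% cooling cost come from the text.
solar_luminosity = 3.8e26   # watts, total solar output captured by the sphere
biosphere_power = 1e16      # watts, evolution's power budget quoted above
usable_fraction = 0.10      # cooling to 30 K at 100 AU costs 90% of the power

speedup = solar_luminosity / biosphere_power * usable_fraction
print(f"{speedup:.1e}")  # 3.8e+09, on the order of 10^9
```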

But we won't build AGI that way when we can build something more
complex that requires less computation. IBM did not build Watson as a
general-purpose learner, say, a genetic algorithm with Jeopardy!
scores as its fitness function. Rather, Watson was a complex program
(a 30 person-year effort) without an explicit goal, running on a few
thousand processors. Since the costs of knowledge and computing power
both scale nearly linearly, the formula for intelligence suggests
spending roughly equal amounts on each, which is what IBM did.
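
The equal-spending conclusion can be sketched as a toy optimization;
the budget and unit prices below are made up, and near-linear costs
are the assumption stated above:

```python
import math

# Maximize log(K) + log(C) subject to a fixed dollar budget with linear
# prices. The optimum splits the budget 50/50 in dollars, even when the
# unit prices of knowledge and computing differ.
B = 1_000_000.0
price_k, price_c = 2.0, 1.0   # assumed (and deliberately unequal) unit costs

def score(k_dollars):
    # intelligence as a function of how many dollars go to knowledge
    return math.log(k_dollars / price_k) + math.log((B - k_dollars) / price_c)

best_k = max(range(1, 1_000_000), key=score)
print(best_k / B)  # 0.5: spend half the dollars on each
```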

Likewise, the most intelligent computing system in the world, the
internet, does not have a goal either. Reinforcement learning is slow
because the information content of the reward signal is low: a scalar
win-or-lose signal conveys at most a few bits per trial. It is much
faster to teach a system using words.
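
A rough illustration of the gap, with all of the rates below assumed
for the sake of argument:

```python
# Illustrative, assumed rates: a binary reward carries about one bit per
# trial, while text carries several bits per word.
trials_per_minute = 1
reward_bits_per_minute = trials_per_minute * 1   # ~1 bit per win/lose signal

words_per_minute = 250                           # typical reading speed
bits_per_word = 10                               # rough entropy of English
text_bits_per_minute = words_per_minute * bits_per_word

print(text_bits_per_minute // reward_bits_per_minute)  # words win by ~2500x
```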

This has implications for an intelligence explosion. I described the
limits on computing power above. The exponential growth of knowledge,
meanwhile, is really just the result of the growth of the technology
needed to store and transmit it: language, writing, the printing
press, the telegraph,
telephone, radio, television, computers, and internet. Now we are
close to collecting all of the 10^17 bits of human knowledge stored in
our collective brains and making it instantly available to everyone.
Then where will additional knowledge come from? No agent can make
another agent that knows more than itself. Evolution is notoriously
slow, transmitting only about one bit per generation to the genome. We
can collect enormous amounts of data from sensors, but to what use can
it be put if our brains lack the speed and memory to absorb it?

-- 
-- Matt Mahoney, [email protected]


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now