On 28/10/2014 23:11, Matt Mahoney via AGI wrote:
On Tue, Oct 28, 2014 at 6:34 AM, Tim Tyler via AGI <[email protected]> wrote:
On 27/10/2014 21:22, Matt Mahoney via AGI wrote:

My estimate is based on the information content of the human genome.
The interesting thing about this is that the human genome is not much
bigger than that of an insect. Thus, the major distinction between
"insect level intelligence" and "human level intelligence" is
computing power.
That's assuming that all the relevant programming is stored in
DNA genes. However our cultural programming should count for
something.
It counts for half. The human genome and human long term memory each
have an information content on the order of 10^9 bits.

Right. Anything written down counts as already having been transferred into the
domain of artifacts, I suppose.

There are simple general learning algorithms.
No there aren't. Suppose you have a bit prediction algorithm whose
source code is n bits long. Then I can create a bit sequence with
approximately the same Kolmogorov complexity that your algorithm can't
predict. My program runs your program and outputs the opposite of
whatever it predicts.
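For concreteness, that diagonalization argument can be sketched in a few lines of Python. The majority-vote predictor below is just an illustrative stand-in, not any particular algorithm; the point is that the construction works against *any* predictor you pass in:

```python
def adversarial_sequence(predictor, length):
    """Build a bit sequence the given predictor gets wrong on every bit.

    `predictor` is any function mapping the bits seen so far to a
    predicted next bit (0 or 1); we simply emit the opposite bit.
    """
    seq = []
    for _ in range(length):
        guess = predictor(seq)
        seq.append(1 - guess)  # always output the opposite of the prediction
    return seq

# Illustrative predictor: guess the majority bit seen so far (0 on ties).
def majority_predictor(bits):
    return 1 if sum(bits) * 2 > len(bits) else 0

seq = adversarial_sequence(majority_predictor, 100)
# majority_predictor mispredicts every single bit of seq by construction.
```

The adversary's program is only slightly longer than the predictor's source, which is why the resulting sequence has roughly the same Kolmogorov complexity.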

We are just using "general" to mean different things here. I was more contrasting
domain-general learning algorithms with domain-specific learning algorithms.
Neural networks, genetic algorithms and reinforcement learning are simple and
general ideas. That isn't to say that some particular neural network can learn
to predict any sequence it could possibly be fed.

Machine evolution is already going thousands of times faster than DNA
evolution managed (according to Moravec). If machine intelligence today
is somewhere around the insect stage, it seems plausible that once it
can better contribute to its own development, we might see a broadly
similar hike in the rate of progress.
Not really. Machines are doubling in computing capacity every 1.5
years. However, a bacterial colony doubles its molecular computing
capacity every 20 minutes.
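The arithmetic behind that comparison is easy to check (using the 1.5-year and 20-minute figures as stated; the exact ratio depends on those assumptions):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Machines: one doubling every 1.5 years (the stated Moore's-law-style rate).
machine_doublings_per_year = 1 / 1.5

# Bacteria: one doubling every 20 minutes.
bacteria_doublings_per_year = SECONDS_PER_YEAR / (20 * 60)

print(machine_doublings_per_year)   # ~0.67 doublings/year
print(bacteria_doublings_per_year)  # ~26,298 doublings/year
print(bacteria_doublings_per_year / machine_doublings_per_year)  # ~39,000x
```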

The apparent rapid progress in machine evolution is due to the
transfer of human knowledge to machines (design, coding, training).

Really? You think machine progress will slow down once humans have been replaced
by their artificial offspring? That is surely quite an unconventional view.
Most seem to think that larger population sizes, faster computation, better
networking and so on are more likely to lead to a rapid upward climb to the
limits of what's possible.

Once that is complete (insects are not far below humans), the speed of
evolution will revert to its natural limit of one bit of Kolmogorov
complexity per generation (population doubling and fitness selection).

One bit per generation?!? I thought we already debunked that figure.
There is no such limit.


Other than collecting solar power from space, physics only allows us
to develop marginally more efficient technologies. For example, solar
panels are already more efficient than chlorophyll. At most, we can
build a Dyson sphere to capture all of the sun's 3.8 x 10^26 W. We can
increase its radius to several thousand AU where it would be cooled to
near CMB equilibrium (3 K) to lower the thermodynamic limit to about
10^-22 J per bit operation. This would allow 10^48 operations per
second, reducing the simulation time of human evolution to a few
minutes (plus several months to communicate the result across the
sphere at the speed of light). Anything beyond this will require
either interstellar travel, speeding up the rate at which the sun
burns hydrogen, or direct matter to energy conversion using a black
hole.
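The order-of-magnitude figures in that paragraph follow from the Landauer limit (k_B T ln 2 per irreversible bit operation) applied to the sun's total output at a 3 K operating temperature; a quick check, using those stated inputs:

```python
import math

K_B = 1.380649e-23       # Boltzmann constant, J/K
T = 3.0                  # K, near CMB equilibrium temperature
SOLAR_OUTPUT = 3.8e26    # W, total power output of the sun

# Landauer limit: minimum energy to erase one bit at temperature T.
landauer_limit = K_B * T * math.log(2)
ops_per_second = SOLAR_OUTPUT / landauer_limit

print(landauer_limit)    # ~2.9e-23 J/bit, consistent with the ~10^-22 J figure
print(ops_per_second)    # ~1.3e49, i.e. on the order of 10^48 ops/s
```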

The characterisation of what's possible from solar power as
"marginally more efficient technologies" seems curious to me -
considering that you are talking about reducing the simulation
time of human evolution to a few minutes in the same paragraph.

As you say there are more stars, and more sources of power
besides burning hydrogen.  The limits seem to be too far off
to merit much discussion at this stage.
--
__________
 |im |yler  http://timtyler.org/  [email protected]  Remove lock to reply.


