# Re: The physical limits of computation

`On Sat, Jan 20, 2024 at 7:27 PM Brent Meeker <meekerbr...@gmail.com> wrote:`
> *The problem with this is that information, like complexity, has no physically definite operational meaning. You can't go into the lab and ask what's the information content of "this".*
In 1948 Claude Shannon gave us an operational definition of information,
the amount of uncertainty reduced by a message, and it is measured in bits.
There is also a thermodynamic definition for information, the amount of
entropy that is reduced in a given system, and it is also measured in bits.
The two definitions work harmoniously together.
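Shannon's definition can be sketched in a few lines of Python (a minimal illustration added here, not part of the original exchange; the `shannon_entropy` helper is just a name for the standard formula):

```python
import math

# Shannon entropy: the average uncertainty, in bits, removed when
# one symbol from the source is received.
def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of information.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A DNA base drawn uniformly from 4 possibilities carries 2 bits.
print(shannon_entropy([0.25] * 4))    # 2.0
```

The thermodynamic definition plugs into the same formula: reducing a system's entropy by one bit means halving the number of equally likely microstates it could be in.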

So if you know the encoding algorithm you can always determine how much
information something has, or at least the maximum amount of information a
message has the potential to hold. For example, we know from experiment
that the human genome contains 3 billion base pairs, and we know there are
4 bases, so each base can represent 2 bits and there are 8 bits per byte;
therefore the entire human genome only has the capacity to hold 750 MB of
information; that's about the amount of information you could fit on an
old-fashioned CD, not a DVD, just a CD. The true number must be
considerably less than that, because the human genome contains a huge
amount of redundancy; 750 MB is just the upper bound. Incidentally, that's
why I now think the singularity is likely to happen sometime within the
next 5 years; one year ago, before it became obvious that a computer had
passed the Turing Test, I would have said 20 to 30 years.
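The genome arithmetic above works out as follows (a back-of-the-envelope check, using the round numbers from the text):

```python
# Upper bound on the information capacity of the human genome.
base_pairs = 3_000_000_000        # ~3 billion base pairs
bits_per_base = 2                 # 4 possible bases -> log2(4) = 2 bits
total_bits = base_pairs * bits_per_base
total_bytes = total_bits // 8     # 8 bits per byte
megabytes = total_bytes // 1_000_000
print(megabytes)                  # 750
```

Note this is capacity, not content: because of the redundancy mentioned above, the compressed size of the genome's actual information is smaller still.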

I think we can be as certain as we can be of anything that it should be
possible to build a seed AI that can grow from knowing nothing to being
super-intelligent, and the recipe for building such a thing must be less
than 750 MB, a *LOT* less. After all, Albert Einstein went from
understanding precisely nothing in 1879 to being the first man to
understand General Relativity in 1915, and the human genome only contains
750 megs of information, and yet that is enough information to construct an
entire human being, not just a brain. So whatever algorithm Einstein used
to extract information from his environment, it must have been pretty
simple, its description much, much smaller than 750 megs. That's why I've
been saying for years that super-intelligence could be achieved just by
scaling things up, no new scientific discovery needed, just better
engineering; although I admit I was surprised how little scaling up turned
out to be required.

John K Clark    See what's on my new list at  Extropolis