# Re: The physical limits of computation

On 1/21/2024 5:15 AM, John Clark wrote:

```
On Sat, Jan 20, 2024 at 7:27 PM Brent Meeker <meekerbr...@gmail.com> wrote:

> The problem with this is that information, like complexity, has no
> physically definite operational meaning. You can't go into the lab
> and ask what's the information content of "this".

In 1948 Claude Shannon gave us an operational definition of information, the amount of uncertainty reduced by a message, and it is measured in bits.
```

And Shannon's definition requires that the possible messages be predefined.
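Shannon's measure can be illustrated in a few lines. This is a sketch, not part of the thread; note that the number it produces depends entirely on the predefined set of possible messages and their probabilities, which is exactly the dependence Brent points out.

```python
from math import log2

def entropy_bits(probs):
    """Shannon entropy: the average uncertainty of a source, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair 8-sided die: learning the outcome removes log2(8) = 3 bits
# of uncertainty, so the message "it landed on 5" carries 3 bits.
uniform8 = [1/8] * 8
print(entropy_bits(uniform8))  # 3.0

# A biased coin carries less than 1 bit per toss, because the
# outcome is partly predictable before the message arrives.
biased = [0.9, 0.1]
print(entropy_bits(biased))  # ≈ 0.469
```

Change the assumed message set and the same physical event carries a different number of bits, so the measure is operational only once the ensemble is fixed.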
```
There is also a thermodynamic definition for information, the amount of entropy that is reduced in a given system, and it is also measured in bits. The two definitions work harmoniously together.
```

Again, the thermodynamic definition depends on what variables will be ignored.
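The conversion between thermodynamic entropy and bits is not spelled out in the thread; the standard bridge (an assumption added here for illustration) is that one bit corresponds to an entropy of k_B·ln 2, which also gives Landauer's minimum energy cost of erasing a bit:

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)

def entropy_reduction_bits(delta_S):
    """Convert a thermodynamic entropy change (in J/K) into bits,
    assuming the standard identification of k_B * ln(2) per bit."""
    return delta_S / (k_B * log(2))

# Landauer's bound: erasing one bit at temperature T dissipates
# at least k_B * T * ln(2) joules. At room temperature (300 K):
T = 300.0
print(k_B * T * log(2))  # ≈ 2.87e-21 J per bit
```

Which entropy change you plug in depends on the coarse-graining chosen, i.e. on which variables are ignored, as the reply above notes.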
```
So if you know the encoding algorithm you can always determine how much information something has, or at least the maximum amount of information a message has the potential to hold. For example, we know from experiment that the human genome contains 3 billion base pairs, and we know there are 4 bases, so each base can represent 2 bits, and there are 8 bits per byte; therefore the entire human genome only has the capacity to hold 750 MB of information; that's about the amount of information you could fit on an old-fashioned CD, not a DVD, just a CD. The true number must be considerably less than that because the human genome contains a huge amount of redundancy; 750 MB is just the upper bound. Incidentally, that's why I now think the singularity is likely to happen sometime within the next 5 years; one year ago, before it became obvious that a computer had passed the Turing Test, I would've said 20 to 30 years.
```

A good example, proving my point. A lot, maybe even a majority, of the human genome is junk and doesn't code for anything, and you can only know this by seeing how it interacts in a cell. Its "information" is context dependent, not inherent.
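The back-of-the-envelope arithmetic behind the 750 MB figure quoted above works out as follows (using decimal megabytes, as the thread does):

```python
base_pairs = 3_000_000_000  # human genome length, from the thread
bits_per_base = 2           # 4 possible bases -> log2(4) = 2 bits each

total_bits = base_pairs * bits_per_base      # 6 billion bits
megabytes = total_bits / 8 / 1_000_000       # 8 bits per byte
print(megabytes)  # 750.0
```

This is a capacity, not a content: redundancy and non-coding sequence mean the genome's actual information content is strictly below this upper bound.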

```
I think we can be as certain as we can be certain of anything that it should be possible to build a seed AI that can grow from knowing nothing to being super-intelligent, and the recipe for building such a thing must be less than 750 MB, a *LOT* less.
```

It takes a womb and all that is needed to support a womb.

```
After all, Albert Einstein went from understanding precisely nothing in 1879 to being the first man to understand General Relativity in 1915,
```

He understood general relativity by absorbing information from Minkowski, Riemann, Maxwell, Lorentz, and Grossmann... not just from his genome.
Brent

```
and the human genome only contains 750 megs of information, and yet that is enough information to construct an entire human being, not just a brain. So whatever algorithm Einstein used to extract information from his environment must have been pretty simple, much much less than 750 megs. That's why I've been saying for years that super-intelligence could be achieved just by scaling things up; no new scientific discovery was needed, just better engineering, although I admit I was surprised by how little scaling up turned out to be required.
```
```
John K Clark    See what's on my new list at Extropolis <https://groups.google.com/g/extropolis>
```

--
You received this message because you are subscribed to the Google Groups "Everything List" group. To unsubscribe from this group and stop receiving emails from it, send an email to everything-list+unsubscr...@googlegroups.com. To view this discussion on the web visit https://groups.google.com/d/msgid/everything-list/CAJPayv31LubK_sNn6tRspUwfjRqvOHtf2dTDcr%2B96xBAhQmkRQ%40mail.gmail.com