I'm struggling with processing so much waffle.

Take an article or discussion, any one, say the one below for example, and try 
to apply it coherently to the AGI space. Then add in the complexities of the 
Einstein-Rosen bridge (wormholes, black holes, etc.) and have some fun with it.

My two cents' worth is that if it ain't a wholly independent system, it ain't a 
singularity. If I understood some of it correctly, Kurzweil may well have been 
sharing the notion that when a critical mass in a system assumes control, it 
could theoretically find a way to steam on all by itself. His contention was 
that this would become the case for intelligence-based computational platforms.

https://www.einstein-online.info/en/spotlight/singularities/
Spacetime singularities « Einstein-Online
"Perhaps the most drastic consequence of Einstein's description of gravity in 
terms of curved spacetime geometry in the framework of his general theory of 
relativity is the possibility that space and time may exhibit 'holes' or 
'edges': spacetime singularities."


________________________________
From: Matt Mahoney <[email protected]>
Sent: Wednesday, 10 March 2021 16:11
To: AGI <[email protected]>
Subject: Re: [agi] Patterns of Cognition



On Tue, Mar 9, 2021, 4:12 PM WriterOfMinds <[email protected]> wrote:
Then perhaps defining your terms, and maintaining awareness of how other people 
define them, would be helpful? I'm pretty sure we've had the discussion about 
the popular/futurist definition of "singularity" being different from the 
mathematical definition before, and you persist in acting as if other people 
must be using the mathematical definition.

It seems that Good and Vinge do use "singularity" in the mathematical sense, 
although that actually prevents us from predicting one; Vinge calls it an 
"event horizon on the future". Good doesn't say what happens after the 
"intelligence explosion". Kurzweil projects faster-than-exponential growth in 
computing power until the 2040s, when computers surpass brains, but makes no 
prediction afterwards as to whether growth will slow down, continue forever, or 
grow hyperbolically to a point.
https://en.m.wikipedia.org/wiki/Technological_singularity
https://en.m.wikipedia.org/wiki/Technological_singularity

If it does slow down, as I argue it eventually must in a finite universe, what 
should we call it? How about the inflection point in Moore's Law? We might have 
already reached it. Clock speeds stalled in 2010. Transistors can't be smaller 
than the spacing between dopant atoms, a few nm, and we are close to that now. 
We could reduce power consumption by a factor of a billion using 
nanotechnology, but can we develop it fast enough to keep doubling global 
computing power every 1.5 years?
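A quick arithmetic sketch of those two numbers together (the 1.5-year doubling 
cadence and the factor-of-a-billion power reduction are taken from the 
paragraph above): a billion-fold gain is about 30 doublings, so nanotechnology 
would buy roughly 45 more years of Moore's Law at the historical pace.

```python
import math

DOUBLING_PERIOD_YEARS = 1.5   # historical Moore's Law cadence
POWER_REDUCTION = 1e9         # hoped-for gain from nanotechnology

doublings = math.log2(POWER_REDUCTION)     # ~29.9 doublings
years = doublings * DOUBLING_PERIOD_YEARS  # ~44.8 years

print(f"{doublings:.1f} doublings = about {years:.0f} years of Moore's Law")
```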

Global energy production is 15 TW, or 2 kW per person. Human metabolism is 5% 
of that. The biosphere converts sunlight to food using 500 TW out of the 90,000 
TW available at the Earth's surface, 160,000 TW in the stratosphere or low 
Earth orbit, or 384 trillion TW if we build a Dyson sphere. That would give us 
10^48 irreversible bit operations per second at the Landauer limit at the CMB 
temperature of 3 K, enough to simulate 3 billion years of evolution on 10^37 
bits of DNA in a few minutes on a Dyson sphere with a radius of 10,000 AU. A 
naive projection of Moore's Law says that will happen around 2160, after 
nanotechnology displaces DNA-based life in the 2080s. Actually building the 
sphere is feasible because the Sun produces enough energy to lift all of 
Earth's mass into space in about a week.
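The Landauer-limit figure can be reproduced in a few lines. This is only a 
back-of-envelope sketch assuming the full 384 trillion TW is captured and 
computation runs at the 3 K CMB temperature; it lands around 10^49 bit 
erasures per second, within an order of magnitude of the 10^48 quoted above 
(the exact exponent depends on the assumed temperature and captured fraction).

```python
import math

K_B = 1.380649e-23              # Boltzmann constant, J/K
T_CMB = 3.0                     # approximate CMB temperature, K
DYSON_POWER_W = 384e12 * 1e12   # 384 trillion TW = total solar output, W

# Landauer limit: minimum energy to erase one bit at temperature T
e_bit = K_B * T_CMB * math.log(2)       # ~2.9e-23 J per bit

ops_per_second = DYSON_POWER_W / e_bit  # ~1.3e49 irreversible ops/s
print(f"{ops_per_second:.1e} bit erasures per second")
```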

After that, our options are interstellar travel or speeding up the Sun's output 
using a black hole. Ultimately we are confronted with a finite 10^53 kg 
universe that can only support 10^120 quantum operations and 10^90 bit writes. 
At what point do we call it a singularity?
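The 10^120 figure comes from applying the Margolus-Levitin bound to the 
universe's total mass-energy over its age (Lloyd's "computational capacity of 
the universe" argument). A naive sketch, assuming the 10^53 kg from the 
paragraph above and an age of 13.8 billion years, lands within an order of 
magnitude of that bound:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 2.998e8              # speed of light, m/s
MASS_KG = 1e53           # assumed mass-energy content of the universe
AGE_S = 4.35e17          # ~13.8 billion years in seconds

energy = MASS_KG * C ** 2                    # total mass-energy, ~9e69 J
ops = 2 * energy * AGE_S / (math.pi * HBAR)  # Margolus-Levitin bound

print(f"~10^{math.log10(ops):.0f} elementary quantum operations")
```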

Artificial General Intelligence List <https://agi.topicbox.com/latest> / AGI
Permalink: https://agi.topicbox.com/groups/agi/Ta5ed5d0d0e4de96d-M79cf5b7f853361f839fceba2

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Ta5ed5d0d0e4de96d-M44187a7a3769c4d5df2d10b3
Delivery options: https://agi.topicbox.com/groups/agi/subscription
