AGI will be a slow takeoff because:

1. Fast takeoff assumes that AGI crosses a threshold of human
intelligence and then starts improving itself. But no such threshold
exists; it depends on how you measure intelligence. Computers are
already a billion times better than humans on tests of arithmetic and
short-term memory (a rough check of that figure follows below), yet
worse on many other tests. Meanwhile, computers keep improving on
every test we devise.

2. Moore's law is slowing because we can't make transistors smaller
than atoms; they are already near the limit set by the spacing
between silicon doping atoms, and clock speeds stalled around 2005.
Reducing power consumption to the level of the human brain will
require nanotechnology that computes by moving atoms instead of
electrons. We don't know when that technology will be developed, but
even at the rate of Moore's law, doubling world computing power every
2-3 years, it would take about a century to match the 10^37 bits of
DNA storage and 10^31 amino acid transcription operations per second
of the biosphere (a back-of-the-envelope check follows below).
Quantum computing can't save us, because quantum speedups require
reversible computation and neural networks are not time reversible.

3. Population is declining in most of the wealthier countries where
AGI development is occurring, shrinking the pool of researchers and
engineers who would drive it.
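
A rough check of the "billion times" figure in point 1. The
throughput numbers here are my own order-of-magnitude guesses, not
measurements:

    # Guesses: a CPU core does ~1e9 integer additions per second; a
    # person doing mental arithmetic manages roughly 1 per second.
    cpu_adds_per_sec = 1e9
    human_adds_per_sec = 1.0
    print(cpu_adds_per_sec / human_adds_per_sec)  # ~1e9, a billion to one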
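
And a sanity check on the "about a century" estimate in point 2. The
starting point of ~1e23 bits of world digital storage (order of 10
zettabytes) is an assumption on my part; the 1e37-bit target is the
DNA storage figure above:

    import math

    current_bits = 1e23  # assumed world digital storage today, ~10 zettabytes
    target_bits = 1e37   # biosphere DNA storage figure from point 2
    doublings = math.log2(target_bits / current_bits)  # ~46.5 doublings
    print(doublings * 2, "to", doublings * 3, "years")  # ~93 to ~140 years

At 2-3 years per doubling, roughly 46 doublings lands in the 93-140
year range, i.e. about a century.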

On Mon, Mar 25, 2024 at 3:40 PM Alan Grimes via AGI
<[email protected]> wrote:
> 
> Ok, we have been in para-singularity mode for about a year now. What are
> the next steps?
> 
> I see two possibilities:
> 
> A. AGI cometh.  AGI is solved in an unambiguous way.
> 
> B. We enter a "takeoff" scenario where humans are removed from the
> upgrade cycle of AI hardware and software. We would start getting better
> hardware platforms and AI tools at some non-zero rate with non-zero
> improvements without doing anything... How far this could proceed
> without achieving AGI as a side-effect is unclear, as our human general
> intelligence appears to be an effect of the evolution-based improvement
> process that created us. At some point even a relatively blind
> optimization process would discover the principles required for
> consciousness et al...
> 
> In any event it's time to get this party started... We are teetering on
> the edge of socioeconomic collapse and probably won't get another chance
> at this within my lifetime. =|
> 
> --
> You can't out-crazy a Democrat.
> #EggCrisis  #BlackWinter
> White is the new Kulak.
> Powers are not rights.
> 



-- 
-- Matt Mahoney, [email protected]
