On Mon, Feb 24, 2020, 12:00 PM James Bowery <[email protected]> wrote:

>
> "The singularity" is a joke Heinz von Foerster played on Science magazine
> <https://www.researchgate.net/publication/9785147_Doomsday_Friday_13_November_AD_2026_At_this_date_human_population_will_approach_infinity_if_it_grows_as_it_has_grown_in_the_last_two_millenia>
> .
>

The world population will go to infinity on Friday, Nov. 13, 2026, plus or
minus 5.5 years. That was the 1960 prediction, based on extrapolating
population growth through 1958. On that trend, our future is not starvation
but rather being squeezed to death.
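For the curious, the extrapolation can be sketched in a few lines. This is a sketch using the constants I understand von Foerster fit in the 1960 Science paper (N(t) = C / (t0 - t)^k with t0 = 2026.87); treat the exact values as an assumption:

```python
# Von Foerster's hyperbolic "doomsday" fit (constants assumed from his
# 1960 Science paper): N(t) = C / (t0 - t)^k, which diverges as t -> t0.
C = 1.79e11   # fitted constant, persons * years^k
t0 = 2026.87  # the "doomsday" year (Friday, 13 Nov 2026)
k = 0.99      # fitted exponent

def population(t):
    """Hyperbolic extrapolation of world population at year t (t < t0)."""
    return C / (t0 - t) ** k

# The fit roughly reproduces the ~3 billion people of 1960,
# then blows up as t approaches 2026.87.
print(round(population(1960) / 1e9, 1))  # billions in 1960
print(round(population(2020) / 1e9, 1))  # billions in 2020 -- already absurd
```

The point of the exercise is that the curve matches the historical data well and is still obviously wrong as a forecast.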

Of course we now know that people started choosing to have fewer children.
And even if they hadn't, we knew even then that there are other biological
limits besides food supply.

Projections of the Singularity are based on extrapolating Moore's Law,
which BTW is ending. Von Foerster's paper is a warning not to read too much
into extrapolated trends, even though extrapolation is still the most
reliable way to predict the future. He points out that, run the other way,
the same fit implies the world population was one person about 200 billion
years ago.
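The backward extrapolation is one line of arithmetic: solve N(t) = 1 for t. Again assuming the constants from the paper (C = 1.79e11, k = 0.99):

```python
# Running von Foerster's hyperbolic fit backwards: N(t) = C / (t0 - t)^k,
# so N = 1 when (t0 - t) = C^(1/k). Constants assumed from the 1960 paper.
C, t0, k = 1.79e11, 2026.87, 0.99

years_before_doomsday = C ** (1 / k)  # (t0 - t) at which N(t) = 1
print(f"{years_before_doomsday:.1e} years")  # on the order of 2e11 years
```

About 2 x 10^11 years, i.e. the fit "predicts" a lone ancestor some 200-odd billion years before the universe existed, which is the joke.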

I know there are other supporting arguments, just as there are for every
theory, no matter how bizarre. Once AGI surpasses human-level intelligence,
all bets are off. Never mind that computers can already think a billion
times faster than you with no mistakes. That doesn't count.

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tab97f0f82cc3442a-Mb04b80984c8287ba4d222f61
Delivery options: https://agi.topicbox.com/groups/agi/subscription