This is something that fits in with what I wrote here some time ago
about insect-level AI taking over from us.
A system with AGI doesn't have to be all that intelligent to be
extremely useful. Today we cannot build a remotely controlled spider
that could survive in Nature. The little intelligence a spider has is
the general intelligence it needs to take on the challenges of
surviving. If we had something similar, say spider-level AGI, that would
be good enough to fully automate our entire economy. The reason you
can't replace all factory workers with machines is the lack of even a
minimal amount of AGI.
So I think insect-level AGI will cause a rapid transition to a machine
civilization. This will lead to a new biology of machines with
insect-level intelligence that ends up wiping out all life on Earth
through pollution, similar to the Great Oxidation Event:
https://en.wikipedia.org/wiki/Great_Oxidation_Event
And as I pointed out earlier, I think this is a universal phenomenon
that all intelligent life is subject to. The whole point of being
intelligent is to let as much of the work as possible be done for you by
entities that are dumber than you. But in that process, which leads to
faster and faster economic growth, it's inevitable that at some point
you are going to create autonomous systems that will grow exponentially.
The point where the transition to artificial life starts is going to be
close to the minimum intelligence level needed for exponential growth.
If you make it hotter and hotter in some closed space, a fire will break
out, and it will break out close to the minimum temperature required for
ignition, not at some extremely high temperature.
Nature shows us that the minimum amount of intelligence required for
efficient self-maintenance and reproduction that yields exponential
growth is very low.
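To make that threshold picture concrete, here is a minimal toy sketch of
my own (nothing from Carmack or the earlier posts; the parameters p and
r are made-up illustrations): each machine survives a cycle with
probability p, a crude stand-in for its level of self-maintenance
intelligence, and a surviving machine builds one copy with probability
r. The expected growth factor per cycle is p*(1+r), so exponential
growth starts as soon as p*(1+r) > 1, just past the minimum capability,
like ignition just past the minimum temperature.

    # Toy threshold model: each machine survives a cycle with probability p
    # (a stand-in for self-maintenance intelligence) and, if it survives,
    # builds one copy with probability r. The expected growth factor per
    # cycle is g = p * (1 + r); the population grows exponentially once g > 1.

    def expected_population(p, r, cycles=50, n0=1000.0):
        g = p * (1.0 + r)  # expected growth factor per cycle
        return n0 * g ** cycles

    # With r = 0.5 the threshold is p = 1/1.5 ~ 0.67: slightly below it the
    # population dies out, slightly above it the population explodes.
    for p in (0.60, 0.67, 0.74):
        print(f"p = {p:.2f} -> expected population after 50 cycles: "
              f"{expected_population(p, 0.5):.0f}")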
Saibal
On 08-09-2022 14:09, John Clark wrote:
This is an interview with the great computer programmer John Carmack.
He thinks the time when computers can do everything, not just some
things, as well as or better than humans is much closer than most people
believe; he puts a 60% chance on it happening by 2030. Like me, Carmack
is much more interested in intelligence than consciousness and has no
interest in the "philosophical zombie" argument. As far as the future
history of the human race is concerned, the following quotation is
particularly relevant:
"___It seems to me this is the highest leverage moment for a single
individual potentially_ _in the history of the world._ [...] _I am
not a mad man in saying that the code for artificial General
intelligence is going to be tens of thousands of lines of code, not
millions of lines of code. This is code that conceivably one
individual could write, unliker writing a new web browser or operating
system._"
The code for AGI will be simple [1]
John K Clark
See what's on my new list at Extropolis [2]
Links:
------
[1] https://www.youtube.com/watch?v=xLi83prR5fg
[2] https://groups.google.com/g/extropolis