Despite the fact that it seems to lack a single unified consciousness,
the world of humans and their devices behaves as if it is both vastly
more intelligent and vastly more powerful than any unassisted
individual human. If you could build a machine that ran a planet all
by itself just as well as 6.7 billion people can, doing all the things
that people do as fast as people do them, then that would have to
qualify as a superintelligent AI, even if you can envisage that, with
a little tweaking, it could be made truly godlike.

The same considerations apply to me in relation to the world as apply
to an ant relative to a human, or to humanity relative to a vastly
greater AI (vastly greater than humanity, not just vastly greater than
a single human). If the world decided to crush me, there would be
nothing I could do about it, no matter how strong or fast or smart I
am. As it happens, the world is mostly indifferent to me, and some
parts of it would destroy me instantly if I got in their way: for
example, if I walked into the traffic only a few metres from where I
am sitting. But even if it wanted to help me there could be problems:
if the world decided it wanted to cater to my every command, I might
request paperclips and it might set about turning everything into
paperclip factories; or if it wanted to make me happy, it might
forcibly implant electrodes in my brain. And yet, I feel quite safe
living with this very powerful, very intelligent, potentially very
dangerous entity all around me. Should I worry more as the world's
population and technological capabilities increase further, rendering
me even weaker and more insignificant in comparison?



--
Stathis Papaioannou
