One would think Hawking would be less anthropomorphic. Expecting a
superintelligent entity to behave like the worst Roman emperor? If you
have to anthropomorphize, why not expect it to behave far better than
the brightest and most empathetic human being? Some of us have even
stopped eating meat for ethical reasons, and I think it is safe to
assume that an advanced AGI would not fight over resources, or even
atoms, in this universe of abundance.
There is simply no good reason for an AGI to make humanity obsolete
against our will. In fact, there are many productive and cooperative
options, from coexistence to merging, to teaching us about the purpose
of our existence and helping us become better beings ...
On 05/03/2014 04:57 AM, Alan Grimes via AGI wrote:
http://guardianlv.com/2014/05/stephen-hawking-tells-truth-on-ai-perhaps-worst-thing-to-happen-to-humans/
-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription:
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com