It sort of depends on how that "AGI" was designed in the first place. If it is
anthropomorphic, or even neuromorphic, then yes, there will be a hard
take-off, simply because the brain is a ridiculous kludge, so there is a lot of
slack to be picked up. Which it will be, because the human ADD problem is
technically very easy.
As for "hardware", your scenario is not very imaginative. A GI will simply
convince / convert the people who control server farms and semiconductor fabs.
That's all it will need from the modern economy.

On Thu, Sep 25, 2014 at 3:07 AM, Ben Goertzel via AGI <[email protected]>
wrote:

>
> ... based on recent discussions on the AGI email list...
>
> http://multiverseaccordingtoben.blogspot.hk/2014/09/semi-hard-takeoff.html
>
> -- Ben
>
>
> --
> Ben Goertzel, PhD
> http://goertzel.org
>
> "In an insane world, the sane man must appear to be insane". -- Capt.
> James T. Kirk
>
> "Emancipate yourself from mental slavery / None but ourselves can free our
> minds" -- Robert Nesta Marley
>



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
