Anastasios, you are making the error of arguing this logically, when at best it is illogical, and at worst diabolical, as Alan explained.
Continuing...

On Thu, Jan 3, 2013 at 5:18 PM, Anastasios Tsiolakidis <[email protected]> wrote:

> On Fri, Jan 4, 2013 at 12:01 AM, Steve Richfield <[email protected]> wrote:
>
>> WAKE UP
>
> I think those Terminator academics are wasting time and money as usual.

I see this as being MUCH more dangerous than simply wasting time and money.

> I have previously given some qualitative and quantitative scenarios that,
> to me at least, prove that AGI and the singularity are completely
> unpredictable and incomprehensible events that can go any which way.

Of course.

> In short our fate will be as dependent on intelligent machines as the
> earth biosphere's fate is dependent on our little retarded brains. I just
> happen to prefer the former :) And I'd never forgive myself if, instead,
> somebody from the Bush family dragged me and my species down. Call me
> cautiously optimistic then :)

I don't think there is any near-term danger from AGI, mostly because I don't see it happening in the near future. However, I see a LOT of near-term danger from people who accept the possibility of a near-term AGI.

Steve

-------------------------------------------
AGI Archives: https://www.listbox.com/member/archive/303/=now
