On Tue, Mar 17, 2015 at 09:33:22AM +0100, Calum Chace via AGI wrote:
> Steve
> 
> I sympathise with your very understandable preference not to be targeted by
> anti-AI crazies!
> 
> What do you think is the best way to try and shape the growing public
> debate about AGI?  Following Bostrom's book, and the comments by Hawking,
> Musk and Gates, a fair proportion of the general public is now aware that
> AGI might arrive in the medium term, and that it will have a very big
> impact.
> 
> Some AI researchers seem to be responding by saying, "Don't worry, it can't
> happen for centuries, if ever".  No doubt some of them genuinely believe
> that, but I wonder whether some are saying it in the (forlorn?) hope the
> debate will go away. It won't.  In fact I suspect that the new Avengers
> movie will kick it up a level.
> 
> Others are saying, "Don't worry, AGI cannot and will not harm humans."  To
> my mind (and I realise I may be in a small minority here on this) that is
> hard to be certain about - as Bostrom demonstrated.
> 
> Yet others are saying, "AI researchers will solve the problem long before
> AGI arrives, and it's best not to worry everyone else in the meantime."
>  That seems a dangerous approach to me.  If the public ever feels (rightly
> or wrongly) that things have been hidden from them, they will react badly.
> 
> But I do definitely sympathise with the desire not to be targeted by
> crazies, or to be vilified by journalists who have half-understood the
> situation!
> 

[...]

I would suggest reading J. Pitrat's December 2014 blog entry on that subject.
J. Pitrat is probably not subscribed to this list,
so I am blind-carbon-copying him.

http://bootstrappingartificialintelligence.fr/WordPress3/2014/12/not-developing-an-advanced-artificial-intelligence-could-spell-the-end-of-the-human-race/

He explains that

 "Not developing an advanced artificial intelligence
  could spell the end of the human race"

and I believe he has a point. Of course, AGI researchers should be careful.

Regards

-- 
Basile Starynkevitch   http://starynkevitch.net/Basile/



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
