Even if the promises were somewhat misleading, probably nobody would complain after an intelligence explosion. Just as nobody would complain if you promised them a typewriter but gave them a desktop computer and a laser printer (and a 100-fold IQ upgrade) instead. And if we all die in the process, then clearly the goal of "benevolent (superintelligent) AGI" was not reached ... ;-)

You could even make the goal explicit: "benevolent AGI that will give 5 kg of gold to every donor". Maybe that way you would reach a broader audience (one that has never come into contact with Singularitarian/Cosmist memes before).

Is it unethical to lure a cat into a box in order to take her to the vet? You couldn't explain the concept of a veterinarian to her anyway.

But maybe Ben is afraid of triggering some Luddite shitstorm in the media?

-- jc

On 04/02/2013 03:26 AM, Florent Berthet wrote:
Most Kickstarter's success stories ...


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
