Matt Mahoney writes:

> Just what do you want out of AGI? Something that thinks like a person or
> something that does what you ask it to?

I think this is an excellent question, one I do not have a clear answer to myself, even for my own use. Imagine we have an "AGI". What exactly does it do? What *should* it do?

"It does whatever we tell it" is not good enough. What would we tell it to do? And no wigged-out sci-fi allowed; you can't say "invent molecular nanotechnology and build me a Dyson sphere" -- first, because such a vision is completely unhelpful in guiding how to get there, and second, because there's no reason to think a currently-envisionable AGI would be millions of times "smarter" than all of humanity put together.