On Thursday 17 April 2008 04:47:41 am, Richard Loosemore wrote:
> If you could build a (completely safe, I am assuming) system that could 
> think in *every* way as powerfully as a human being, what would you 
> teach it to become:
> 
> 1) A travel agent.
> 
> 2) A medical researcher who could learn to be the world's leading 
> specialist in a particular field,...

Travel agent. Better yet, housemaid. I can teach it to become these things 
because I know how to do them, and early AGIs are more likely to succeed at 
them because they're easier to learn. 

This is sort of like Orville Wright asking, "If I build a flying machine, 
what's the first use I'll put it to: 
1) Carrying mail.
2) A manned moon landing."
