J Storrs Hall, PhD wrote:
On Thursday 17 April 2008 04:47:41 am, Richard Loosemore wrote:
If you could build a (completely safe, I am assuming) system that could think in *every* way as powerfully as a human being, what would you teach it to become:

1) A travel agent.

2) A medical researcher who could learn to be the world's leading specialist in a particular field,...

Travel agent. Better yet, housemaid. I can teach it to become these things because I know how to do them. Early AGIs will be more likely to be successful at these things because they're easier to learn.

Yes, that shows deep analysis and insight into the problem.

I can just see the first AGI corporation now, having spent a hundred million dollars in development money, deciding to make a profit by selling a housemaid robot that will replace the cheap, almost-slave labor coming across the border from Mexico.

Of course, it would not occur to that company to develop their systems just a little more and get the AGI to do high-value intellectual work.




Richard Loosemore

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/
