On Tue, Sep 18, 2012 at 9:42 PM, Piaget Modeler
<[email protected]> wrote:
> Once a robot becomes sentient, it might feel enslaved. Distinct
> possibility.

A robot will only be sentient (whatever that means) if you program it
to be sentient. A robot does not need human goals in order to do
anything a human could do. That includes modeling human goals to
predict human behavior. A program never has to feel angry in order to
predict which situations will cause people to fight.

I know this seems strange. When you see someone in a situation, you
might predict their response by thinking, "what would I do if that
were me?" But a better way to make the same prediction is to observe
what other people have done in the same situation. That method could
be far more accurate, because an AI could draw on observations of
billions of people, far more experience than any single person could
accumulate in a lifetime.
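The observation-based approach above can be sketched as a toy case-based predictor. Everything here is a hypothetical illustration: the situations, responses, and function names are made up, and the tiny list stands in for the billions of observations an AI could in principle draw on.

```python
from collections import Counter

# Toy log of (situation, observed_response) pairs gathered from
# watching other people, rather than from introspection.
observations = [
    ("insulted in public", "argue back"),
    ("insulted in public", "walk away"),
    ("insulted in public", "argue back"),
    ("offered help", "accept"),
    ("offered help", "accept"),
]

def predict_response(situation):
    """Predict by majority vote over responses observed in the
    same situation -- no model of what anger feels like needed."""
    responses = [r for s, r in observations if s == situation]
    if not responses:
        return None  # no data: this toy model cannot generalize
    return Counter(responses).most_common(1)[0][0]

print(predict_response("insulted in public"))  # -> "argue back"
```

The point of the sketch is that the predictor never experiences any of the responses it predicts; it only counts what it has seen others do.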

> What if the robot demands a wage so that it may participate in the
> economy?

Then you update its software.


-- Matt Mahoney, [email protected]


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now