On 04 Feb 2012, at 17:38, meekerdb wrote:

On 2/4/2012 1:17 AM, Bruno Marchal wrote:

The emotions of your laptop are unknown, and unmanifested, because your laptop has no deep persistent self-referential ability to share with you. We want a slave, and would be anxious in front of a machine taking too much independence.

Bruno

Yes, that's exactly why John McCarthy wrote that we should not provide AI programs with self-reflection and emotions, because it would create ethical problems in using them.

He is right. But making babies already does the same thing, and for economic reasons we will often gain some reward from allowing machines some degree of autonomy. By the time hand-made machine descendants are as free as us, and as entangled as us in the local computational histories, we will already be machines ourselves. Things will be different. The shadow of the measure problem solution indicates that we might, in some sense, be already there. I'm not sure.


Bruno


http://iridia.ulb.ac.be/~marchal/



--
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
