On 16 Apr 2010, at 19:07, Brent Meeker wrote:


I think intelligence in the context of a particular world requires acting within that world. Humans learn language starting with ostensive definition: (pointing) "There, that's a chair. Sit in it. That's what it's for. Move it where you want to sit." An AI that was given all the books in the world to learn from might very well learn something, but it would have a different kind of intelligence than a human's, because it developed and functions in a different context. For an AI to develop human-like intelligence, I think it would need to be in something like a robot, something capable of acting in the world.


I agree with you, unless by 'world' you mean a necessarily physical or material world. The 'robot' cannot distinguish a physical world (if such a world exists primarily) from a virtual world, nor from an arithmetical world. But I follow you that an intelligence may need to go through a long/deep computational history to acquire some skills.

We *can* program a machine with the instruction (roughly described) "help yourself", and such a program may succeed in developing intelligence, but it may take a very long time. Once that is done, the result can be copied, as 'nature' does all the time. In that way evolution can be sped up; embryogenesis already does this by "simulating" phylogenesis in part.

For a platonist, AI research can be compared to fishing. The 'intelligent entities' are already 'there', and we may isolate them by filtering techniques (like genetic programming, virtual evolution, or more abstract techniques). Getting the initial intelligence can take time, but intelligence (and consciousness) are, for the programs having them, self-accelerating.
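To make the fishing metaphor a bit more concrete, here is a minimal sketch (in Python, purely illustrative) of one such filtering technique: a toy genetic-programming loop. The candidate programs are drawn from a space of expressions that is already 'there'; selection merely filters out those that fail to approximate a target behaviour. The target function, operators, population size, and other parameters are my own assumptions for the sketch, not anything claimed in the thread.

import random

# A toy "filtering" (genetic programming) loop: fish a program out of the
# pre-existing space of expressions over x, small constants, +, -, *.
OPS = {'+': lambda a, b: a + b,
       '-': lambda a, b: a - b,
       '*': lambda a, b: a * b}

def random_expr(depth=3):
    """Draw a random expression tree from the space of candidate programs."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(['x', random.randint(-2, 2)])
    op = random.choice(list(OPS))
    return (op, random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, x):
    """Recursively evaluate an expression tree at the point x."""
    if expr == 'x':
        return x
    if isinstance(expr, int):
        return expr
    op, left, right = expr
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(expr, target, points):
    """Lower is better: total error of expr against the target behaviour."""
    return sum(abs(evaluate(expr, x) - target(x)) for x in points)

def mutate(expr, depth=2):
    """Replace a random subtree by a fresh random expression."""
    if isinstance(expr, tuple) and random.random() < 0.7:
        op, left, right = expr
        if random.random() < 0.5:
            return (op, mutate(left, depth), right)
        return (op, left, mutate(right, depth))
    return random_expr(depth)

def filter_for(target, points, generations=200, pop_size=50):
    """'Fishing': keep filtering the population until a good program appears."""
    population = [random_expr() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda e: fitness(e, target, points))
        if fitness(population[0], target, points) == 0:
            break
        survivors = population[:pop_size // 5]          # selection is the filter
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return population[0]

if __name__ == '__main__':
    best = filter_for(lambda x: x * x + 1, points=range(-5, 6))
    print('best expression found:', best)

Nothing is "created" by the loop: every expression it ever considers already exists in the space it samples from; the selection step only decides which ones we keep looking at, which is the sense of "isolating by a filtering technique" above.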

Do you agree that the nature of the base environment(s) is irrelevant to the development of intelligence? It depends only on mathematical truths like: "starting from brain state A, the history-measure of Brent's relative brain states B in which Brent asserts 'we need primary matter' is bigger than the history-measure of those in which Brent asserts 'we don't'."

The measure is mathematical, but not arithmetical, although it refers only to number relations. But then "experiences" are epistemological, not ontological.

Bruno




http://iridia.ulb.ac.be/~marchal/


