Nanograte Knowledge Technologies wrote:
Alan, seriously? Are you utterly convinced the learning system for a machine would only comprise a single level of complexity?

Yes.

First, disregard everything needed to generate machine-code instructions for your CPU and GPU; that level is not relevant here.

Ok, so you are programming an AGI.

General intelligence means that you don't know what it will be exposed to or what it will need to learn to do.

Therefore it needs to be programmed with the ability to detect when it might be a good idea to try a more complex organization, plus the basic mechanisms required to implement that. (It also needs a complementary mechanism that detects when extra layers of organization are redundant and removes them.)
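A minimal sketch of that grow/prune idea, in Python. Everything here is illustrative: the class name, the plateau heuristic, and the contribution threshold are assumptions, not anything the post specifies. The point is only to show the two complementary mechanisms side by side: add a layer of organization when learning stalls, and drop layers whose measured contribution is negligible.

```python
class SelfStructuringLearner:
    """Hypothetical agent that adjusts its own structural complexity."""

    def __init__(self, stall_window=3, min_contribution=0.01):
        self.layers = [1]                  # layer "capacities"; start minimal
        self.stall_window = stall_window   # how many stalled steps trigger growth
        self.min_contribution = min_contribution
        self.history = []                  # recent task-error measurements

    def observe_error(self, error):
        """Record a task error and decide whether to try more complexity."""
        self.history.append(error)
        if len(self.history) >= self.stall_window:
            recent = self.history[-self.stall_window:]
            # Grow: error has stopped improving, so try a more complex organization.
            if max(recent) - min(recent) < 1e-6:
                self.grow()
                self.history.clear()

    def grow(self):
        """Add another layer of organization."""
        self.layers.append(1)

    def prune(self, contributions):
        """Complementary mechanism: remove layers whose measured
        contribution is redundant, keeping at least one layer."""
        kept = [(l, c) for l, c in zip(self.layers, contributions)
                if c >= self.min_contribution]
        self.layers = [l for l, _ in kept] or [self.layers[0]]
```

Usage: feeding a plateaued error signal makes the learner grow; passing per-layer contribution estimates lets it prune back. How "contribution" would actually be measured in a real system is exactly the open design question.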

The open question is what minimal structure the system needs to be "primed with" in order to generate, in an android-like embodiment, a human-like consciousness. Or, more generally: what are the basic rules for priming an agent for an efficient bootstrap without impairing its generality?


--
Please report bounces from this address to [email protected]

Powers are not rights.


------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/Tf97c751029c2e4db-M83377debaa379862676a017d
Delivery options: https://agi.topicbox.com/groups/agi/subscription