Philip,
I can understand how the brain structure we see in intelligent animals would
emerge from a process of biological evolution in which no conscious
design is involved (i.e., specialised non-conscious functions emerge first,
generalised processes emerge later), but why should AGI design
S. Yudkowsky
Sent: Wednesday, February 05, 2003 11:42 PM
To: [EMAIL PROTECTED]
Subject: Re: [agi] A thought.
James Rogers wrote:
Just as there is
no general environment, there is no general intelligence. A mind
must be matched to its environment.
Huh? The point of a generally
3)
Any successful AGI system is also going to have components in two other
categories:
a) specialized-intelligence components that solve particular problems in
ways having little or nothing to do with truly general intelligence
capability
b) specialized-intelligence components that are
Brad,
But I think that the further down you go towards the primitive level,
the more specialized everything becomes. While they all use
neurons, the anatomy and neurophysiology of low-level brain areas are
so drastically different from one another as to be conceptually
distinct.
I've just joined this list, it's my first post. Greetings all.
1.5-line summary of me: AI enthusiast since 10 yrs old, CS undergrad degree, 3 months
from finishing psych/neuroscience PhD.
Mike, you are correct that an AI must be matched to its environment. It's likely that
a sentience
I tried to discuss this on SL4. See the thread at
http://www.sl4.org/archive/0212/5995.html
What you call a complex environment is what I called reality. Reality for a
digital computer running an AGI program is different from that for (intelligent)
humans living in meatspace, and I would even go as far as