Re: [agi] A thought.

2003-02-07 Thread Brad Wyble
Philip, I can understand that the brain structure we see in intelligent animals would emerge from a process of biological evolution where no conscious design is involved (i.e. specialised non-conscious functions emerge first, generalised processes emerge later), but why should AGI design

RE: [agi] A thought.

2003-02-06 Thread Ben Goertzel
S. Yudkowsky wrote (Wednesday, February 05, 2003): James Rogers wrote: Just as there is no general environment, there is no general intelligence. A mind must be matched to its environment. Huh? The point of a generally

Re: [agi] A thought.

2003-02-06 Thread Brad Wyble
3) Any successful AGI system is also going to have components in two other categories: a) specialized-intelligence components that solve particular problems in ways having little or nothing to do with truly general intelligence capability; b) specialized-intelligence components that are

RE: [agi] A thought.

2003-02-06 Thread Ben Goertzel
3) Any successful AGI system is also going to have components in two other categories: a) specialized-intelligence components that solve particular problems in ways having little or nothing to do with truly general intelligence capability; b) specialized-intelligence components

Re: [agi] A thought.

2003-02-06 Thread Philip Sutton
Brad, But I think that the further down you go towards the primitive level, the more specialized everything is. While they all use neurons, the anatomy and neurophysiology of low-level brain areas are so drastically different from one another as to be conceptually distinct. I

Re: [agi] A thought.

2003-02-05 Thread Brad Wyble
I've just joined this list; this is my first post. Greetings all. 1.5-line summary of me: AI enthusiast since age 10, CS undergrad degree, three months from finishing a psych/neuroscience PhD. Mike, you are correct that an AI must be matched to its environment. It's likely that a sentience

Re: [agi] A thought.

2003-02-05 Thread SMcClenahan
I tried to discuss this on SL4. See the thread at http://www.sl4.org/archive/0212/5995.html . What you call a "complex environment" is what I called "reality". Reality for a digital computer running an AGI program is different from that of (intelligent) humans living in meatspace, and I would even go as far as