Pei,

I agree that evolution and intelligence are different forms of adaptation.
But I think they have a subtle interrelationship that you are not
acknowledging.

Specifically, I think that

* evolution and intelligence are two different things, but sometimes they
may both be aspects of the same system
* localized evolutionary processes may be used by intelligent systems to
achieve specific goals
* intelligence may accelerate evolution via the Baldwin effect

Intelligence, to me, is the ability to achieve complex goals in complex
environments.  If we add your requirement of "finite resources" then we
obtain "the ability to achieve complex goals in complex environments using
finite resources" which is what I call "efficient intelligence."

Evolution has to do with the progressive self-modification of complex
systems, in such a way that different parts of the system continually change
so as to ensure their own survival and increase the amount of emergent
pattern between themselves and other parts of the system.

Evolution can be understood to have implicit goals, which shift over time;
and this allows us to view evolution as a kind of intelligence.

Edelman, in Neural Darwinism and other books, has explained how
neurodynamics may be viewed as an evolutionary process.  I enlarged upon
this in my 1993 book "The Evolving Mind."

In Novamente, there are two levels of evolutionary process:

* implicit evolution of the whole network of relationships that is the whole
"mind-network"
* explicit evolution of predicates and procedures using the BOA
evolutionary-learning algorithm

The former kind of evolution is probably a necessary aspect of intelligence;
the latter is not.
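To make the "explicit evolution of predicates" level concrete: the actual BOA
algorithm builds a probabilistic model of high-fitness solutions, which is too
much machinery for a short sketch, but the general shape of evolving a
predicate can be illustrated with a much simpler (1+lambda) evolutionary loop
over boolean expression trees. Everything below is an illustrative stand-in
(the tree encoding, the XOR target, the mutation scheme are all my
assumptions), not Novamente code:

```python
import random

random.seed(1)

# A "predicate" is a nested tuple: ('and', a, b), ('or', a, b),
# ('not', a), or a leaf variable name 'x' / 'y'.
LEAVES = ['x', 'y']
OPS = ['and', 'or', 'not']

def evaluate(tree, env):
    if isinstance(tree, str):
        return env[tree]
    op = tree[0]
    if op == 'not':
        return not evaluate(tree[1], env)
    a, b = evaluate(tree[1], env), evaluate(tree[2], env)
    return (a and b) if op == 'and' else (a or b)

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(LEAVES)
    op = random.choice(OPS)
    if op == 'not':
        return ('not', random_tree(depth - 1))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def mutate(tree):
    # Crude structural mutation: sometimes replace the current subtree
    # with a fresh random one, otherwise recurse into one child.
    if isinstance(tree, str) or random.random() < 0.3:
        return random_tree(2)
    if tree[0] == 'not':
        return ('not', mutate(tree[1]))
    parts = list(tree)
    i = random.choice([1, 2])
    parts[i] = mutate(parts[i])
    return tuple(parts)

CASES = [(x, y) for x in (0, 1) for y in (0, 1)]

def target(x, y):              # the predicate to learn: XOR
    return x != y

def fitness(tree):
    return sum(bool(evaluate(tree, {'x': x, 'y': y})) == target(x, y)
               for x, y in CASES)

def evolve(generations=300, lam=20):
    # (1 + lambda) evolution: keep the parent unless some mutant
    # matches or beats it on the training cases.
    best = random_tree()
    for _ in range(generations):
        challenger = max((mutate(best) for _ in range(lam)), key=fitness)
        if fitness(challenger) >= fitness(best):
            best = challenger
        if fitness(best) == len(CASES):
            break
    return best

best = evolve()
print('learned predicate:', best)
print('fitness:', fitness(best), '/', len(CASES))
```

Note that even this toy version already "builds whole structures at once"
(via subtree replacement) rather than extending a structure one inference
step at a time, which is the property the discussion below turns on.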

NARS will clearly have the former kind of evolution -- implicit evolution.

My issue with NARS isn't that it lacks explicit evolutionary learning --
unlike implicit evolution, I see that as an optional part of an AGI system.

My issue with NARS is that it lacks any global-optimization-like method for
learning large predicates and procedures (large "compound terms" in NARS
language). All the techniques in NARS are local, incremental techniques for
building predicates/procedures from others; in other words, they're
essentially "greedy" techniques in computer-science terms.  I think some
kind of global-optimization component is also necessary -- whether it's
explicitly evolutionary or not.
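The greedy-versus-global contrast can be made concrete with a toy sketch
(entirely illustrative -- this is not NARS or Novamente code). Here fitness
only rewards fully assembled 4-bit "building blocks", standing in for large
compound terms that are only useful once the whole structure exists. Single-
bit hill climbing stalls on the flat landscape, while a population with
crossover can recombine partial solutions:

```python
import random

random.seed(0)

N = 8          # bitstring length: two "building blocks" of 4 bits each
BLOCK = 4

def fitness(bits):
    # One point per fully-set block; flat everywhere else, so partial
    # progress toward a block earns nothing.
    return sum(1 for i in range(0, N, BLOCK) if all(bits[i:i + BLOCK]))

def greedy_hill_climb(bits):
    # Accept a single-bit flip only when it strictly improves fitness:
    # an incremental, "greedy" learner.
    bits = list(bits)
    improved = True
    while improved:
        improved = False
        for i in range(N):
            trial = bits[:]
            trial[i] ^= 1
            if fitness(trial) > fitness(bits):
                bits, improved = trial, True
    return bits

def evolve(pop_size=200, generations=50, mut_rate=0.02):
    # A simple genetic algorithm: tournament selection, midpoint
    # crossover (which can recombine complementary partial solutions),
    # bit-flip mutation, and elitism.
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        new_pop = [best[:]]                               # elitism
        while len(new_pop) < pop_size:
            p1 = max(random.sample(pop, 3), key=fitness)  # tournament
            p2 = max(random.sample(pop, 3), key=fitness)
            child = p1[:BLOCK] + p2[BLOCK:]               # crossover
            child = [b ^ (random.random() < mut_rate) for b in child]
            new_pop.append(child)
        pop = new_pop
        best = max(pop, key=fitness)
    return best

stuck_start = [1, 0, 1, 0, 0, 1, 0, 1]  # no single flip completes a block
greedy_result = greedy_hill_climb(stuck_start)
ga_result = evolve()
print("greedy fitness:", fitness(greedy_result))  # stuck at 0
print("GA fitness:   ", fitness(ga_result))
```

The point of the sketch is only the landscape shape: when the payoff of a
structure appears only after the whole structure is assembled, strictly
improving local moves have nothing to climb, and some population-level or
global-sampling mechanism has to supply the jump.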

You may argue that you'll use NARS reasoning to learn control heuristics
that will act within NARS to build appropriate, large compound terms.  But
there's a chicken-and-egg problem here, because these control heuristics are
themselves large compound terms.  You may argue that this chicken-and-egg
problem will be solved via feedback -- a little reasoning leading to the
learning of moderately clever inference control heuristics, which help
reasoning learn slightly cleverer inference control heuristics, etc.  I
can't say this is a fundamentally ill-founded argument -- I think this kind
of feedback *will* exist -- but my suspicion is that the feedback loop will
simply never get off the ground unless there is some global-optimization-ish
method inserted to complement the "greedy heuristics" that are the NARS FOI
and HOI (first-order and higher-order inference) reasoning rules.

-- Ben

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of [EMAIL PROTECTED]
Sent: Tuesday, September 21, 2004 12:51 PM
To: [EMAIL PROTECTED]
Subject: RE: [agi] proposal: Sensory Front-End



> In NARS, as I understand it, these heuristics will have to be learned via
> NARS higher-order inference applied to Implication relationships and
> compound terms related to inference-control primitives and perception and
> action primitives.  But I'm not confident that NARS contains any
> mechanisms adequate to FORM the right compound terms, which may be large
> and may not be easily built up from their components in an incremental
> way (note that evolutionary learning doesn't need to build things up in
> an incremental way, whereas the NARS inference rules do, insofar as I
> understand them).

Yes, in NARS everything is reasoning. Clearly, evolutionary computing works
better on certain tasks, but to me, "intelligence" and "evolution" are two
quite different forms of adaptation, and each works under certain
assumptions on certain things. For theoretical and practical reasons, I
won't mix them together. At the current stage, the integrity and
consistency of the system are more important than its problem-solving
power. I'll push the core technique of NARS as far as possible, even
though I know that it is not the best for every problem.

According to my current plan, evolution will get into the picture at a
future time, when I have a whole population of NARS systems, each using
the same logic, but with different "personality parameters" and innate
knowledge. The evolution process can then generate new generations of NARS
that work better than their parents. Even after this, "intelligence" and
"evolution" are still different (though related) --- the former works
within a system, the latter works within a species (with generations of
systems).

Pei


-------
To unsubscribe, change your address, or temporarily deactivate your
subscription, please go to http://v2.listbox.com/member/[EMAIL PROTECTED]


