On 4/24/07, Mike Tintner <[EMAIL PROTECTED]> wrote:

 Well we agree where we disagree.

I'm very confident that AGI can't be achieved except by following
crudely evolutionary and developmental paths. The broad reason is that the
brain, body, intelligence and the set - or psychoeconomy - of activities of
the animal evolve in interrelationship with each other. All the activities
that animals undertake are extremely problematic, and became ever more
complex and problematic as they evolved - and they require ever more complex
physical and mental structures to tackle them.



Yes, that is how things evolved in nature.  That doesn't mean it's the only
way things can be.

Airplanes don't fly like birds, etc. etc.

You seem to be making a more sophisticated version of the GOFAI mistake of
thinking intelligence could be just symbolic and rational - and you can jump
straight to the top of evolved intelligence.



No, I absolutely don't think that intelligence can be "just symbolic" -- and
I don't think that given plausible computational resources, intelligence can
be "just rational."

"Purely symbolic/rational" versus "animal-like" are not the only ways to
approach AGI...





But I take away from this one personal challenge, which is that it clearly
needs to be properly explained that a) language rests at the top of a giant
picture tree of sign systems in the mind - without the rest of
which language does not "make sense" and you "can't see what you are
talking about" (& there's no choice about that - that's the way the human
mind works, and the way any equally successful mind will have to work), and b)
language also rests on a complex set of physical motor and manipulative
systems - you can't grasp the sense of language if you can't physically
grasp the world. Does this last area - the multilevelled nature of language
- interest you?



I already understand all those points and have done so for a long time.
They are statements about human psychology.  Why do you think that closely
humanlike intelligence is the only kind?

As it happens my own AGI project does include embodiment (albeit, at the
moment, simulated embodiment in a 3D sim world) and aims to ground language
in perceptions and actions.  However, it doesn't aim to do so in a slavishly
humanlike way, and also has room for more explicit logic-like
representations.

"There are more approaches to AGI than are dreamt of in your philosophy"
;-)

-- Ben G

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=4007604&user_secret=8eb45b07