On 5/15/07, Mike Tintner <[EMAIL PROTECTED]> wrote:
> I am suggesting that there are two main types of intelligence - and humans
> have both. "Simulating the human mind" isn't a definition of either of
> those types, or of intelligence, period.

Sorry for the misunderstanding.

> The two main types of intelligence have long been given names by mainstream
> psychology: "convergent" or "crystallised" vs. "divergent" or "fluid"
> intelligence. These two types also seem to me to map more or less directly
> onto the distinction between AI and AGI. There is a very long tradition
> here, and the parallelism seems obvious.

Can you give me a reference? I'm not familiar with this distinction.

> But neither of these types has yet been given a proper, adequate definition
> by Psychology, and nor indeed has "intelligence" generally. That, I am
> suggesting, is the task.
>
> For the philosophy of AI - and this IS a discussion of philosophy - to
> ignore Psychology and human intelligence, and the very extensive work
> already done there, including on creativity, doesn't seem very wise, given
> that AI/AGI still hasn't got to square one in the attempt either to emulate
> or to satisfactorily define human-level "fluid", "adaptive" intelligence.

I don't think anyone in this discussion has suggested ignoring
psychology. The question is at which level we want to follow psychology
when building AGI.

Using forgetting as an example: do we want an AGI system to have
exactly the same forgetting rate as an average human being? Do we only
want it to have the cognitive function of forgetting? Or do we judge
forgetting to be an undesirable human weakness, and let the system
remember everything?

Different opinions here come from the different definitions of
"intelligence" discussed in my paper.

Pei

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415&user_secret=fabd7936
