>>>

A conceptual framework starts with knowledge representation: a symbol S 
refers to a persistent pattern P which is, in some way or another, a 
reflection of the agent's environment and/or a composition of other symbols. 
Symbols are related to each other in various ways. These relations (such as 
"is a property of", "contains", "is associated with") are either given or 
emerge through some kind of self-organizing dynamic.

A causal model M is a set of symbols such that the activations of symbols 
S1...Sn are used to infer the future activation of symbol S'. The rules of 
inference are either given or emerge through some kind of self-organizing 
dynamic.

A conceptual framework refers to the whole set of symbols and their relations, 
which includes all causal models and rules of inference.

Such a framework is necessary for language comprehension because meaning is 
grounded in that framework. For example, the word 'flies' has at least two 
totally distinct meanings, and each is unambiguously evoked only when given 
the appropriate conceptual context, as in the classic example "time flies like 
an arrow; fruit flies like a banana." The words "time" and "fruit" have very 
different sets of relations to other patterns, and these relations can in 
principle be employed to disambiguate the intended meanings of "flies" and 
"like".

If you think language comprehension is possible with just statistical methods, 
perhaps you can show how they would work to disambiguate the above example.
<<<
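
For what it's worth, the framework quoted above can be made concrete in a few 
lines. Everything below (the symbols, the relation triples, and the lookup 
rule) is invented purely for illustration; it is not taken from the original:

```python
# Toy symbol/relation store: relations are (symbol, relation, symbol)
# triples, as in the quoted framework. All entries are illustrative.
relations = {
    ("time", "is_a", "abstract_quantity"),
    ("fruit", "is_a", "food"),
    ("fruit_fly", "is_a", "insect"),
    ("abstract_quantity", "can", "pass"),  # "flies" in the sense of passing
    ("insect", "can", "fly"),
}

def related(symbol, relation):
    """Symbols reachable from `symbol` via `relation`."""
    return {o for (s, r, o) in relations if s == symbol and r == relation}

def sense_of_flies(prev_word):
    """Pick a reading of 'flies' from the relations of the preceding word."""
    # Compound reading: "<prev_word> fly" names a known insect.
    if (prev_word + "_fly", "is_a", "insect") in relations:
        return "noun (the insects)"
    # Verb reading: something the preceding word's category can do.
    if any("pass" in related(c, "can") for c in related(prev_word, "is_a")):
        return "verb (passes swiftly)"
    return "unknown"
```

With these toy relations, `sense_of_flies("time")` comes out as the verb 
reading and `sense_of_flies("fruit")` as the noun reading, which is the 
disambiguation route the quoted paragraph describes.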



I agree with your framework, but in my approach it is part of the 
nonlinguistic D, which is separate from L. D and L interact only during the 
process of translation, and even in this process D and L remain separate.




>>>
OK, let's look at all 3 cases:

1. Primitive language *causes* reduced abstraction faculties.
2. Reduced abstraction faculties *cause* primitive language.
3. Primitive language and reduced abstraction faculties are merely correlated; 
neither strictly causes the other.

I've been arguing for (1), saying that language and intelligence are 
inseparable (for social intelligences). The sophistication of one's language 
bounds the sophistication of one's conceptual framework. 

In (2), one must be saying of the Pirahã that they are cognitively deficient 
for another reason, and that their language is primitive as a result of that 
deficiency. Professor Daniel Everett, the anthropological linguist who first 
described the Pirahã grammar, dismissed this possibility in his paper "Cultural 
Constraints on Grammar and Cognition in Pirahã" (see 
http://www.eva.mpg.de/psycho/pdf/Publications_2005_PDF/Commentary_on_D.Everett_05.pdf):

"... [the idea that] the Piraha˜ are sub-
standard mentally—is easily disposed of. The source
of this collective conceptual deficit could only be ge-
netics, health, or culture. Genetics can be ruled out
because the Piraha˜ people (according to my own ob-
servations and Nimuendajú’s have long intermarried
with outsiders. In fact, they have intermarried to the
extent that no well-defined phenotype other than stat-
ure can be identified. Piraha˜s also enjoy a good and
varied diet of fish, game, nuts, legumes, and fruits, so
there seems to be no dietary basis for any inferiority.
We are left, then, with culture, and here my argument
is exactly that their grammatical differences derive
from cultural values. I am not, however, making a
claim about Piraha˜ conceptual abilities but about their
expression of certain concepts linguistically, and this
is a crucial difference."

This quote thus also addresses (3), that the language and the conceptual 
deficiency are merely correlated. Everett seems to be arguing for this point, 
that their language and conceptual abilities are both held back by their 
culture. There are questions about the dynamic between culture and language, 
but that's all speculative.

I realize this leaves the issue unresolved. I include it because I raised the 
Pirahã example and it would be disingenuous of me to not mention Everett's 
interpretation.

<<<

Everett's interpretation is that culture is responsible for reduced 
abstraction faculties. I agree with this. But this does not imply your claim 
(1) that language causes the reduced faculties. The reduced number of cultural 
experiences in which abstraction is important is responsible for the reduced 
abstraction faculties.

 

>>>
Of course, but our opinions have consequences, and in debating the consequences 
we may arrive at a situation in which one of our positions appears absurd, 
contradictory, or totally improbable. That is why we debate about what is 
ultimately speculative, because sometimes we can show the falsehood of a 
position without empirical facts.

On to your example. The ability to do algebra is hardly a test of general 
intelligence, as software like Mathematica can do it. One could say that the 
ability to be *taught* how to do algebra reflects general intelligence, but 
again, that involves learning the *language* of mathematical formalism.
<<<

Your claim is that natural language understanding is sufficient for AGI. Then 
you must be able to prove that everything an AGI can do is also possible for a 
system that is able to understand natural language. An AGI can learn to solve 
x*3 = y for arbitrary y, and it can do this with or without Mathematica. 
Simply prove that a natural language understanding system must necessarily be 
able to do the same.

If this is not possible, then your assumption that natural language 
understanding systems are AGI-complete does not hold.
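
As an aside, the specific task mentioned above (solve x*3 = y for arbitrary y) 
is mechanically trivial once the equation is in symbolic form; a minimal 
sketch, with the equation hard-coded as an assumption:

```python
from fractions import Fraction

def solve_x_times_3_equals(y):
    # Mechanically rearrange x*3 = y into x = y/3; exact rational
    # arithmetic via Fraction so the check below holds for any integer y.
    return Fraction(y) / 3

x = solve_x_times_3_equals(7)
assert x * 3 == 7  # the solution satisfies the original equation
```

The interesting question in the thread is not this rearrangement but whether a 
language-understanding system could *learn* to perform it.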


>>>
So the documented emergence of a totally new sign language among an isolated 
deaf community is somehow not natural?  See 
http://en.wikipedia.org/wiki/Nicaraguan_Sign_Language (that's a different link 
from the last one) for an example of a sign language that developed from a 
pidgin to a creole in exactly the same way as spoken languages.
<<<

I would still say that sign language is not a natural language.
Here we have different opinions, and I think we must agree to disagree.



>>>
I think there is a lot more evidence for the idea that language and 
intelligence are integrated than for the idea that they're not. 
<<<
I do not think so.

>>>
I think all of the examples you've used to illustrate your points about 
language involve data transfers by dumb computers based on predetermined 
protocols. That is the narrowest domain you could possibly talk about - no 
intelligence whatsoever is required in that contrived situation.
<<<
This is wrong.
I have given a model of why we have the illusion that our thoughts are built 
from language.

Once again:
Thoughts are translated into language and routed into input regions. We hear 
our own thoughts, but in fact these are only *translations* of a subset of our 
thoughts. The thoughts themselves are completely nonlinguistic.
I have read somewhere (but cannot give the source) that it has been shown that 
thoughts activate motor neurons even while speech is somehow suppressed. This 
evidence supports my model that thoughts are translated into language.

If you speak, motor neurons become active. At the same time, input neurons 
become active from the sound of your voice. You have done this thousands of 
times.
This process of speaking has caused Hebbian learning between motor neurons and 
input neurons in your brain.

If you think, your motor neurons become active (there is evidence for this 
point) but you suppress speech. Because of the past Hebbian learning, your 
input neurons become active as well (my assumption).

My model explains several phenomena:

1. We hear our thoughts.
2. We think at the same speed as we speak (this is not trivial!).
3. We hear our thoughts in our own voice (strong evidence for my model!).
4. We have difficulty thinking in a very noisy and loud environment (because 
we have to listen to our thoughts).
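
The mechanism described above can be sketched as a toy Hebbian association. 
The unit counts, the activity pattern, and the learning rate below are all 
arbitrary illustrative choices, not part of the original model:

```python
# Toy Hebbian association between "motor" and "auditory input" units.
# Speaking co-activates both populations; the Hebbian rule strengthens
# their connections, so later motor activity alone ("silent thought")
# re-activates the input units -- we "hear" our thoughts.

MOTOR_UNITS = 4
INPUT_UNITS = 4
w = [[0.0] * INPUT_UNITS for _ in range(MOTOR_UNITS)]
RATE = 0.1

def hebbian_update(motor, inp):
    # Classic Hebbian rule: strengthen w[i][j] when both units fire.
    for i in range(MOTOR_UNITS):
        for j in range(INPUT_UNITS):
            w[i][j] += RATE * motor[i] * inp[j]

# Many speaking episodes: the motor pattern and the matching auditory
# pattern are active at the same time.
pattern = [1.0, 0.0, 1.0, 0.0]
for _ in range(1000):
    hebbian_update(pattern, pattern)

def silent_thought(motor):
    # Motor units fire but speech is suppressed; input activity now
    # arrives only through the learned weights.
    return [sum(w[i][j] * motor[i] for i in range(MOTOR_UNITS))
            for j in range(INPUT_UNITS)]

heard = silent_thought(pattern)
# Input units matching the rehearsed motor pattern are strongly
# activated; the others stay silent.
```

This only illustrates the association step; it takes no position on whether 
the underlying thoughts are linguistic or not.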




>>>
An AGI needs to be competent (solve problems) in novel domains, which means 
learning new protocols for understanding and action. That's the crux of general 
intelligence, and I don't think you can ignore the *learning* of language, 
which you've admitted is hard.
<<<

The learning of language does not contradict a model in which D and L are 
separated.




-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=8660244&id_secret=117534816-b15a34
Powered by Listbox: http://www.listbox.com
