The primitive terms aren't random; only some of the structure around them is.

Standard English uses Subject-Verb-Object order, while other languages use
Verb-Subject-Object or some other arrangement. As long as the ordering is known
and used roughly consistently, the actual choice could well be random and would
not matter. But the 'concept' of a dog is roughly the same in any language,
based on what we share when we see, hear, smell, and interact with it.
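
To make that concrete, here is a minimal sketch (Python, with made-up names;
not taken from any real parser or knowledge base) of how one shared concept can
be rendered under either ordering convention, and why only a mismatch of
conventions causes trouble:

    # A rough sketch of the point above: the same underlying proposition can
    # be linearized under different word-order conventions, and the choice is
    # arbitrary so long as speaker and listener share it.

    # One shared, language-independent "concept-level" proposition.
    proposition = {"subject": "dog", "verb": "chases", "object": "cat"}

    # Two arbitrary but internally consistent surface conventions.
    ORDERINGS = {
        "SVO": ("subject", "verb", "object"),   # e.g. English
        "VSO": ("verb", "subject", "object"),   # e.g. Irish, Classical Arabic
    }

    def linearize(prop, convention):
        """Render the proposition as a word string under one convention."""
        return " ".join(prop[role] for role in ORDERINGS[convention])

    def parse(sentence, convention):
        """Recover the proposition, assuming the same convention is shared."""
        return dict(zip(ORDERINGS[convention], sentence.split()))

    svo = linearize(proposition, "SVO")   # "dog chases cat"
    vso = linearize(proposition, "VSO")   # "chases dog cat"

    # When the convention is shared, both decode back to the same concept:
    assert parse(svo, "SVO") == parse(vso, "VSO") == proposition

    # A convention mismatch scrambles the roles, even though the word-level
    # concepts are identical:
    print(parse(vso, "SVO"))  # {'subject': 'chases', 'verb': 'dog', 'object': 'cat'}

The ordering table is the arbitrary, convention-bound part; the proposition
itself is the part we share from experience.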

Everything is based on these primitives of experiencing the world, so I am
using English, but modeling my knowledge base terms on these experiences.

James


Richard Loosemore <[EMAIL PROTECTED]> wrote:

Eric Baum wrote:
>>>> I don't think the proofs depend on any special assumptions about
>>>> the nature of learning.
>>>
>>> I beg to differ.  IIRC the sense of "learning" they require is
>>> induction over example sentences.  They exclude the use of real
>>> world knowledge, in spite of the fact that such knowledge (or at
>>> least [...] knowledge) are posited to play a significant role in the learning
>>> of grammar in humans.  As such, these proofs say nothing whatsoever
>>> about the learning of NL grammars.
>>>
> 
> I fully agree the proofs don't take into account such stuff.
> And I believe such stuff is critical. Thus
> I've never claimed language learning was proved hard; I've just
> suggested evolution could have encrypted it.
> 
> The point I began with was, if there are lots of different locally
> optimal codings for thought, it may be hard to figure out which one is 
> programmed
> into the mind, and thus language learning could be a hard additional
> problem to producing an AGI. The AGI has to understand what the word
> "foobar" means, and thus it has to have (or build) a code module meaning
> "foobar" that it can invoke with this word. If it has a different set
> of modules, it might be sunk in communication.
> 
> My sense about grammars for natural language is that there are lots
> of different equally valid grammars that could be used to communicate.
> For example, there are the grammars of English and of Swahili. One
> isn't better than the other. And there is a wide variety of other
> kinds of grammars that might be just as good, that aren't even used in
> natural language, because evolution chose one convention at random.
> Figuring out what that convention is, is hard; at least, linguists have
> tried hard to do it and failed.
> And this grammar stuff is pretty much on top of the meanings of
> the words. It serves to disambiguate, for example for error correction
> in understanding. But you could communicate pretty well in pidgin, 
> without it, so long as you understand the meanings of the words.
> 
> The grammar learning results (as well as the experience of linguists,
> who've tried very hard to build a model for natural grammar) 
> are, I think, indicative that this problem is hard, and it seems that
> this problem is superimposed above the real world knowledge aspect.

Eric,

Thank you; I think you have focussed down on the exact nature of the claim.

My reply could start from a couple of different places in your above 
text (all equivalent), but the one that brings out the point best is this:

 >                                And there is a wide variety of other
 > kinds of grammars that might be just as good, that aren't even used in
 > natural language, because evolution chose one convention at random.
                                                               ^^^^^^

This is precisely where I think the false assumption is buried.  When I
say that grammar learning can be dependent on real world knowledge, I 
mean specifically that there are certain conceptual primitives involved 
in the basic design of a concept-learning system.  We all share these 
primitives, and [my claim is that] our language learning mechanisms 
start from those things.  Because both I and a native Swahili speaker 
use languages whose grammars are founded on common conceptual 
primitives, our grammars are more alike than we imagine.

Not only that, but if the Swahili speaker and I suddenly met and
tried to discover each other's languages, we would be able to do so, 
eventually, because our conceptual primitives are the same and our 
learning mechanisms are so similar.

Finally, I would argue that most cognitive systems, if they are to be 
successful in negotiating this same 3-D universe, would do best to have 
much the same conceptual primitives that we do.  This is much harder to 
argue, but it could be done.

As a result of this, evolution would not by any means have been making 
random choices of languages to implement.  It remains to be seen just 
how constrained the choices are, but there is at least a prima facie 
case to be made (the one I just sketched) that evolution was extremely 
constrained in her choices.

In the face of these ideas, your argument that evolution essentially 
made a random choice from a quasi-infinite space of possibilities needs 
a great deal more to back it up.  The grammar-from-conceptual-primitives 
idea is so plausible that the burden is on you to give a powerful reason 
for rejecting it.

Correct me if I am wrong, but I see no argument from you on this 
specific point (maybe there is one in your book ... but in that case,
why say without qualification, as if it was obvious, that evolution made 
a random selection?).

Unless you can destroy the grammar-from-conceptual-primitives idea, 
surely these arguments about hardness of learning have to be rejected?




Richard Loosemore





_______________________________________
James Ratcliff - http://falazar.com

