I'll repeat myself again using different words for Mike Tintner's benefit...
Understanding is the construction of concepts forming a mental model (a database of "facts"), such that the model can be activated by sensory stimuli to recognize signs of language, and can be used to generate signs of language. (This is actually Roland Hausser's Database Semantics definition.)
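As a toy illustration (my own sketch, not Hausser's actual Database Semantics machinery — all names here are hypothetical), the definition above can be read as a bidirectional sign↔concept store: stimuli activate stored facts for recognition, and the same store drives generation:

```python
# Toy "mental model": a database of facts linking signs to concepts.
# Illustrates only the recognize/generate loop in the definition above,
# not Hausser's proplet-based Database Semantics.

class MentalModel:
    def __init__(self):
        self.sign_to_concept = {}   # recognition direction
        self.concept_to_sign = {}   # generation direction

    def learn(self, sign, concept):
        """Construct a concept: store the fact in both directions."""
        self.sign_to_concept[sign] = concept
        self.concept_to_sign[concept] = sign

    def recognize(self, sign):
        """A sensory stimulus activates the model, yielding a concept (or None)."""
        return self.sign_to_concept.get(sign)

    def generate(self, concept):
        """The model is used to produce a sign expressing a concept (or None)."""
        return self.concept_to_sign.get(concept)

model = MentalModel()
model.learn("dog", "DOG-CONCEPT")
print(model.recognize("dog"))         # DOG-CONCEPT
print(model.generate("DOG-CONCEPT"))  # dog
print(model.recognize("cat"))         # None (no fact stored)
```

The point of the two dictionaries is that recognition and generation are the same stored facts traversed in opposite directions — which is also why the definition covers signed as well as spoken signs.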
Cheers,
~PM
From: [email protected]
To: [email protected]
Subject: Re: [agi] Step One towards the real lingua franca of brain/AGI
Date: Thu, 4 Apr 2013 16:36:09 +0100
PM,
 
Like your exposition of your work which we discussed a while ago, your statement below doesn’t deal with the problem to be solved – in this case: what is the “language” of language (and AGI)? [It’s a controversial but stimulating assumption that there is such a thing as a common “language” – or, to use another metaphor, “currency”.]
 
I’m suggesting that the use of sign language – the use of hand “graphics”/“figures” – is one clue to that lingua franca.
 
Your definition of “understanding” is essentially a non-definition. It doesn’t explain what understanding *entails* – it merely points out one very narrow *application* of understanding, i.e. to language. Obviously, if you think about it, we also have to “understand” what is going on in a visual scene, or indeed understand sensory images of all kinds, including paintings, cartoons, maps, blueprints, x-rays, music and many, many other things. Understanding applies not merely to registering, but to successfully classifying EVERY form of input to a real-world agent’s brain, not just one.
 
What I’m proposing is that there may be a common form of “language” to all or nearly all these forms of process – a “language” which is actually radically different from the purely symbolic kind to which most AGI-ers cling very unimaginatively (in all senses).
From: Piaget Modeler
Sent: Thursday, April 04, 2013 4:15 PM
To: AGI
Subject: RE: [agi] Step One towards the real lingua franca of brain/AGI
 

I feel like I'm repeating myself:

#7 - Understanding is learning a new language to the point of fluency.
(When the words in the new language activate your language-independent concepts, and you have created sufficient behaviors so that you can effortlessly generate expressions in the new language.)
Even when the new language is a signed language.

~PM
 



From: [email protected]
To: [email protected]
Subject: [agi] Step One towards the real lingua franca of brain/AGI
Date: Thu, 4 Apr 2013 12:16:06 +0100

Language by mouth and by hand
April 3rd, 2013 in Other Sciences / Social Sciences

Humans favor speech as the primary means of linguistic communication. Spoken languages are so common many think language and speech are one and the same. But the prevalence of sign languages suggests otherwise. Not only can Deaf communities generate language using manual gestures, but their languages share some of their design and neural mechanisms with spoken languages.

New research by Northeastern University's Prof. Iris Berent further underscores the flexibility of human language and its robustness across both spoken and signed channels of communication.

In a paper published in PLOS ONE, Prof. Berent and her team show that English speakers can learn to rapidly recognize key structures of American Sign Language (ASL), despite no previous familiarity with this language.

Like spoken languages, signed languages construct words from meaningless syllables (akin to can-dy in English) and distinguish them from morphemes (meaningful units, similar to the English can-s). The research group examined whether non-signers might be able to discover this structure.

In a series of experiments, Prof. Berent and her team (Amanda Dupuis, a graduate student at Northeastern University, and Dr. Diane Brentari of the University of Chicago) asked English speakers to identify syllables in novel ASL signs. Results showed that these non-signing adults quickly learned to identify the number of signed syllables (one vs. two), and they could even distinguish syllables from morphemes.

Remarkably, however, people did not act as indiscriminate general-purpose learners. While they could easily learn to discern the structure of ASL signs, they were unable to do so when presented with signs that were equally complex, but violated the structure of ASL (as well as any known human language).

The results suggest that participants extended their linguistic knowledge from spoken language to sign language. This finding is significant because it shows that linguistic principles are abstract, and they can apply to both speech and sign. Nonetheless, Dr. Berent explains, language is also constrained, as not all linguistic principles are equally learnable. "Our present results do not establish the origin of these limitations – whether they only result from people's past experience with English, or from more general design properties of the language system. But regardless of source, language transcends speech, as people can extend their linguistic knowledge to a new modality."

Provided by Northeastern University

"Language by mouth and by hand." April 3rd, 2013. http://phys.org/news/2013-04-language-mouth.html


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
