Hello Linas. If you leave it to the learning mechanism, aren't you inevitably 
going to get crossed links? To take an even simpler example, "It was raining", 
your learning mechanism should work out three predictions:

  *   that "was" needs a subject (i.e. a preceding noun or pronoun).
  *   that any form of the verb RAIN needs the pronoun "it" as its subject (as 
in "It rained").
  *   that "was" needs (or at least accepts) an ing-form verb after it.

When you put these expectations together, you find a dependency triangle, with 
subject links from both verbs to "it" and a dependency link from "was" to 
"raining". Since both of the "it" links are of the same type ('subject'), 
there's no reason to assign them to different levels of structure (deep vs. 
surface), so you get a topological tangle.
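
The tangle can be made concrete with a small sketch (the word positions and link labels below are my own illustration, not anyone's actual notation):

```python
# The three learned expectations for "It was raining", written as
# (head, dependent, label) triples over word positions
# 0 = "it", 1 = "was", 2 = "raining".
links = [
    (1, 0, "subject"),     # "was" takes the preceding pronoun as subject
    (2, 0, "subject"),     # any form of RAIN takes "it" as subject
    (1, 2, "complement"),  # "was" accepts an ing-form verb after it
]

# A dependency tree over n words has exactly n - 1 links, and no word
# receives two incoming links of the same type.  Here "it" (word 0)
# gets two incoming "subject" links, so the result is a triangle.
n_words = 3
incoming = {}
for head, dep, label in links:
    incoming.setdefault(dep, []).append(label)

is_tree = len(links) == n_words - 1 and all(len(v) == 1 for v in incoming.values())
print(is_tree)  # False: three links over three words form a cycle, not a tree
```

Since both links into "it" carry the same label, there is no principled way to demote one of them to a different level of structure.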

Dick

On 16/11/2018 22:05, Linas Vepstas wrote:
I hit "send" too soon, without finishing the thought:

On Fri, Nov 16, 2018 at 3:02 PM Linas Vepstas <[email protected]> wrote:
For example, this parse makes sense, and seems right:

    +-------->WV------->+
    +---->Wd-----+      |
    |      +Ds**c+-Ss*s-+---Pa--+
    |      |     |      |       |
LEFT-WALL the  dog.n was.v-d black.a

but there is another possibility, that kind-of makes sense (and perhaps 
language learning will find):

    +---->Wd---->+
    |            +-->adjcomp--->+
    |      +Ds**c+      +<-cop<-+
    |      |     |      |       |
LEFT-WALL the  dog.n   was    black

Here, adjcomp is "adjectival complement" and "cop" is the copula.  Some dependency 
grammars draw this graph. Some call it "predicative adjectival modifier". Let's 
not quibble. Note that I did not draw an arrow from subject to verb. I could, I 
suppose.  Note that it is now IMPOSSIBLE to draw an arrow from root/left-wall 
to the verb, because it would require a link crossing: it would have to cross 
over the adjcomp arrow.

Thus, if you want to draw an arrow from root to head-verb, and also get a 
planar graph, you are not allowed to draw the adjcomp/predadj arrow.  That 
helps explain what LG does.

It also helps make clear that the no-links-crossing constraint is imperfect. It 
seems reasonable, but clearly there is a violation even in the above rather 
trivial sentence!
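
The planarity check is easy to write down. As a sketch, with my own numbering of the words in the second parse:

```python
# Words: 0=LEFT-WALL  1=the  2=dog  3=was  4=black
def crosses(a, b):
    """Two arcs cross iff exactly one endpoint of one arc lies
    strictly inside the other arc."""
    (i1, j1), (i2, j2) = sorted(a), sorted(b)
    return i1 < i2 < j1 < j2 or i2 < i1 < j2 < j1

# The adjcomp parse: Wd, Ds, adjcomp, cop
parse = [(0, 2), (1, 2), (2, 4), (3, 4)]

# A root-to-verb arrow would be the arc (0, 3):
root_to_verb = (0, 3)
print(any(crosses(root_to_verb, arc) for arc in parse))
# True: (0, 3) crosses the adjcomp arc (2, 4)
```

So with adjcomp drawn, the root-to-verb arc is unplanar, exactly as described above.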

OK, to finish this thought. Let us speculate what an MST parse of this sentence 
might look like. It depends on the MI values for the word pairs: MI(dog, was), 
MI(was, black), and MI(dog, black). I don't know what these are, but clearly 
they will be different for a corpus of kids-lit than for a corpus of math texts.
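
To make the speculation concrete, here is an MST parse over invented MI values (the numbers below are pure illustration; a real run would take them from corpus counts):

```python
words = ["the", "dog", "was", "black"]
mi = {
    ("the", "dog"):   2.1,
    ("was", "black"): 1.8,
    ("dog", "black"): 1.6,  # kids-lit might push this up, math texts down
    ("dog", "was"):   1.4,
    ("the", "was"):   0.2,
    ("the", "black"): 0.1,
}

# Kruskal's algorithm: greedily accept the highest-MI pairs that do not
# form a cycle, yielding a maximum spanning tree.
parent = {w: w for w in words}
def find(w):
    while parent[w] != w:
        w = parent[w]
    return w

tree = []
for pair in sorted(mi, key=mi.get, reverse=True):
    a, b = pair
    ra, rb = find(a), find(b)
    if ra != rb:
        parent[ra] = rb
        tree.append(pair)

print(tree)
# [('the', 'dog'), ('was', 'black'), ('dog', 'black')]
```

Notice that with these (made-up) values, the MST links "dog" directly to "black", i.e. it prefers the adjcomp-style parse over the Ss/Pa one.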

Next question: what happens when words are sorted into categories?  What is 
MI(dog, some color)? What is MI(some animal, some color)? What is MI(physical 
object, some color)?
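
Assuming MI here is pointwise mutual information over word-pair counts (as in the MST-parsing literature), the category versions are the same formula with counts summed over the category. A sketch with invented counts:

```python
# MI(a, b) = log2( p(a, b) / (p(a) * p(b)) ), estimated from counts.
# All counts below are invented for illustration.
from math import log2

N = 1_000_000                            # total observed word pairs
count_pair  = {("dog", "black"): 40}     # joint count
count_left  = {"dog": 2_000}             # marginal count of left word
count_right = {"black": 5_000}           # marginal count of right word

def pmi(a, b):
    p_ab = count_pair[(a, b)] / N
    p_a  = count_left[a] / N
    p_b  = count_right[b] / N
    return log2(p_ab / (p_a * p_b))

print(round(pmi("dog", "black"), 2))  # 2.0
```

MI(some animal, some color) would replace `count_left["dog"]` and `count_pair` with sums over all animal words, and so on up the category hierarchy.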

I don't have a good story here, except to say that copulas and predicative 
adjectives present perhaps the simplest possible example of the difficulty of 
moving from surface syntax (SSynt, what LG does) to deep syntax (DSynt, what 
MTT does). Yet this move is a critical one.

I'm currently thinking of it as a graph-rewrite rule that converts the SSynt 
graph into a PLN graph:

EvaluationLink
     PredicateNode "has color"
     ListLink
         Concept "dog"
         Concept "black"

Or, perhaps as Nil might like to write:

LambdaLink
    VariableList
        Variable $PHY
        Variable $COL
    AndLink
        EvaluationLink
            PredicateNode "has color"
            ListLink
                Variable $PHY
                Variable $COL
        InheritanceLink
            Variable $PHY
            Concept "physical object"
        InheritanceLink
            Variable $COL
            Concept "color"

Of course, even the above representation is wrong, in several ways, but 
nit-picking it at this stage is counter-productive.
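
For concreteness, a minimal sketch of such a rewrite rule in Python (the link labels and the rule pattern are my own illustration, not actual Atomese or LG link types): a copula with an adjectival complement rewrites to a two-place predicate over the subject and the adjective.

```python
def rewrite_copula(links):
    """links: list of (head, dependent, label) triples (surface syntax).
    Returns the deep-syntax predications licensed by the copula rule."""
    out = []
    for (cop, adj, lab) in links:
        if lab != "cop":
            continue
        # Find the subject of the copula and predicate the adjective of it.
        for (h, d, l) in links:
            if h == cop and l == "subj":
                out.append(("EvaluationLink", "has color", (d, adj)))
    return out

ssynt = [("was", "dog", "subj"), ("was", "black", "cop")]
print(rewrite_copula(ssynt))
# [('EvaluationLink', 'has color', ('dog', 'black'))]
```

A real implementation would of course run inside the pattern matcher, and "has color" itself would have to be learned rather than hard-coded.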

The question is: given a learned grammar, with statistics, how do we get to 
DSynt, or the OpenCog variant?  Well, the now-quite-old Dekang Lin DIRT paper, 
and the newer-but-still-old Poon & Domingos unsupervised learning paper, show 
the way.

Onward ho!

Linas
--
cassette tapes - analog TV - film cameras - you
--
You received this message because you are subscribed to the Google Groups 
"link-grammar" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at https://groups.google.com/group/link-grammar.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/link-grammar/CAHrUA36aRbObkgMmOGvxO2eGr0RV6pcwrkVBUR-yua_LOYNFSg%40mail.gmail.com.
For more options, visit https://groups.google.com/d/optout.

--
Richard Hudson (dickhudson.com)


