Wow, Dick, thanks a lot for the detailed and thoughtful response!

Now I'm afraid it will be my turn to delay a bit, as I'll be traveling
and such for the next week and a half, and may not get time to dig
into this again till my return.   But I'm eager to figure out how to
adapt my preliminary formalization to account for the added subtleties
you note ;)

-- Ben


On Wed, Apr 12, 2017 at 7:25 PM, Richard Hudson <[email protected]> wrote:
> Dear Ben,
>
> Thanks very much for your musings, and sorry for the long delay at my end.
> I'm sure it's time to sort out the formal properties of WG structures, and
> I'm equally sure that you're better at this kind of thinking than I am. What
> I can report is a major rethink on WG word order rules, which is summarised
> in an article just published in the Journal of Linguistics. A
> not-quite-final version is at http://dickhudson.com/papers/#2017 , and it's
> about pied-piping, which is another exception to  the simple picture (as in
> "the book in which I saw it"). The exceptional feature here is that "in"
> takes its position from "which", in spite of the fact that "which" depends
> on "in". Here's what I now think:
>
> We need to distinguish between constraints on word order and actual order.
> The mechanism of landmarks is for constraints, but actual order is much more
> concrete, with a 'next' link from one word to the next. The difference is
> clearest in a language with free order:  every order of sister dependencies
> is grammatical, so there's no need for a distinction between before/after,
> but 'next' links are still essential because we're using a network, and
> networks don't have a left-right dimension. If we use > as the 'next' link,
> we'd distinguish John>snores from snores>John, but they would both have the
> same 'landmark' structure. (The 'landmark' system is still needed even in a
> free-order language, because I gather that clauses never scramble freely; so
> in "She warned Mary John snores", the landmark for John would be snores, not
> warned.)
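The next/landmark distinction can be sketched in a few lines of Python. Everything here (the dict representation, the function name) is an illustrative assumption of mine, not WG notation: the point is just that the same landmark structure is compatible with different 'next' chains.

```python
# Minimal sketch: 'next' links give the concrete linear order; landmark
# links are a separate, more abstract layer of constraints.
# Representation and names are illustrative assumptions, not WG-official.

def linear_order(first, next_link):
    """Follow 'next' links from the first word to recover the actual order."""
    order, w = [], first
    while w is not None:
        order.append(w)
        w = next_link.get(w)
    return order

# "She warned Mary John snores": the landmark of "John" is "snores",
# not "warned", even though the 'next' chain runs straight through.
next_link = {"She": "warned", "warned": "Mary",
             "Mary": "John", "John": "snores", "snores": None}
landmark = {"She": "warned", "Mary": "warned",
            "snores": "warned", "John": "snores"}

print(linear_order("She", next_link))
# -> ['She', 'warned', 'Mary', 'John', 'snores']
# A free-order language would allow both John>snores and snores>John,
# i.e. two different next_link dicts sharing one landmark dict.
```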
> The distinction between before/after landmarks doesn't work in default
> inheritance because there's no mechanism whereby 'before' can override
> 'after' (or vice versa). In my 2010 book I introduced a new basic relation
> 'or' specially to handle this choice, but I couldn't think of any other good
> use for 'or', so it was very very suspect. Instead, I now have a much
> simpler solution: every word (indeed, every object in the mental universe)
> has a position, and (like every other property) one position can override
> another; and it's positions, not words, that are related by 'before' and
> 'after' links, so 'landmark transitivity' is replaced by 'position
> transitivity': if A is-before B and B is-before C, then A is-before C, etc.
> For example, suppose you want to say that a dependent D follows its parent
> P:
>
> D's landmark is P.
> Each word has a position pos(x).
> pos(D) is-after pos(P).
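'Position transitivity' itself is just transitive closure over is-before pairs. A rough sketch (positions are treated as opaque tokens; the representation is my own assumption):

```python
# Sketch of 'position transitivity': if A is-before B and B is-before C,
# then A is-before C. Positions are opaque tokens; illustrative only.

def before_closure(pairs):
    """Transitive closure of a set of (a, b) is-before pairs."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for a, b in list(closure):
            for c, d in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# "A dependent D follows its parent P" as pos(P) is-before pos(D),
# plus a further dependent X after D:
constraints = {("pos(P)", "pos(D)"), ("pos(D)", "pos(X)")}
closed = before_closure(constraints)
assert ("pos(P)", "pos(X)") in closed   # P precedes X by transitivity
```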
>
> As before, the landmark of a word W is typically its parent, but:
>
> * if W has more than one parent linked by a chain of dependencies p1-p2...,
> where p(i+1) depends on p(i), the landmark is typically p1. This defines
> 'raising', but 'lowering' is also possible, as in German partial VP fronting
> (eine Concorde gelandet ist hier nie = a Concorde landed has here never = a
> Concorde has never landed here).
>
> * in a pied-piping construction, W may have a 'pipee' (e.g. "in" is the pipee
> of "which") which inherits the landmark relations of W.
>
> Returning to #1, the constraints apply to the actual order:
>
> * 'Position transitivity' applies, so that a word's position bears the same
> relation to its landmark's landmark's position as its landmark's position
> does.
> * Given a pair of adjacent words W1 > W2, each word has a position: pos(W1)
> is-before pos(W2).
> * This order is grammatical provided it's compatible with the constraints and
> position transitivity.
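The grammaticality check described here can be sketched as a compatibility test between an actual order and the (already transitively closed) constraint set. Again, names and representation are illustrative assumptions:

```python
# Sketch: an actual order (the 'next' chain, left to right) is grammatical
# iff it never contradicts the closed before-constraints. Illustrative only;
# assumes the constraint set is already transitively closed.

def is_grammatical(order, before):
    """order: list of words left to right.
    before: set of (a, b) pairs meaning pos(a) is-before pos(b)."""
    index = {w: i for i, w in enumerate(order)}
    return all(index[a] < index[b]
               for a, b in before if a in index and b in index)

before = {("P", "D")}                 # dependent D must follow parent P
assert is_grammatical(["P", "D"], before)
assert not is_grammatical(["D", "P"], before)
```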
>
> I don't think you need to bring 'proxies' into the picture, because they
> essentially add extra dependencies and only affect word order indirectly.
> Your 'sister transitivity' sounds more like a statement about actual order
> than about constraints. There may be a constraint on the order of sisters
> (e.g. I gave the students high marks vs: *I gave high marks the students),
> but it's not necessary, whereas they obviously have to be in some actual
> order. (Sister constraints are an outstanding gap in the theory, I'm
> afraid.)
> Mutual dependency (your phrases with two heads) is already covered, I think,
> because both dependencies lead to the same actual ordering.
>
> I hope that covers it. That was very hard work for a non-mathematician!
>
> Best wishes, Dick
>
>
> On 05/04/2017 23:26, Ben Goertzel wrote:
>
> Some quasi-mathematical-linguistic musings...
>
> Reviewing a bunch of familiar stuff in my mind, I’m trying to take an
> algebraic view of Word Grammar….   This is presumably equivalent to
> pregroup grammar under appropriate restrictions but it’s maybe a more
> linguistics-ish way to look at it…
>
> Consider a set of words interlinked by ordered dependency links (so
> each link has a head corresponding to the parent, and a tail
> corresponding to the child).  For reasons including those to be
> described below, it is useful to consider these dependency links as
> typed.
>
> Word Grammar tells us how, given such a set of words and links, to
> construct a set of additional (ordered) “landmark links” between the
> words.
>
> The rules thereof are as follows…
>
> The parent is the “landmark” of the child.
>
> In some cases a word may have more than one parent. In this case, the
> rule is that the landmark is the one that is superordinate to all the
> other parents. In the rare case that two words are each others’
> parents, then either may serve as the landmark.
>
> A Before landmark is one where the child is before the parent; an
> After landmark is one where the child is after the parent.
>
> Rules of “landmark transitivity” are:
>
> * Subordinate transitivity: If A is a Before/After landmark for B, and
> B is some kind of landmark for C, then A is a Before/After landmark
> for C
>
> * Sister transitivity: If A is a landmark for B, and A is also a
> landmark for C, then B is also a landmark for C
>
> * Proxy links: For certain special types T of dependency link, if A
> and B are joined by a link of type T, then if A is a landmark for C, B
> is also a landmark for C
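The three rules above can be read as a closure computation over landmark links. A sketch (taking "A is a landmark for B" as a pair (A, B); the encoding of proxy-typed dependencies as pairs is my own assumption):

```python
# Sketch of the three landmark-transitivity rules as a fixed-point closure.
# A landmark link "A is a landmark for B" is the pair (A, B); proxy_pairs
# holds (A, B) for words joined by a special dependency type T.
# Illustrative representation, not from any actual WG implementation.

def landmark_closure(landmarks, proxy_pairs):
    lm = set(landmarks)
    changed = True
    while changed:
        changed = False
        new = set()
        # Subordinate transitivity: A lm-of B, B lm-of C  =>  A lm-of C
        for a, b in lm:
            for b2, c in lm:
                if b == b2:
                    new.add((a, c))
        # Sister transitivity: A lm-of B, A lm-of C  =>  B lm-of C
        for a, b in lm:
            for a2, c in lm:
                if a == a2 and b != c:
                    new.add((b, c))
        # Proxy links: A-B joined by type T, A lm-of C  =>  B lm-of C
        for a, b in proxy_pairs:
            for a2, c in lm:
                if a == a2:
                    new.add((b, c))
        if not new <= lm:
            lm |= new
            changed = True
    return lm

assert ("A", "C") in landmark_closure({("A", "B"), ("B", "C")}, set())
assert ("B", "C") in landmark_closure({("A", "B"), ("A", "C")}, set())
assert ("B", "C") in landmark_closure({("A", "C")}, {("A", "B")})
```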
>
> The “head” of a set of words is a root of the digraph of landmark
> links in that set of words
>
> Restricting attention momentarily to the case of phrases with only one
> head, one way to look at this is: The landmark transitivity rules tell
> what happens when we carry out operations such as
>
> P1 +_T P2
>
> (putting a dependency link between the head of P1 and the head of P2,
> with P1 to the left and being at the child end of the link), or
>
> P1 +_T’ P2
>
> (putting a dependency link between the head of P1 and the head of P2,
> with P1 to the left and being at the parent end of the link)
>
> noting that this operation is not commutative, and also that the
> dependency link may have a type T which may be important (e.g. due to
> the existence of proxy links).
>
> These operations act on the space of graphs whose nodes are words and
> whose links are either typed, ordered dependency links or ordered
> landmark links, and for which the landmark links are consistent
> according to the rules laid out above.
>
> The landmark transitivity rules tell where the landmark links go in
> the combined structures P1 +_T P2 and P1 +_T’ P2, in a way that will
> maintain the consistency of the rules regarding landmarks
>
> It is not hard to see that, according to the rules of landmark
> transitivity, the free algebra formed by the multiple operations +_T,
> +_T’ is distributive, associative, and noncommutative
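The +_T / +_T' operations can be sketched concretely. The tuple representation below is an illustrative assumption (a phrase as a word sequence plus a designated head); the point is just that the operation is order-sensitive, hence noncommutative:

```python
# Sketch of the phrase-combination operators: P1 +_T P2 links the heads,
# with P1 on the left and at the child end; +_T' puts P1 at the parent end.
# A phrase is (word sequence, head); a link is (parent, child, type).
# Illustrative representation only; not compositional as written.

def combine(p1, p2, dep_type, p1_is_child=True):
    words1, head1 = p1
    words2, head2 = p2
    if p1_is_child:
        link = (head2, head1, dep_type)   # P1's head is the child: +_T
    else:
        link = (head1, head2, dep_type)   # P1's head is the parent: +_T'
    head = head2 if p1_is_child else head1
    return (words1 + words2, head, link)

p1 = (["John"], "John")
p2 = (["snores"], "snores")
left = combine(p1, p2, "subj")    # "John snores", head "snores"
right = combine(p2, p1, "subj")   # "snores John": different word order
assert left[0] != right[0]        # hence noncommutative
```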
>
> There is one hole in the above; we haven’t dealt with cases where a
> phrase has more than one head, because two words are each others’
> parents.  The easiest way to look at this formally seems to be to
> introduce operations +_Tij, where
>
> P1 +_Tij P2
>
> builds a dependency link of type T from the i’th head of P1 to the
> j’th head of P2.  We would also have operations of the form
>
> P1 +_Tij’ P2
>
> We can then see that the free algebra formed by the multiple
> operations +_T, +_T’, +_Tij, +_Tij’ is distributive, associative, and
> noncommutative...
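The indexed operators extend the same sketch. As before, the representation (a phrase as a word sequence plus a head list) is an illustrative assumption; for mutual dependency both heads survive into the combined phrase:

```python
# Sketch of +_Tij: link the i'th head of P1 to the j'th head of P2.
# A phrase is (word sequence, list of heads); a link is (parent, child, type).
# Illustrative; keeping both head lists is a simplification that covers
# the mutual-dependency case where either word may serve as landmark.

def combine_ij(p1, p2, dep_type, i, j, p1_is_child=True):
    words1, heads1 = p1
    words2, heads2 = p2
    h1, h2 = heads1[i], heads2[j]
    link = (h2, h1, dep_type) if p1_is_child else (h1, h2, dep_type)
    return (words1 + words2, heads1 + heads2, link)

p1 = (["the", "more"], ["more"])
p2 = (["the", "merrier"], ["merrier"])
combined = combine_ij(p1, p2, "comparative", 0, 0)
assert combined[2] == ("merrier", "more", "comparative")
```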
>
> A next step would be to make all these links (represented here by +
> operators) probabilistically weighted.   But I'm out of time just now
> so that will be saved for later ;) ...
>
> ben
>
>
>
>
> --
> Richard Hudson (dickhudson.com)
>
> --
> You received this message because you are subscribed to the Google Groups
> "link-grammar" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To post to this group, send email to [email protected].
> Visit this group at https://groups.google.com/group/link-grammar.
>
> For more options, visit https://groups.google.com/d/optout.



-- 
Ben Goertzel, PhD
http://goertzel.org

"I am God! I am nothing, I'm play, I am freedom, I am life. I am the
boundary, I am the peak." -- Alexander Scriabin

