This is based purely on reading the Wikipedia entry on Operator
Grammar, which I find very interesting. I'm hoping someone out there
knows enough about this to answer some questions :^)

Wikipedia says that various quantities are "learnable" because they can
in principle be determined from data. What is known about whether they
are *efficiently* learnable, e.g. (a) whether a child would acquire
enough data to learn the language and (b) whether, given the data,
learning the language would be computationally feasible (e.g. polynomial
time)?

Keep in mind that you have to learn the language well enough to
deal with the fact that you can generate and understand (and thus
pretty much have to be able to calculate the likelihood of) a
virtually infinite number of sentences never before seen.

I presume the answers to these two questions (how much data you need
and how easy it is to learn from it) will depend on how you
parametrize the knowledge you learn. So, for example,
take a word that takes two arguments. One way to parametrize
the likelihood of its arguments would be a table over
all two-word combinations, where the (i, j) entry gives the likelihood
that the ith word and the jth word are the two arguments.
But most likely, in reality, the likelihood of the jth word
will be largely pinned down conditional on the ith. So one might
imagine parametrizing these "learned" coherent selection tables
in some more powerful way that exposes underlying structure.
If you just use lookup tables, I'm guessing learning is
computationally trivial, but the data requirements are prohibitive.
On the other hand, if you posit underlying structure, you can no
doubt lower the amount of data required to be able to deal with
novel sentences, but I would expect you'd run into the standard
problem that finding the optimal structure becomes NP-hard.
At that point a heuristic might or might not suffice; that would
be an empirical question.
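
To make the table-vs-structure tradeoff concrete, here is a
back-of-the-envelope parameter count. The vocabulary size and the
low-rank latent-class factorization are my own illustrative
assumptions, not anything from Operator Grammar itself:

```python
# Sketch: parameter counts for two ways of modeling which word pairs
# (i, j) can serve as the two arguments of a given operator.

V = 50_000   # assumed vocabulary size
k = 100      # assumed number of latent classes in a factored model

# (1) Full lookup table: one likelihood per (i, j) pair.
full_table_params = V * V

# (2) Structured model, e.g. a latent-class factorization
#     P(i, j) ~ sum_c P(c) * P(i | c) * P(j | c):
factored_params = k + 2 * V * k

print(full_table_params)   # 2_500_000_000
print(factored_params)     # 10_000_100
```

The factored model needs about 250x fewer parameters here, which is
the sense in which structure buys you data efficiency, at the cost of
a harder (possibly NP-hard) fitting problem.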

Is there empirical work with this model?

Also, I don't see how you can call a model "semantic" when it makes
no reference to the world. The model as described by Wikipedia
could tell me whether a sentence is natural or highly unlikely,
but unless I misunderstand something, there is no way it could
tell me whether a sentence describes a scene.

Matt> --- Chuck Esterbrook <[EMAIL PROTECTED]> wrote:

>> Any opinions on Operator Grammar vs. Link Grammar?
>> 
>> http://en.wikipedia.org/wiki/Operator_Grammar
>> 
>> http://en.wikipedia.org/wiki/Link_grammar
>> 
>> Link Grammar seems to have spawned practical software, but Operator
>> Grammar has some compelling ideas including coherent selection,
>> information content and more. Maybe these ideas are too hard or too
>> ill-defined to implement?
>> 
>> Or, in other words, why does Link Grammar win the GoogleFight?
>> 
>> http://www.googlefight.com/index.php?lang=en_GB&word1=%22link+grammar%22&word2=%22operator+grammar%22
>> (http://tinyurl.com/yvu9xr)

Matt> Link grammar has a website and online demo at
Matt> http://www.link.cs.cmu.edu/link/submit-sentence-4.html

Matt> But as I posted earlier, it gives the same parse for:

Matt> - I ate pizza with pepperoni.
Matt> - I ate pizza with a friend.
Matt> - I ate pizza with a fork.

Matt> which shows that you can't separate syntax and semantics.  Many
Matt> grammars have this problem.
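
A toy illustration of the ambiguity Matt describes: the three
sentences share a single syntactic shape, and only lexical likelihoods
can distinguish where "with X" attaches. The association scores below
are invented stand-ins for corpus statistics, not anyone's actual
model:

```python
# Made-up head-word association scores (higher = more plausible pair).
assoc = {
    ("pizza", "pepperoni"): 0.9,   # "with pepperoni" modifies the noun
    ("eat",   "pepperoni"): 0.1,
    ("pizza", "friend"):    0.1,
    ("eat",   "friend"):    0.6,   # "with a friend" modifies the verb
    ("pizza", "fork"):      0.05,
    ("eat",   "fork"):      0.7,   # "with a fork" modifies the verb
}

def attach(noun_head, verb_head, pp_object):
    """Attach the PP to whichever head co-occurs with it more strongly."""
    if assoc[(noun_head, pp_object)] > assoc[(verb_head, pp_object)]:
        return "noun"
    return "verb"

print(attach("pizza", "eat", "pepperoni"))  # noun
print(attach("pizza", "eat", "friend"))     # verb
print(attach("pizza", "eat", "fork"))       # verb
```

A pure syntax-only parser has no access to such scores, which is why
it returns the same parse for all three sentences.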

Matt> Operator grammar seems to me to be a lot closer to the way
Matt> natural language actually works.  It includes semantics.  The
Matt> basic constraints (dependency, likelihood, and reduction) are
Matt> all learnable.  It might have gotten less attention because its
Matt> main proponent, Zellig Harris, died in 1992, just before it
Matt> became feasible to test the grammar in computational models
Matt> (e.g.  perplexity or text compression).  Also, none of his
Matt> publications are online, but you can find reviews of his books
Matt> at http://www.dmi.columbia.edu/zellig/
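
For what it's worth, the perplexity test Matt mentions is simple to
state: a model assigning probability p to each word of held-out text
is scored by the geometric mean of 1/p (lower is better). A minimal
sketch with made-up probabilities:

```python
import math

# Per-word probabilities a hypothetical model assigns to a 4-word
# held-out text (numbers invented purely for illustration).
probs = [0.1, 0.25, 0.5, 0.05]

avg_log2 = sum(math.log2(p) for p in probs) / len(probs)
perplexity = 2 ** (-avg_log2)   # equals the geometric mean of 1/p
print(round(perplexity, 3))     # 6.325
```

A grammar that captured coherent selection should assign higher
probability to attested sentences and so achieve lower perplexity than
a syntax-only baseline.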


Matt> -- Matt Mahoney, [EMAIL PROTECTED]


-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415&user_secret=e9e40a7e
