I might be wrong, but after reading some of the ADIOS papers I cannot
tell the difference between their approach and the common Bayesian
practice in computational linguistics of looking at n-gram
frequencies, often used (for instance by Google) for spell/grammar
checking, word/sentence suggestions, web-related content (like the
Google Sets project), word/phrase disambiguation, and so on. In other
words, if you google sentences and pick the ones with the highest
frequencies in their corpus (their indexed web), you will end up with
the right ones, spelling- and grammar-wise. The same goes for the
interchangeability approach, also based on n-gram frequency (including
this "end phrase" idea), which allows the method to generate new
well-formed sentences.
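A toy sketch of that frequency-picking idea (entirely my own illustration: the mini-corpus and the trigram scoring are invented, not Google's actual pipeline):

```python
from collections import Counter

# Tiny toy corpus standing in for a web-scale index (my invention; a
# real system would use n-gram counts over billions of pages).
corpus = (
    "i would like to thank you . "
    "i would like to see you . "
    "i wood like to thank you ."
).split()

# Count every trigram in the corpus.
trigrams = Counter(zip(corpus, corpus[1:], corpus[2:]))

def phrase_score(phrase):
    """Score a candidate phrase by summing the corpus counts of its trigrams."""
    words = phrase.split()
    return sum(trigrams[t] for t in zip(words, words[1:], words[2:]))

# Picking the higher-frequency candidate selects the correct spelling.
candidates = ["i would like to", "i wood like to"]
best = max(candidates, key=phrase_score)
```

The misspelled variant loses simply because its trigrams are rarer in the corpus, which is the whole trick.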

Anyway, I agree that this doesn't solve the semantic problem, but I do
think it points in the right direction, since these n-grams can be
thought of as having a joint meaning and so can be treated as semantic
atoms.
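And a toy sketch of the interchangeability idea (again my own distributional illustration under invented data, not the actual ADIOS graph algorithm):

```python
from collections import defaultdict

# Tiny invented corpus; a real system would need far more data.
sentences = [
    "the cat sat on the mat",
    "the dog sat on the mat",
    "the cat chased a bird",
]

# Map each (left-word, right-word) context to the words seen in that slot.
slots = defaultdict(set)
for s in sentences:
    w = s.split()
    for i in range(1, len(w) - 1):
        slots[(w[i - 1], w[i + 1])].add(w[i])

def substitutes(word):
    """Words sharing some slot context with `word` are interchangeable."""
    subs = set()
    for fillers in slots.values():
        if word in fillers:
            subs |= fillers
    subs.discard(word)
    return subs

# "dog" never appeared with "chased", yet interchangeability with "cat"
# generates a well-formed sentence absent from the corpus.
novel = "the cat chased a bird".replace("cat", sorted(substitutes("cat"))[0])
```

The shared slot context ("the" _ "sat") is what licenses swapping "dog" for "cat", producing a sentence the corpus never contained.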

Perhaps the novelty of ADIOS is that they have implemented a package
and found several nice applications? I don't think they have anything
to patent other than the package itself, and if they do, they
shouldn't (I just have trouble with people wanting to patent
algorithms).



On Sun, Apr 6, 2008 at 1:29 AM, Ben Goertzel <[EMAIL PROTECTED]> wrote:
> I looked through the ADIOS papers...
>
>  It's interesting work, and it reminds me of a number of other things, including
>
>  -- Borzenko's work, http://proto-mind.com/SAHIN.pdf
>
>  -- Denis Yuret's work on mutual information based grammar learning,
>  from the late 90's
>
>  -- Robert Hecht-Nielsen's much-publicized work a couple years back, on
>  automated language learning and generation
>
>  -- Tony Smith's work on automated learning of function-word based
>  grammars from text, done in his MS thesis from University of Calgary
>  in the 90's
>
>  Looking at these various things together, it does seem clear that one
>  can extract a lot of syntactic structure from free text in an
>  unsupervised manner.
>
>  It is unclear whether one can get the full syntactic subtlety of
>  everyday English though.  Every researcher in this area seems to get
>  to a certain stage (mining the simpler aspects of English syntax),
>  and then never gets any further.
>
>  However, I have another complaint to make.  Let's say you succeed with
>  this, and make an English-language-syntax recognizer that works, say,
>  as well as the link parser, by pure unsupervised learning.  That is
>  really cool but ... so what?
>
>  Syntax parsing is already not the bottleneck for AGI, we already have
>  decent parsers.  The bottleneck is semantic understanding.
>
>  Having a system that can generate random sentences is not very useful,
>  nor is having a bulky inelegant automatically learned formal-grammar
>  model of English.
>
>  If one wants to hand-craft mapping rules taking syntax parses into
>  logical relations, one is better off with a hand-crafted grammar than
>  a messier learned one.
>
>  If one wants to have the mapping from syntax into semantics be
>  learned, then probably one is better off having syntax be learned in a
>  coherent overall experiential-learning process -- i.e. as part of a
>  system learning how to interact in a world -- rather than having
>  syntax learned in an artificial, semantics-free manner via
>  corpus-mining.
>
>  In other words: suppose you could make ADIOS work for real ... how
>  would that help along the path of AGI?
>
>  -- Ben G
>
>
>
>  On Sat, Apr 5, 2008 at 8:46 AM, Evgenii Philippov <[EMAIL PROTECTED]> wrote:
>  >
>  >  On Sat, Apr 5, 2008 at 7:37 PM, Ben Goertzel <[EMAIL PROTECTED]> wrote:
>  >  >  For instance, I'll be curious whether ADIOS's automatically inferred
>  >  >  grammars can deal with recursive phrase structure, with constructs
>  >  >  like "the person with whom I ate dinner", and so forth....
>  >
>  >  ADIOS papers have a lot of remarks like "recursion is not
>  >  implemented", but I think it IS able to deal with THIS kind of
>  >  recursion... But this is TBD---I am not sure.
>  >
>  >
>  >
>  >  e
>  >
>  >  >
>  >  >  On Sat, Apr 5, 2008 at 7:57 AM, Evgenii Philippov <[EMAIL PROTECTED]> wrote:
>  >  >  >
>  >  >  >  Hello folks,
>  >  >  >
>  >  >  >
>  >  >  >  On Thu, Mar 27, 2008 at 11:06 PM, Ben Goertzel <[EMAIL PROTECTED]> wrote:
>  >  >  >  >  In general, I personally have lost interest in automated
>  >  >  >  >  inference of grammars from text corpuses, though I did play
>  >  >  >  >  with that in the 90's (and got bad results like everybody
>  >  >  >  >  else).
>  >  >  >
>  >  >  >  Uh oh! My current top priority is playing with the ADIOS
>  >  >  >  algorithm for unsupervised grammar learning, which is based on
>  >  >  >  extended Hidden Markov Models. Its results are plainly
>  >  >  >  fantastic---it is able to create a working grammar not only for
>  >  >  >  English but also for many other languages, plus languages with
>  >  >  >  spaces removed, plus DNA structure, protein structure, etc. Some
>  >  >  >  results are described in Zach Solan's papers, and the algorithm
>  >  >  >  itself is described in his dissertation.
>  >  >  >
>  >  >  >  http://www.tau.ac.il/~zsolan/papers/ZachSolanThesis.pdf
>  >  >  >  http://adios.tau.ac.il/
>  >  >  >
>  >  >  >  And its grammars are completely comprehensible to a human.
>  >  >  >  (See the homepage, papers, and the thesis for diagrams.)
>  >  >  >
>  >  >  >  Also, they can very easily be used for language generation, and Z
>  >  >  >  Solan did a lot of experiments with this.
>  >  >  >
>  >  >  >  It has no relation to Link Grammar though.
>  >  >  >
>  >  >  >
>  >  >  >  >  Automated inference of grammar from language used in embodied
>  >  >  >  >  situations interests me a lot ... and "cheating" via using
>  >  >  >  >  hand-created NLP tools may be helpful too...
>  >  >  >  >
>  >  >  >  >  But I sort of feel like automated inference of grammars from
>  >  >  >  >  corpuses may be a HARDER problem than learning grammar based
>  >  >  >  >  on embodied experience... which is hard enough...
>  >  >  >
>  >  >  >  ADIOS solves this hard problem easily. Some or all variants of
>  >  >  >  ADIOS are memory-intensive, though; I have not implemented it
>  >  >  >  completely yet.
>  >  >  >
>  >  >  >  I am doing it in Java.
>  >  >  >
>  >  >  >  Also, Google Scholar http://scholar.google.com/ shows no
>  >  >  >  evidence of substantial follow-up work by other people in the
>  >  >  >  direction of ADIOS.
>  >  >  >
>  >  >  >
>  >  >  >  >  OTOH we're talking about research here and nobody's intuition
>  >  >  >  >  is perfect ... so what you're describing could potentially be
>  >  >  >  >  a great GSOC project mentored by YOU not me .. I don't want
>  >  >  >  >  to impose my own personal intuition and taste on the whole
>  >  >  >  >  OpenCog GSOC enterprise...
>  >  >  >
>  >  >  >  --
>  >  >  >  Best regards,
>  >  >  >  Gamma
>  >  >  >  Evgenii Philippov
>  >  >  >
>  >  >
>  >  >  --
>  >  >  Ben Goertzel, PhD
>  >  >  CEO, Novamente LLC and Biomind LLC
>  >  >  Director of Research, SIAI
>  >  >  [EMAIL PROTECTED]
>  >  >
>  >  >  "If men cease to believe that they will one day become gods then they
>  >  >  will surely become worms."
>  >  >  -- Henry Miller
>  >
>  >  --
>  >  Best regards,
>  >
>  >
>  > Evgenii Philippov
>  >
>  >  --~--~---------~--~----~------------~-------~--~----~
>  >  You received this message because you are subscribed to the Google Groups 
> "OpenCog.org (Open Cognition Project)" group.
>  >  To post to this group, send email to [EMAIL PROTECTED]
>  >  To unsubscribe from this group, send email to [EMAIL PROTECTED]
>  >  For more options, visit this group at 
> http://groups.google.com/group/opencog?hl=en
>  >  -~----------~----~----~----~------~----~------~--~---
>  >
>  >
>
>
>
>  --
>  Ben Goertzel, PhD
>  CEO, Novamente LLC and Biomind LLC
>  Director of Research, SIAI
>  [EMAIL PROTECTED]
>
>  "If men cease to believe that they will one day become gods then they
>  will surely become worms."
>  -- Henry Miller
>
>  -------------------------------------------
>  agi
>  Archives: http://www.listbox.com/member/archive/303/=now
>  RSS Feed: http://www.listbox.com/member/archive/rss/303/
>  Modify Your Subscription: http://www.listbox.com/member/?&;
>  Powered by Listbox: http://www.listbox.com
>
