On Wed, May 1, 2013 at 7:28 PM, Jim Bromer <jimbro...@hotmail.com> wrote:
>
> I want to write a text-based AGI program.  I hope it will be able to learn a 
> rudimentary form of a human language concerning a small virtual world 
> described in that language by the end of the year.  The virtual world would 
> include statements concerning the imaginary world which would be an extreme 
> simplification of some aspects of the real world.  So it would include 
> details about individuals and places at the level of small neighborhood.  It 
> will also contain some more general ideas concerning existence in this 
> extremely simple virtual world.

It would be interesting if you could produce a version of SHRDLU (
http://en.wikipedia.org/wiki/SHRDLU ), but instead of hand-coding the
language rules, have it learn the language from interaction: you give
it commands and questions and tell it how it should have responded.

AFAIK, this has never been done. Even OpenCog relies on hand-coded
language rules (RelEx, NatGen). It seems straightforward enough to do
this with a neural n-gram language model over a 100-200 word
vocabulary (SHRDLU used about 50 words). It should be well within the
computing capacity of modern computers, which was not true in 1968
when SHRDLU was started. It won't be AGI, of course, but it could lead
to new insights into how natural language is learned.
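To make the neural n-gram idea concrete, here is a minimal sketch of a
trigram model: embed the two preceding words, pass them through one hidden
layer, and predict the next word with a softmax. The toy command corpus,
vocabulary, and hyperparameters below are my own illustrative choices, not
anything from SHRDLU or OpenCog; a real system would need far more training
text and a grounding in the blocks world itself.

```python
# Sketch of a neural n-gram (trigram) language model over a tiny
# SHRDLU-style command vocabulary. All data and sizes are illustrative.
import numpy as np

corpus = [
    "put the red block on the green block",
    "pick up the blue pyramid",
    "put the blue pyramid on the red block",
    "pick up the green block",
]

# Build the vocabulary (well under the 100-200 words suggested above).
words = sorted({w for line in corpus for w in line.split()})
idx = {w: i for i, w in enumerate(words)}
V, D, H, N = len(words), 8, 16, 2  # vocab, embed dim, hidden dim, context

rng = np.random.default_rng(0)
E = rng.normal(0, 0.1, (V, D))        # word embedding table
W1 = rng.normal(0, 0.1, (N * D, H))   # context -> hidden weights
W2 = rng.normal(0, 0.1, (H, V))       # hidden -> vocab logits

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Training pairs: predict each word from the N words preceding it.
data = []
for line in corpus:
    toks = [idx[w] for w in line.split()]
    for t in range(N, len(toks)):
        data.append((toks[t - N:t], toks[t]))

lr = 0.1
for epoch in range(1000):
    for ctx, tgt in data:
        x = np.concatenate([E[c] for c in ctx])   # forward pass
        h = np.tanh(x @ W1)
        p = softmax(h @ W2)
        # Backprop of the cross-entropy loss, plain SGD updates.
        dlogits = p.copy(); dlogits[tgt] -= 1.0
        dW2 = np.outer(h, dlogits)
        dh = (W2 @ dlogits) * (1 - h * h)
        dW1 = np.outer(x, dh)
        dx = W1 @ dh
        W2 -= lr * dW2; W1 -= lr * dW1
        for i, c in enumerate(ctx):
            E[c] -= lr * dx[i * D:(i + 1) * D]

def predict(w1, w2):
    """Most likely next word given the two-word context (w1, w2)."""
    x = np.concatenate([E[idx[w1]], E[idx[w2]]])
    p = softmax(np.tanh(x @ W1) @ W2)
    return words[int(p.argmax())]

print(predict("pick", "up"))  # in this corpus "up" is always followed by "the"
```

The same network scales to a 100-200 word vocabulary by just enlarging V;
the real work, as noted below, is in preparing the training scripts that
pair commands with the responses the system should have given.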

Probably most of the work will be in preparing training scripts. There
may be other domains where appropriate training data is already
available.

--
-- Matt Mahoney, mattmahone...@gmail.com


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
