--- Richard Loosemore <[EMAIL PROTECTED]> wrote:

> Matt Mahoney wrote:
> 
> > I doubt you could model sentence structure usefully with a neural network
> > capable of only a 200 word vocabulary.  By the time children learn to use
> > complete sentences they already know thousands of words after exposure to
> > hundreds of megabytes of language.  The problem seems to be about O(n^2).
> > As you double the training set size, you also need to double the number of
> > connections to represent what you learned.
> > 
> > 
> > -- Matt Mahoney, [EMAIL PROTECTED]
> 
> The problem does not need to be O(n^2).
> 
> And remember:  I used a 200 word vocabulary in a program I wrote 16 
> years ago, on a machine with only one thousandth of today's power.
> 
> And besides, solving the problem of understanding sentences could easily 
> be done in principle with even a vocabulary as small as 200 words.
> 
> Richard Loosemore.

What did your simulation actually accomplish?  What were the results?  What do
you think you could achieve on a modern computer?
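
The O(n^2) estimate quoted above can be sketched with a back-of-envelope calculation: if the number of connections grows roughly linearly with training-set size, and one training pass touches every connection for every example, total work grows quadratically.  The constant k below is an arbitrary illustrative value, not a measured one.

```python
def training_work(n_examples, k=1):
    # Assumption from the quoted claim: connections scale ~linearly with data.
    connections = k * n_examples
    # One pass: every example against every connection.
    return n_examples * connections

# Doubling the data quadruples the work, consistent with O(n^2).
print(training_work(2000) / training_work(1000))  # -> 4.0
```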

-- Matt Mahoney, [EMAIL PROTECTED]

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=4007604&user_secret=8eb45b07