----- Original Message -----
From: Pei Wang <[EMAIL PROTECTED]>
To: agi@v2.listbox.com
Sent: Saturday, October 21, 2006 5:25:13 PM
Subject: Re: [agi] SOTA
>For example, the human mind and some other AI techniques handle
>structured knowledge much better than NN does.
Is this because the brain is r
On 10/21/06, Matt Mahoney <[EMAIL PROTECTED]> wrote:
I read Pei Wang's paper, http://nars.wang.googlepages.com/wang.AGI-CNN.pdf
Some of the shortcomings of neural networks mentioned only apply to classical
(feedforward or symmetric) neural networks, not to asymmetric networks with
recurrent circuits.
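To make that distinction concrete, here is a minimal sketch in Python with numpy (the network size, weights, and update rule are illustrative assumptions, not anything from the paper): a feedforward layer computes its output in a single pass with no internal state, while a recurrent network with asymmetric weights feeds its own state back each step, giving it dynamics that can settle into attractors or carry context over time.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 8
    x = rng.standard_normal(n)           # input pattern

    # Feedforward: one weight layer, one pass, no internal state.
    W_ff = rng.standard_normal((n, n))
    y = np.tanh(W_ff @ x)                # output is a pure function of x

    # Recurrent (asymmetric weights): the state is fed back each step,
    # so the network is a dynamical system, not a single input-output map.
    W_rec = rng.standard_normal((n, n))  # deliberately not symmetric
    h = np.zeros(n)
    for t in range(20):
        h = np.tanh(W_rec @ h + x)       # state depends on its own history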
With regard to the computational requirements of AI, there is a very clear
relation: the quality of a language model improves as more time and memory
are added, as shown in the following table:
http://cs.fit.edu/~mmahoney/compression/text.html
It also improves with the size of the training set.
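As a crude illustration of the measure behind that table: a compressor is a language model plus an entropy coder, so the bits per character of the compressed output is an upper bound on the model's cross-entropy on the text. The sketch below (Python) uses zlib at increasing effort levels as a stand-in for "adding time and memory"; the filename corpus.txt is a placeholder, not part of the benchmark.

    import zlib

    # Assumption: any large plain-text sample stands in for the
    # enwik corpora used on the benchmark page.
    data = open("corpus.txt", "rb").read()   # placeholder filename

    for level in (1, 6, 9):   # higher level = more time/memory spent
        compressed = zlib.compress(data, level)
        bpc = 8 * len(compressed) / len(data)
        print(f"zlib level {level}: {bpc:.3f} bits per character")

On real text the bits per character drop as the level rises, which is the time/memory-versus-quality relation in miniature; the benchmark page makes the same comparison across far stronger models.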
Andrew,
I happen to have the list you asked for. Last year I taught a graduate
course on NN (http://www.cis.temple.edu/~pwang/525-NC/CIS525.htm), and
afterwards wrote a paper
(http://nars.wang.googlepages.com/wang.AGI-CNN.pdf) listing its
strengths and weaknesses with respect to AGI.
In the paper, I onl
On Fri, 20 Oct 2006 22:15:37 -0400, Richard Loosemore wrote:
> Matt Mahoney wrote:
> > From: Pei Wang <[EMAIL PROTECTED]>
> >> On 10/20/06, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> >
> >>> It is not that we can't come up with the right algorithms.
> >>> It's that we don't have the computing power.