Shane,

The following is what I wrote in the memo, which basically agrees with what you said.
"In principle, these three frameworks [dynamical system, inferential system, and computational system] are equivalent in their expressive power, in the sense that a virtual machine defined in one framework can be implemented by another virtual machine defined in another framework. ... Even so, for a given problem, it may be easier to find solutions in one framework than in the other frameworks. Therefore, the frameworks are not equivalent in practical applications."

Pei

On 12/18/05, Shane Legg <[EMAIL PROTECTED]> wrote:
> Pei,
>
> > > What you seem to be criticising in your memo is what I'd call
> > > "feed forward neural networks".
> >
> > I see what you mean, though in the memo I didn't rule out feedback.
>
> Recurrence makes all the difference...
>
> For example, consider a very simple neural network model: rational
> activations, fixed topologies, no learning rules or weight changes
> allowed, and just a trivial saturated-linear activation function.
>
> If you allow only feed-forward connections, then such a network can
> only compute simple piecewise-linear functions of its inputs. Pretty
> dumb.
>
> On the other hand, if you allow these networks to have recurrent
> connections, the model is in fact Turing complete. Indeed, people
> have built networks in this model that simulate classical universal
> Turing machines (see the work of Siegelmann). There are even
> compilers for high-level languages like Occam that will output a
> recurrent neural network to execute your program (see the work of
> Neto, for example).
>
> Anyway, my point is: once you allow recurrent connections, even
> trivial types of neural networks can, in theory, compute anything.
> This is why I prefer to think of NNs as a computational paradigm
> rather than a class of techniques, algorithms or methods.
>
> You could argue that NARS is a better way of thinking about or
> expressing AGI, or something like that, perhaps. Just as you might
> argue that C is a better way of programming than machine code.
> However, the limitations aren't fundamental; indeed, I could write a
> NARS system in Occam and then compile it to run as a neural network.
>
> Shane

-------
To unsubscribe, change your address, or temporarily deactivate your subscription, please go to http://v2.listbox.com/member/[EMAIL PROTECTED]
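[Editor's note: the point about recurrence can be illustrated with a toy sketch. This is not Siegelmann's construction; the latch below, its weights, and its set/reset inputs are hypothetical, chosen only to show that a recurrent connection lets even a single saturated-linear unit hold state indefinitely, which no memoryless feed-forward pass of the same unit can do.]

```python
# Illustrative sketch (not from the thread): one recurrent unit with a
# saturated-linear activation acting as a one-bit latch. The update rule
# and the "set"/"reset" input signals are hypothetical.

def sat_linear(x):
    """Saturated-linear activation: clamp x into [0, 1]."""
    return min(1.0, max(0.0, x))

def latch(inputs, state=0.0):
    """Run the recurrent unit over (set, reset) input pairs.

    Update rule: state' = sat_linear(state + set - reset).
    The recurrent connection feeds the previous state back in,
    so the unit remembers a bit between steps.
    Returns the sequence of states after each step.
    """
    states = []
    for set_sig, reset_sig in inputs:
        state = sat_linear(state + set_sig - reset_sig)
        states.append(state)
    return states

# Set the latch, hold it with no input, then reset it.
print(latch([(1, 0), (0, 0), (0, 0), (0, 1), (0, 0)]))
# → [1.0, 1.0, 1.0, 0.0, 0.0]
```

The middle two steps receive no input at all, yet the state persists; remove the recurrent `state +` term and the unit's output depends only on the current input.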
