Hello,
Thanks for that. I've taken a look at the source code, and I see that
the bulk of the processing is done in C, with R acting as a wrapper.
Below is the function I think is doing the training in the network.
I'm guessing it's standard backpropagation with a weight-decay
term?
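For reference, here is what plain backpropagation with an L2 weight-decay term looks like, sketched in Python on a toy XOR problem. This is purely illustrative of the algorithm I'm guessing at; none of it is nnet's actual code, and the network size, learning rate, and decay value below are arbitrary choices:

```python
# Sketch of backpropagation with an L2 weight-decay term
# (illustrative only -- NOT nnet's implementation).
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy dataset: XOR (chosen only because it needs a hidden layer).
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

n_in, n_hid = 2, 3
lr, decay = 0.5, 1e-4          # learning rate and weight-decay coefficient

# Hidden weights: n_hid rows of (n_in + 1) values, last one the bias.
w_h = [[random.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hid)]
# Output weights: n_hid values plus a bias.
w_o = [random.uniform(-1, 1) for _ in range(n_hid + 1)]

def forward(x):
    xin = list(x) + [1.0]
    h = [sigmoid(sum(w * v for w, v in zip(row, xin))) for row in w_h]
    o = sigmoid(sum(w * v for w, v in zip(w_o, h + [1.0])))
    return h, o

def loss():
    # Mean squared error over the whole dataset.
    return sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)

def train_epoch():
    global w_h, w_o
    for x, y in data:
        h, o = forward(x)
        # Delta at the output: derivative of squared error through a sigmoid.
        d_o = (o - y) * o * (1 - o)
        # Deltas at the hidden units, backpropagated through the old w_o.
        d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(n_hid)]
        # Gradient step with decay: w <- w - lr * (grad + decay * w).
        hin = h + [1.0]
        w_o = [w - lr * (d_o * hin[j] + decay * w) for j, w in enumerate(w_o)]
        xin = list(x) + [1.0]
        for j in range(n_hid):
            w_h[j] = [w - lr * (d_h[j] * xin[k] + decay * w)
                      for k, w in enumerate(w_h[j])]

before = loss()
for _ in range(2000):
    train_epoch()
after = loss()
```

The decay term simply adds `decay * w` to each gradient, which shrinks the weights toward zero every step; after training, `after` should be well below `before`.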
On Thu, 2006-11-23 at 09:39 +, Wee-Jin Goh wrote:
Greetings list,
I've just swapped from the neural package to the nnet package and
I've noticed that the training is orders of magnitude faster, and the
results are way more accurate.
This leads me to wonder: what training algorithm is nnet using? Is
it a modification of the standard
Just to add to this, I also need to know what language is the nnet
package written in? Is it in pure R or is it a wrapper for a C
library? It really is performing very quickly, going through 200
epochs in seconds when it took neural minutes, and neural is
written in R.
It all may sound
Wee-Jin Goh wjgoh at brookes.ac.uk writes:
Just to add to this, I also need to know what language is the nnet
package written in? Is it in pure R or is it a wrapper for a C
library?
As usual, you can download the full source to find out what you want, but it's a
bit hidden. Simply