On Friday 12 September 2003 01:56 pm, Ovid wrote:
> --- Mark Kvale <[EMAIL PROTECTED]> wrote:
> > a) AI::NeuralNet::Mesh - trains up multi-layer perceptrons, a type of
> > feedforward neural net. It has good documentation. For your problem, I
> > would recommend a 3 layer net, with one input, one hidden and one
> > output layer, with tanh activation functions.
>
> Hi all,
>
> Thanks to everyone for input.  I decided to start first with
> AI::NeuralNet::Mesh because it looks easy, but so far, I can't seem to
> train my computer to learn binary.  Below my signoff is the full program
> that I wrote.  Here are the results of my test run:

snip!

Right off the bat, you have 4 binary inputs, but only 3 input nodes:

my $net = AI::NeuralNet::Mesh->new(3,7,1);

Your neural net is telling you that the most significant digit is relevant :)
Either add a fourth input node or restrict the examples to 3-bit numbers, as I
do below.
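As a side note, the number of input nodes has to match the number of binary
digits you feed in. A pure-Perl sketch (independent of the module; to_bits is
just a hypothetical helper) makes the relationship explicit:

```perl
use strict;
use warnings;

# An n-bit number needs n input nodes: one node per binary digit.
# sprintf's %b conversion gives the zero-padded binary string.
sub to_bits {
    my ($dec, $width) = @_;
    return [ split //, sprintf "%0${width}b", $dec ];
}

print "@{ to_bits(5, 3) }\n";   # 1 0 1  -> 3 inputs
print "@{ to_bits(9, 4) }\n";   # 1 0 0 1 -> 4 inputs
```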

Second, you will have 4 inputs and 7 hidden units, resulting in 28 connections
in the first layer of weights. The second layer of weights adds seven more
connections, giving a total of 35 parameters. Whether you have enough training
data to fit that many parameters depends on the details of the learning
algorithm. See below.

Below is a program that parametrizes the number of examples and hidden units
and uses a cross-validation-style method to test network efficacy.

The results are

Number of examples used for training and testing: 10
hidden: 1       RMS error per trial: 5.47722557505166
hidden: 2       RMS error per trial: 4.7116875957559
hidden: 3       RMS error per trial: 1.44913767461894
hidden: 4       RMS error per trial: 2.77488738510232
hidden: 5       RMS error per trial: 1.54919333848297
hidden: 6       RMS error per trial: 1.92353840616713
hidden: 7       RMS error per trial: 2.38746727726266
hidden: 8       RMS error per trial: 2.72029410174709
hidden: 9       RMS error per trial: 2.0976176963403
hidden: 10      RMS error per trial: 2.91547594742265

Number of examples used for training and testing: 50
hidden: 1       RMS error per trial: 3.54118624192515
hidden: 2       RMS error per trial: 1.75499287747842
hidden: 3       RMS error per trial: 1.05830052442584
hidden: 4       RMS error per trial: 1.90787840283389
hidden: 5       RMS error per trial: 1.78885438199983
hidden: 6       RMS error per trial: 1.80554700852678
hidden: 7       RMS error per trial: 2.57681974534503
hidden: 8       RMS error per trial: 2.36220236220354
hidden: 9       RMS error per trial: 3.48998567332303
hidden: 10      RMS error per trial: 2.19544984001001

There are a couple of things to notice. First, errors decrease and then
increase with the number of hidden units, as explained in a previous email.
The optimum number of hidden units seems to be around 3 for both 10 and 50
examples. Second, AI::NeuralNet::Mesh doesn't seem to bootstrap the given
examples, so it pays to present the ones you have multiple times. This is
evidenced by the second set of results, which shows lower error and less
variance in the errors.
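One way to present each example multiple times is to repeat and shuffle the
flat [inputs], [output] pair list before handing it to learn_set. The
replicate_examples helper below is hypothetical (not part of the module), and
it assumes the alternating input/output layout used in the program at the end
of this email:

```perl
use strict;
use warnings;
use List::Util qw(shuffle);

# Repeat a flat list of ([inputs], [output], ...) pairs $times times,
# shuffling at the pair level so each input stays with its output.
sub replicate_examples {
    my ($examples, $times) = @_;
    my @pairs;
    for (my $i = 0; $i < @$examples; $i += 2) {
        push @pairs, [ $examples->[$i], $examples->[$i + 1] ];
    }
    my @out;
    push @out, map { @$_ } shuffle(@pairs) for 1 .. $times;
    return \@out;
}

my $examples = [ [0, 0, 1], [1], [0, 1, 0], [2] ];
my $big = replicate_examples($examples, 5);
print scalar(@$big), "\n";   # 2 pairs x 5 repeats x 2 slots = 20 entries
```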

At its best, a single layer with 3 hidden units gets on average to within
1.06 of the correct answer over all examples, which is OK but not perfect.
To do better, you may want to:
1) try more examples,
2) add a second hidden layer,
3) alter the activation functions for the hidden and output nodes,
4) randomize the initial weights and train multiple times, keeping the best
network. Since backprop is a gradient method, it may get stuck in local
minima, and different starting points can land in different minima.
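To see why restarts help, here is a small pure-Perl illustration (independent
of the module) of plain gradient descent on a one-dimensional function with
two minima; different starting points settle into different minima, only one
of which is the global one:

```perl
use strict;
use warnings;

# f(x) = (x^2 - 1)^2 + 0.2*x has two local minima, near x = -1 and x = +1;
# the tilt from the 0.2*x term makes the left one the global minimum.
sub f      { my $x = shift; ($x**2 - 1)**2 + 0.2 * $x }
sub fprime { my $x = shift; 4 * $x * ($x**2 - 1) + 0.2 }

# Plain gradient descent: take small steps downhill from a given start.
sub descend {
    my ($x, $rate, $steps) = @_;
    $x -= $rate * fprime($x) for 1 .. $steps;
    return $x;
}

my $left  = descend(-2, 0.01, 2000);   # settles near -1 (global minimum)
my $right = descend( 2, 0.01, 2000);   # settles near +1 (local minimum only)
printf "start -2 -> x = %.3f, f = %.3f\n", $left,  f($left);
printf "start +2 -> x = %.3f, f = %.3f\n", $right, f($right);
```

The analogue for the net is to build and train several freshly initialized
networks and keep the one with the lowest test error.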

Because of this and other reasons, the backpropagation learning method used
in this module is one of the weaker methods in use for NN learning. Quickprop
or Rprop (not implemented here) may do a better job. A C/C++ program that
implements a wide variety of NNs and learning methods is SNNS:

http://www-ra.informatik.uni-tuebingen.de/SNNS/

and there are probably others that are good, too.

Neural nets are one of the simpler machine learning paradigms, but they are
not turnkey algorithms. It is an art to pick relevant input variables, and
some playing around is needed to achieve the best results.

        -Mark


#!/usr/bin/perl

use strict;
use warnings;
use AI::NeuralNet::Mesh;

my $num_ex = shift || 10;   # number of examples for training and testing
my @bin = qw(000 001 010 011 100 101 110 111);

print "Number of examples used for training and testing: $num_ex\n";
foreach my $hidden (1..10) {

   # create new network
   my $net = AI::NeuralNet::Mesh->new(3,$hidden,1);

   # create $num_ex random examples
   my $examples = [];
   foreach (1..$num_ex) {
      my $dec = int rand 8;
      my @digits = split //, $bin[$dec];
      push @$examples, [@digits], [$dec];
   }

   # train the NN
   $net->learn_set( $examples);

   # test the NN
   my $avg = 0;
   foreach (1..$num_ex) {
      my $dec = int rand 8;
      my @digits = split //, $bin[$dec];
      my $pred = $net->run( [@digits] )->[0];
      $avg += ($dec - $pred) * ($dec - $pred);
   }
   $avg /= $num_ex;

   # print output
   print "hidden: $hidden\tRMS error per trial: ", sqrt $avg, "\n";
}
