Chris,

Yes, this should work if you get the format of the inputs/targets just right. (Note, though, that this isn't using the "pattern" feature of Conx; you're just using the named-layer method.) Here is AND again, but notice the lists inside the dicts:

from pyrobot.brain.conx import *

# create network
net = Network()

# create layers
net.addLayer('a', 1)
net.addLayer('b', 1)
net.addLayer('f', 1)
net.connect('a', 'f')
net.connect('b', 'f')

# set training corpus:
corpus = [
   [dict(a=[0.0], b=[0.0]), dict(f=[0.0])],
   [dict(a=[0.0], b=[1.0]), dict(f=[0.0])],
   [dict(a=[1.0], b=[0.0]), dict(f=[0.0])],
   [dict(a=[1.0], b=[1.0]), dict(f=[1.0])],
]

net.setInputsAndTargets(corpus)

# set learning parameters
net.setEpsilon(0.5)
net.setTolerance(0.2)
net.setReportRate(1)

# learn
net.train()

It should learn in about 16 sweeps/epochs.
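Incidentally, this is also why you got that TypeError: conx slices each input value with vector[start:start+layer.size], and a bare float can't be sliced, while a one-element list can. A minimal plain-Python sketch of the difference (no pyrobot needed; copy_activations here is my stand-in, not the real conx function):

```python
# Sketch of why input values must be lists, not bare floats:
# conx slices each value with vector[start:start+layer.size].
# (copy_activations is a stand-in name, not the real conx code.)
def copy_activations(vector, start=0, size=1):
    return vector[start:start + size]

print(copy_activations([0.0]))   # a one-element list slices fine -> [0.0]

try:
    copy_activations(0.0)        # a bare float raises TypeError
except TypeError:
    print("TypeError, as in your traceback")
```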

Patterns, on the other hand, are for when you want to name a particular vector (not a layer). For example:

p = {"apple": [1, 0, 0, 1, 0],
     "banana": [1, 1, 1, 0, 0]}

If you want to do that, then you would call net.setPatterns(p), and you could combine that with the named layers. Here is an example with both:

from pyrobot.brain.conx import *

# create network
net = Network()

# create layers
net.addLayer('a', 1)
net.addLayer('b', 1)
net.addLayer('f', 1)
net.connect('a', 'f')
net.connect('b', 'f')

p = {"zero": [0.0],
     "one" : [1.0]}

net.setPatterns(p)

# set training corpus:
corpus = [
   [dict(a="zero", b="zero"), dict(f="zero")],
   [dict(a="zero", b="one"), dict(f="zero")],
   [dict(a="one", b="zero"), dict(f="zero")],
   [dict(a="one", b="one"), dict(f="one")],
]

net.setInputsAndTargets(corpus)

# set learning parameters
net.setEpsilon(0.5)
net.setTolerance(0.2)
net.setReportRate(1)

# learn
net.train()

Notice that the lists are no longer inside the dicts; each vector now lives in the pattern dictionary itself, and the dicts refer to it by name.
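One way to picture what setPatterns does: before a datum is used, each name in the corpus is looked up in the pattern dictionary and replaced by its vector. A rough plain-Python sketch (the resolve function and its logic are my own illustration, not the conx internals):

```python
# Illustrative only: how pattern names in a corpus map back to vectors.
# The function name and logic are mine; conx does this lookup internally.
patterns = {"zero": [0.0], "one": [1.0]}

def resolve(datum):
    """Replace each pattern name in a {layer: name} dict with its vector."""
    return dict((layer, patterns[name]) for layer, name in datum.items())

print(resolve(dict(a="one", b="one")))  # -> {'a': [1.0], 'b': [1.0]}
```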

-Doug

Chris S wrote:
Hi Doug,

A while ago we talked about giving networks named inputs and outputs
for training and propagation. You mentioned this was implemented, but
I've been having problems using this feature.

As a test I've tried creating a network for simulating a simple
logical AND. The two inputs are named 'a' and 'b', and the output is
named 'f'. Below is my sample code, and the error it generates. Am I
using this feature correctly?

Regards,
Chris

from pyrobot.brain.conx import *

# create network
net = Network()

# create layers
net.addLayer('a', 1)
net.addLayer('b', 1)
net.addLayer('f', 1)
net.connect('a', 'f')
net.connect('b', 'f')

# set patterns
patterns = [
   [dict(a=0.0, b=0.0), dict(f=0.0)],
   [dict(a=0.0, b=1.0), dict(f=0.0)],
   [dict(a=1.0, b=0.0), dict(f=0.0)],
   [dict(a=1.0, b=1.0), dict(f=1.0)],
]
net.setInputsAndTargets(patterns)

# set learning parameters
net.setEpsilon(0.5)
net.setTolerance(0.2)
net.setReportRate(1)

# learn
net.train()


Conx, version 1.229 (regular speed)
Conx using seed: 1164933601.75
Traceback (most recent call last):
 File "C:\Documents and Settings\TEMP\Desktop\nntest.py", line 37, in ?
   net.train()
 File "C:\Program Files\Python24\lib\site-packages\pyrobot\brain\conx.py", line 1685, in train
   (tssErr, totalCorrect, totalCount, totalPCorrect) = self.sweep()
 File "C:\Program Files\Python24\lib\site-packages\pyrobot\brain\conx.py", line 1820, in sweep
   (error, correct, total, pcorrect) = self.step( **datum )
 File "C:\Program Files\Python24\lib\site-packages\pyrobot\brain\conx.py", line 1752, in step
   self.propagate(**args)
 File "C:\Program Files\Python24\lib\site-packages\pyrobot\brain\conx.py", line 1964, in propagate
   self.copyActivations(layer, args[key])
 File "C:\Program Files\Python24\lib\site-packages\pyrobot\brain\conx.py", line 1415, in copyActivations
   layer.copyActivations(vector[start:start+layer.size])
TypeError: unsubscriptable object
_______________________________________________
Pyro-users mailing list
[email protected]
http://emergent.brynmawr.edu/mailman/listinfo/pyro-users


