If you use an increment and decrement of 0.0, you are essentially using a randomly initialized SpatialPooler. It turns out that such a "random SP" is actually pretty decent: you will get reasonable SDRs out of it. Training mainly makes the SP more resistant to noise.
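The effect of a 0.0 increment and decrement can be seen in a simplified sketch of the permanence update rule (an illustration only, not NuPIC's actual implementation; `update_permanences` is a hypothetical helper): for each winning column, synapses with active inputs have their permanence raised by synPermActiveInc, and the rest lowered by synPermInactiveDec. With both set to 0.0, permanences never move, so the SP stays at its random initialization.

```python
def update_permanences(perms, active_inputs, syn_perm_active_inc, syn_perm_inactive_dec):
    """Return updated permanences for one winning column (simplified sketch).

    perms         -- list of floats, one per potential synapse
    active_inputs -- list of bools, True where the input bit was active
    """
    new_perms = []
    for p, active in zip(perms, active_inputs):
        if active:
            p += syn_perm_active_inc
        else:
            p -= syn_perm_inactive_dec
        # Permanences are kept in [0, 1].
        new_perms.append(min(1.0, max(0.0, p)))
    return new_perms

perms = [0.45, 0.55, 0.5]
active = [True, False, True]

# With an increment/decrement of 0.0 nothing changes -- the "random SP" case:
print(update_permanences(perms, active, 0.0, 0.0))  # [0.45, 0.55, 0.5]

# With nonzero values (e.g. 0.001 / 0.0005, as discussed in this thread),
# active synapses strengthen and inactive ones weaken on every update:
print(update_permanences(perms, active, 0.001, 0.0005))
```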
For MNIST the difference in test accuracy between a trained and an untrained SP is not large. However, training takes a lot longer, so I left it out of the code. If you want to use a trained SP, you can try the parameters below, but you will need to go through the training set 3 or 4 times. Separately, I have verified that if you train the SP, the network is much more robust to random noise than with an untrained SP. (This is different from the normal MNIST testing protocol.)

numInputs = 1024
numColumns = 12288
numActiveColumnsPerInhArea = 1600
potentialPct = 0.4
globalInhibition = 1
stimulusThreshold = 0
synPermActiveInc = 0.001
synPermInactiveDec = 0.0005
synPermConnected = 0.5
minPctOverlapDutyCycles = 0.001
minPctActiveDutyCycles = 0.001
dutyCyclePeriod = 1000
maxBoost = 3
seed = 1956  (C++ SP)

--Subutai

On Tue, Sep 22, 2015 at 7:55 PM, [email protected] <[email protected]> wrote:
> Hello, NuPIC,
> I recently tested the nupic.vision project. The results are good, but I have
> one question.
> As I understand it, the permanence values of the synapses are updated when we
> train an HTM network, and the update depends on two parameters,
> "synPermActiveInc" and "synPermInactiveDec". Am I right?
> But in the run_mnist_experiment.py example, these two parameters are 0.
> That seems strange: if we set them to 0, how does training happen? It seems
> illogical.
> Does anyone have an explanation or reference material about this experiment?
> Thank you.
> Cyan
> ------------------------------
> [email protected]
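For reference, the parameters above could be passed to NuPIC's Python SpatialPooler roughly as sketched below. This is an assumption-laden sketch: the constructor keyword names follow the Python SpatialPooler circa 2015 and may differ slightly between NuPIC versions (e.g. singular vs. plural duty-cycle names), so check your installed version's signature before using it.

```python
# Sketch only: keyword names assumed from the 2015-era NuPIC Python API.
from nupic.research.spatial_pooler import SpatialPooler

sp = SpatialPooler(
    inputDimensions=(1024,),          # numInputs = 1024
    columnDimensions=(12288,),        # numColumns = 12288
    numActiveColumnsPerInhArea=1600,
    potentialPct=0.4,
    globalInhibition=True,
    stimulusThreshold=0,
    synPermActiveInc=0.001,
    synPermInactiveDec=0.0005,
    synPermConnected=0.5,
    minPctOverlapDutyCycle=0.001,
    minPctActiveDutyCycle=0.001,
    dutyCyclePeriod=1000,
    maxBoost=3.0,
    seed=1956,                        # the thread's C++ SP seed
)
```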
