Hi Roberto,

Yes, those are called automatically. With most real data we have seen speed
slow down initially as the model learns the patterns, but the rate plateaus
after a while. The default decrement value is sufficient to handle those
noise cases. Pure random sequences are the worst case - you might need a
higher permanence decrement value.
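For reference, a minimal sketch of what raising the decrement might look like. The parameter path here (modelParams -> tpParams -> permanenceDec) is an assumption - check your own model_params file for the actual key names:

```python
# Hypothetical sketch: build a copy of the model params with a larger
# permanence decrement before passing them to ModelFactory.create().
import copy

def with_higher_perm_dec(model_params, perm_dec=0.1):
    """Return a deep copy of the params with a larger permanence decrement."""
    params = copy.deepcopy(model_params)
    params['modelParams']['tpParams']['permanenceDec'] = perm_dec
    return params

# Example with a minimal stand-in params dict (not a full NuPIC config):
base = {'modelParams': {'tpParams': {'permanenceInc': 0.10,
                                     'permanenceDec': 0.05}}}
tuned = with_higher_perm_dec(base, perm_dec=0.1)
```

The deep copy keeps the original params untouched, so you can build several models with different decrements from one base config.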

We have run models for months at a time, but in those cases we tend to feed
in one data point every 5 minutes. What you are doing is a nice stress test
of the system. Perhaps it will expose some problems that need to be fixed!
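If you want to quantify the slowdown in a stress test like this, something along these lines might help - a generic timing harness with a stand-in for model.run(), not NuPIC-specific code:

```python
# Illustrative harness: call a step function repeatedly and report the
# mean loop period per batch, so a gradual slowdown shows up clearly.
import random
import time

def run_stress_test(step, iterations=1000, report_every=100):
    """Call step() repeatedly; return the list of per-iteration periods."""
    periods = []
    last = time.time()
    for i in range(1, iterations + 1):
        step()
        now = time.time()
        periods.append(now - last)
        last = now
        if i % report_every == 0:
            batch = periods[-report_every:]
            print('iter %d: mean period %.6f s' % (i, sum(batch) / len(batch)))
    return periods

# Example with a trivial stand-in for the model call:
periods = run_stress_test(lambda: random.randint(0, 200), iterations=200)
```

Plotting or logging the returned periods over a long run would show whether the loop period grows steadily or in bursts.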

--Subutai


On Thu, Dec 3, 2015 at 8:06 PM, Roberto Becerra <[email protected]>
wrote:

>
> Hi! and thanks all for your comments,
>
> So, to recap, it is the randomness of the input that makes the system
> create more connections. There are methods described as "Updates synapses
> on segment." and "Strengthens active synapses; weakens inactive synapses" -
> I wonder if these are called automatically. I would think so, to avoid
> this clogging.
>
> Still my opinion on this is that all real data will have a fair amount of
> noise/randomness, and so it seems like this decrease in execution speed
> would be more like the norm.
>
> It looks like this is going deeper down the rabbit hole! I appreciate the
> sharing of knowledge, now I will have to go into these readings.
>
> Cheers!
>
> Roberto Becerra
> https://iobridger.wordpress.com/
>
>
>
>
> > Date: Thu, 3 Dec 2015 22:23:59 -0500
> > Subject: Re: Model Run Slowing Down
> > From: [email protected]
> > To: [email protected]
>
> >
> > Yes, that is one of the documents I am currently reading. I have to be
> > very careful because there was a lot of stuff I missed on the first
> > read, so I'm rereading.
> > -------| http://ifni.co
> >
> >
> > On Thu, Dec 3, 2015 at 5:22 PM, Matthew Taylor <[email protected]> wrote:
> > > Have you read this yet? It should provide a very long-winded answer
> > > to your question about synapses.
> > >
> > > http://arxiv.org/pdf/1511.00083.pdf
> > >
> > > ---------
> > > Matt Taylor
> > > OS Community Flag-Bearer
> > > Numenta
> > >
> > > On Thu, Dec 3, 2015 at 12:45 PM, mraptor <[email protected]> wrote:
> > >>
> > >> >
> > >> The model keeps trying to learn sequences and is creating an
> > >> increasing number of synapses and segments containing the new random
> > >> transitions that it sees.
> > >> >
> > >>
> > >> What is the mechanism/algorithm for creating/destroying segments?
> > >> And for the "attachment" of synapses?
> > >> I couldn't find an explanation of this process in the docs.
> > >>
> > >> thanks
> > >>
> > >> -------| http://ifni.co
> > >>
> > >>
> > >> On Thu, Dec 3, 2015 at 2:25 PM, Subutai Ahmad <[email protected]>
> > >> wrote:
> > >> > Roberto,
> > >> >
> > >> > This is not unexpected if you are feeding in random data all the
> > >> > time. The model keeps trying to learn sequences and is creating an
> > >> > increasing number of synapses and segments containing the new random
> > >> > transitions that it sees. If you feed in more predictable data (e.g.
> > >> > self.amplitude = (self.amplitude + 1) % 200) you should not see such
> > >> > a large increase in time. If you still see a big increase with
> > >> > predictable data then there might indeed be some memory issue.
> > >> >
> > >> > --Subutai
> > >> >
> > >> > On Thu, Dec 3, 2015 at 9:34 AM, Roberto Becerra
> > >> > <[email protected]>
> > >> > wrote:
> > >> >>
> > >> >> Hi community!
> > >> >>
> > >> >> So, I have built a very simple script just to test the speed of
> > >> >> execution of NuPIC, because I am seeing that it slows down a lot
> > >> >> after a few hours of execution. I wonder if you have observed this
> > >> >> or have any comments on something weird I might be doing. The
> > >> >> script goes like this:
> > >> >>
> > >> >> self.model = ModelFactory.create(model_params.MODEL_PARAMS)
> > >> >> self.model.enableInference({'predictedField': 'binAmplitude'})
> > >> >> self.likelihood = AnomalyLikelihood()
> > >> >> self.startTime = time.time()
> > >> >> while True:
> > >> >>     self.amplitude = random.randint(0, 200)
> > >> >>     self.result = self.model.run({"binAmplitude": self.amplitude})
> > >> >>     self.anomaly = self.result.inferences['anomalyScore']
> > >> >>     # store the probability separately so the AnomalyLikelihood
> > >> >>     # instance is not overwritten on the first iteration
> > >> >>     self.likelihoodProb = self.likelihood.anomalyProbability(
> > >> >>         self.amplitude, self.anomaly)
> > >> >>     print 'Loop Period: ' + format(time.time() - self.startTime)
> > >> >>     self.startTime = time.time()
> > >> >>
> > >> >> It is creating one model and running forever with random inputs.
> > >> >> In the beginning the Loop Period is around 0.01 seconds, or 100 Hz,
> > >> >> but as time goes on (I left it running overnight) the period
> > >> >> increased to values that are not constant, reaching up to 4
> > >> >> seconds, 10 seconds, or even 128 seconds!
> > >> >>
> > >> >> I am running on quite a limited computer, but I don't think this
> > >> >> is the cause - maybe some memory leak? Or resources that are
> > >> >> available to Python?
> > >> >>
> > >> >> OSX El Capitan (but it was happening in Yosemite as well)
> > >> >> Mac Mini Intel Core 2 Duo 2.0GHz , A1283 2GB 250GB
> > >> >>
> > >> >> What do you think of this? Thanks!
> > >> >>
> > >> >> Roberto Becerra
> > >> >> https://iobridger.wordpress.com/
> > >> >
> > >> >
> > >>
> > >
> >
>
