Valentin,

> Perhaps I have some crazy idea about what is going on. I think that the
> notion of "t" and "t-1" implicitly assumes a synchronous circuit.
> Nevertheless, biology doesn't have any clock around... it is definitely
> asynchronous. Under that assumption the previous sequence is not
> possible, since all the repeated values are the same. Therefore, I think
> that "t" and "t-1" should be redefined as the times when the input
> changed. If we feed the memory the same input at t and t-1, something is
> going to go bad in the end.


HTM theory does not have any real "time", so to speak. We're talking about
sequences, and yes, in biology (I just recently overheard this) there are
"serial" cell/column events. Now, "t-1" refers to the state the
cell/column was left in during the previous activation: cells "depolarize",
making them quicker to fire (and subsequently win the race against
inhibitory cell activations); the resulting "depolarization" is what is
modeled as the state at t-1 (AFAIK).
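To make that concrete, here is a minimal sketch (hypothetical toy code, not NuPIC) of the point that "t-1" just names the state carried over from the previous activation, whenever that input happened to arrive; there is no clock anywhere in the update rule:

```python
# Hypothetical sketch (not NuPIC code): "t-1" just names the state the
# memory was left in by the previous input, whenever it arrived.
def step(transitions, prev_active, current_active):
    """Predict from the state left by the previous activation, then
    learn the transition prev_active -> current_active."""
    predicted = transitions.get(prev_active, frozenset())
    transitions[prev_active] = current_active
    return predicted

transitions = {}
prev = frozenset()
hits = []
# Two passes over the cycle 0 -> 1 -> 2 (each value a toy one-cell SDR):
for current in [frozenset([0]), frozenset([1]), frozenset([2])] * 2:
    hits.append(step(transitions, prev, current) == current)
    prev = current
# Once a transition has been seen, it is predicted on the next pass; the
# last two steps of the second pass already come out correct.
```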

David

On Wed, May 6, 2015 at 9:52 AM, Valentin Puente <[email protected]> wrote:

> I think this problem can be solved trivially just by not breaking ties in
> leastUsedCell() randomly. I don't think it makes any difference, at least
> from a functional standpoint, whether the segment is created in one cell
> or another, since all the cells are equally loaded!
>
> After figuring this out, I was "fighting" with repetitive sequences (such
> as 000111222000111222...). None of the temporal memory implementations
> that follow the original algorithm (including the sensorimotor ideas...)
> work at all! That is, learning never stabilizes the segments. This is a
> bit shocking...
>
> Perhaps I have some crazy idea about what is going on. I think that the
> notion of "t" and "t-1" implicitly assumes a synchronous circuit.
> Nevertheless, biology doesn't have any clock around... it is definitely
> asynchronous. Under that assumption the previous sequence is not
> possible, since all the repeated values are the same. Therefore, I think
> that "t" and "t-1" should be redefined as the times when the input
> changed. If we feed the memory the same input at t and t-1, something is
> going to go bad in the end.
>
> The question is how to model "the clock" in the sequence (i.e., the
> external system is synchronous, as in this example). I think that the
> system needs two separate temporal memories: one to predict the next
> value and another to predict when it will change. I think this sync/async
> dichotomy should be addressed somehow.
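The "step only when the input changes" part of that idea could be prototyped roughly like this (a hypothetical wrapper with invented names, not anything in NuPIC):

```python
# Hypothetical wrapper (names invented, not NuPIC): the underlying
# sequence memory only sees a new "step" when the input actually changes,
# so 000111222... collapses to the event sequence 012...
class ChangeDrivenMemory(object):
    def __init__(self, compute):
        self._compute = compute    # per-step compute function to wrap
        self._last_input = None

    def feed(self, current_input):
        """Forward to compute() only on a change; repeats are absorbed."""
        if current_input != self._last_input:
            self._compute(current_input)
            self._last_input = current_input

seen = []
mem = ChangeDrivenMemory(seen.append)
for value in [0, 0, 0, 1, 1, 1, 2, 2, 2, 0, 0, 0]:
    mem.feed(value)
# seen is now [0, 1, 2, 0]: the repetitions never reach the memory.
```

A second memory, as suggested, would then be needed to predict how long each value is held.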
>
> Does this make some "sense" to you? Probably not :-)
> --
> Valentin
>
> 2015-05-04 10:17 GMT+02:00 Valentin Puente <[email protected]>:
>
>> Dear List,
>>
>>
>>
>> I’m playing a little with the algorithms, trying to really understand
>> the details, and I’m dealing with an “awkward” problem in the temporal
>> memory. I’m using the latest temporal_memory.py from the research folder.
>>
>>
>>
>> Let’s suppose a cyclic sequence “012012012…”. Under such conditions the
>> system seems to take a long time to learn the sequence (with default
>> parameters for the TP, around 280 time steps). The problem we have is
>> that the “first” 0 prevents us from closing “the loop”: since its first
>> appearance didn’t have any winner cell, no segment was created. In the
>> second pass, with a large number of empty cells, there is a high
>> probability of choosing a different cell to continue the “learning”
>> path. This forms a snake-like learning path across all the cells
>> involved in the sequence (with a number of unneeded segments
>> proportional to the cells in the column times the learning steps
>> required to reach the threshold in the synapses). In other words, the
>> same value in the sequence ends up using multiple cells in the column.
>> This “orphan” initialization seems to be really serious with more
>> complex cyclic patterns.
>>
>>
>>
>> It looks like the second pass of the learning is not working correctly.
>> Somehow, the second representation of the “zero” should be connected to
>> the segment created in the first iteration on the “1” cells. In the
>> algorithm this is prevented because the first iteration “exhausts”
>> maxNewSynapseCount, so the subsampling will not add any new connections:
>> all previously created synapses are active due to the column burst on
>> zero. The first time the “zero” is predicted correctly, the “one” is
>> mispredicted, because its segments are connected to the “wrong” (i.e.,
>> non-winning) zero cells. In this way, the segment is orphaned.
>>
>>
>>
>> That part of the algorithm looks a bit “odd” to me. Intuitively, I think
>> a segment should never be orphaned: we need to connect it to at least
>> one winning cell from the previous step (and create at least one synapse
>> if none exists). My question is… this seems pretty basic… ergo it should
>> be my fault… Has anyone else found similar issues? What is the rationale
>> for choosing prevActiveCells rather than prevWinnerCells to determine
>> activeSynapses in n = self.maxNewSynapseCount - len(activeSynapses)?
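For reference, a much-simplified sketch of the growth step in question (all names here are hypothetical stand-ins, not the actual temporal_memory.py code) shows why sampling from prevActiveCells rather than prevWinnerCells matters when a column bursts:

```python
# Hypothetical, much-simplified version of the growth step (not the
# actual temporal_memory.py code).  A segment samples up to
# n = maxNewSynapseCount - len(activeSynapses) new presynaptic cells
# from a candidate pool.
def growSynapses(segment, candidates, activeSynapses, maxNewSynapseCount):
    n = maxNewSynapseCount - len(activeSynapses)
    for cell in sorted(candidates - segment)[:max(n, 0)]:
        segment.add(cell)

prevActive = set([0, 1, 2, 3])  # every cell of a bursting column is active
prevWinner = set([2])           # but only one cell per column is a winner

segmentFromActive = set()
growSynapses(segmentFromActive, prevActive, set(), 3)
# Sampling from prevActiveCells can fill the segment with non-winner cells.

segmentFromWinner = set()
growSynapses(segmentFromWinner, prevWinner, set(), 3)
# Sampling from prevWinnerCells connects only to the learning cell.
```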
>>
>>
>>
>> Attached to the message is a simple Python script that shows the effect.
>> I’m using the RDSE encoder (and a really naïve approach). Sorry if this
>> is too naive.
>> --
>> vpuente
>>
>>
>> --------------------------8<-------------------------------------------------8<-------------------------------------------------8<-----------------------
>> #!/usr/bin/env python
>> import random
>>
>> from random import randint
>>
>> from nupic.research.temporal_memory import TemporalMemory as TP
>> from nupic.encoders.random_distributed_scalar import \
>>     RandomDistributedScalarEncoder as ENC
>>
>> #
>> # Some helpers
>> #
>> def Columns(predictedCells):
>>     """
>>     Get the columns containing the given predictive cells.
>>     """
>>     predictedCols = set()
>>
>>     for cell in predictedCells:
>>         column = tp.columnForCell(cell)
>>         predictedCols.add(column)
>>     return predictedCols
>>
>> def printSet(setToPrint):
>>     """
>>     Print a set without brackets.
>>     """
>>     for el in setToPrint:
>>         print el, ",",
>>     return
>>
>> def print2sets(value, predicted):
>>     """
>>     Print a value set and the set predicted for it.
>>     """
>>     printSet(value)
>>     print " -->",
>>     printSet(predicted)
>>     print ""
>>     return
>>
>> def generateSdr(seed, cols):
>>     """
>>     Generate an SDR representation for an integer (unused helper).
>>     """
>>     random.seed(seed * 1010101)
>>     repr = set()
>>     for i in range(int(cols * 0.02)):
>>         repr.add(randint(0, cols - 1))
>>     return repr
>>
>>
>> ##
>> ## ACTUAL CODE
>> ##
>>
>> #Cols
>> numCols=2048
>> seqLength=3
>>
>> enc= ENC(
>>          name='enc', resolution=1, w=21,
>>          n=numCols, offset = 0.0
>> )
>>
>> tp = TP(columnDimensions=(numCols,),
>>         cellsPerColumn=32,
>>         activationThreshold=13,
>>         learningRadius=2048,
>>         initialPermanence=0.21,
>>         connectedPermanence=0.50,
>>         minThreshold=10,
>>         maxNewSynapseCount=20,
>>         permanenceIncrement=0.10,
>>         permanenceDecrement=0.10,
>>         seed=4)
>>
>>
>> # Value initialization
>> sequence = []
>> for i in range(seqLength):
>>   seti = set()  # fresh set per value; reusing one set merges all SDRs
>>   e0 = enc.encode(i * 10)
>>   for index, j in enumerate(e0):
>>     if j == 1:
>>       seti.add(index)
>>   print seti
>>   sequence.append(seti)
>>
>> repetition=1
>> lastval=set()
>>
>> # Loop
>> for i in range(500):
>>   index = -1
>>   for value in sequence:  # "input" would shadow the builtin
>>     index = index + 1
>>     for rep in range(repetition):
>>       print "%s ( %s )" % (i, index),
>>       if value == Columns(tp.predictiveCells):
>>         print "ok"
>>       else:
>>         print2sets(value, Columns(tp.predictiveCells))
>>       tp.compute(value, learn=True)
>>       lastval = value
>>
>>
>
>
> --
> vpuente
>



-- 
*With kind regards,*

David Ray
Java Solutions Architect

*cortical.io <http://cortical.io/>*
Sponsor of:  HTM.java <https://github.com/numenta/htm.java>

[email protected]
http://cortical.io
