Hi Wakan,

A while back I worked up a very rough example of connecting the algorithms
directly in Python. Maybe it will be of some help?
(see attached)

Cheers,
David

On Wed, Jan 13, 2016 at 8:59 AM, 박진만 <[email protected]> wrote:

> I did not use the Network API.  I built the structure by combining example
> code such as hello_sp.py and hello_tm.py from
> https://github.com/numenta/nupic/tree/master/examples
>
> I made my midi encoder by combining 3 scalar encoders and 1 category
> encoder. I used the FastCLAClassifier from nupic.bindings.algorithms.
>
> I want to build a hierarchical HTM structure using that raw, simple code
> rather than the Network API.
>
> Thank you.
>
> 2016-01-13 23:35 GMT+09:00 Wakan Tanka <[email protected]>:
>
>> Hello,
>> May I ask which framework you are using, OPF or the Network API? From what
>> you've typed I guess it is the Network API. I'm wondering how you built:
>> raw midi data -> Encoder -> Spatial Pooler -> Temporal Pooler -> CLA
>> Classifier -> prediction
>> Also, what encoder did you use? Did you follow some example code?
>>
>> Thank you
>>
>> On 01/13/2016 02:47 PM, 박진만 wrote:
>>
>>> Hello, I'm a newbie to NuPIC and the NuPIC mailing list.
>>>
>>> I'm working on training midi files (*.mid, a kind of music file) using
>>> low-level code.
>>>
>>> By low-level code I mean the raw SP and TP code, not the Network API or
>>> OPF. I prefer low-level code because it's easier for me to modify.
>>> I used a simple structure like this:
>>> raw midi data -> Encoder -> Spatial Pooler -> Temporal Pooler -> CLA
>>> Classifier -> prediction
>>> The result was quite awesome: the HTM successfully predicted the whole
>>> sequence with no errors.
>>>
>>> Then I wanted to change the structure to be hierarchical like this :
>>> raw midi data -> Encoder -> SP1 -> TP1 -> SP2 -> TP2 -> CLA Classifier
>>> -> prediction
>>>
>>> but I cannot implement the structure because I don't know how layers 1
>>> and 2 are linked. I have already looked at "hierarchy_network_demo.py",
>>> but that code just says "UniformLink".
>>> What does the term "UniformLink" mean?
>>>
>>> I think it would be a strange architecture if TP1's output (an array of
>>> cells) became the input of SP2, because in this hierarchy layer 2 (SP2
>>> and TP2) would have a bigger column dimension than layer 1, which seems
>>> somehow weird.
>>>
>>> To sum up, my questions are:
>>>
>>>  1. How are the layers linked?
>>>  2. Are there any hierarchy examples at the low level (not the Network
>>> API, not OPF)?
>>>
>>> Any comments would be very helpful.
>>>
>>> Thank you.
>>>
>>>
>>
>>
>
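To make the linking question concrete: one possible answer (a sketch only, not NuPIC's official mechanism) is to collapse TM1's active cells back down to a binary, column-level vector and use that as SP2's input. SP2's inputDimensions then match level 1's column count, and SP2's columnDimensions need not be bigger than level 1's. The helper name below is hypothetical, not part of the NuPIC API:

```python
import numpy as np

def cells_to_input(active_cells, cells_per_column, num_columns):
    """Collapse a set of active TM cell indices into a binary,
    column-level vector suitable as input to the next level's SP.
    (Hypothetical helper -- not part of the NuPIC API.)"""
    vec = np.zeros(num_columns, dtype=np.uint8)
    for cell in active_cells:
        # Each column owns a contiguous block of cells_per_column cells,
        # so integer division recovers the owning column index.
        vec[cell // cells_per_column] = 1
    return vec

# Suppose level 1 has 20 columns with 6 cells each, and TM1 reports
# these active cells after a compute step:
active_cells = {3, 4, 17, 30, 95}
sp2_input = cells_to_input(active_cells, cells_per_column=6, num_columns=20)
print(sp2_input.nonzero()[0].tolist())  # -> [0, 2, 5, 15]
```

Note that cells 3 and 4 collapse onto the same column (0), so the level-2 input stays at the column resolution; if you instead want SP2 to see the full cell-level state, feed it a vector of length num_columns * cells_per_column with one bit per cell.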


-- 
*With kind regards,*

David Ray
Java Solutions Architect

*Cortical.io <http://cortical.io/>*
Sponsor of:  HTM.java <https://github.com/numenta/htm.java>

[email protected]
http://cortical.io
'''
Created on Feb 8, 2015

@author: David Ray
'''

import numpy as np

from nupic.encoders.scalar import ScalarEncoder
from nupic.algorithms.CLAClassifier import CLAClassifier
from nupic.research.spatial_pooler import SpatialPooler
from nupic.research.temporal_memory import TemporalMemory


class Layer(object):
    """ Makeshift Layer to contain and operate on algorithmic entities """

    def __init__(self, encoder, sp, tm, classifier):
        
        self.encoder = encoder
        self.sp = sp
        self.tm = tm
        self.classifier = classifier
        self.theNum = 0
       
        
    def input(self, value, recordNum, sequenceNum):
        """ Feed one record through the Layer's components """

        dayNames = {
            1: "Monday (1)", 2: "Tuesday (2)", 3: "Wednesday (3)",
            4: "Thursday (4)", 5: "Friday (5)", 6: "Saturday (6)",
        }
        recordOut = dayNames.get(recordNum, "Sunday (7)")

        if recordNum == 1:
            self.theNum += 1
            print "--------------------------------------------------------"
            print "Iteration: " + str(self.theNum)

        print "===== " + str(recordOut) + " - Sequence Num: " + str(sequenceNum) + " ====="

        output = np.zeros(self.sp._columnDimensions)

        # Input through encoder
        print "ScalarEncoder Input = " + str(value)
        encoding = self.encoder.encode(value)
        print "ScalarEncoder Output = " + str(encoding)
        bucketIdx = self.encoder.getBucketIndices(value)[0]

        # Input through spatial pooler
        self.sp.compute(encoding, True, output)
        print "SpatialPooler Output = " + str(np.where(output > 0)[0])

        # Input through temporal memory: the active columns as a sorted set
        activeColumns = set(sorted(np.where(output > 0)[0].flat))
        self.tm.compute(activeColumns, True)
        activeCells = self.tm.activeCells
        print "TemporalMemory Input = " + str(activeColumns)

        # Input into classifier; sequenceNum is the monotonically increasing
        # record index the classifier expects for its bookkeeping
        retVal = self.classifier.compute(recordNum=sequenceNum,
            patternNZ=activeCells,
            classification={'bucketIdx': bucketIdx, 'actValue': value},
            learn=True,
            infer=True
        )

        print "TemporalMemory Prediction = " + str(getSDR(activeCells)) +\
            "  |  CLAClassifier 1 step prob = " + str(retVal[1])
        print ""
        

def getSDR(cells):
    """ Map TM cell indices back to their owning column indices """
    retVal = set()
    for cell in cells:
        retVal.add(cell // tm.cellsPerColumn)
    return retVal
        
def runThroughLayer(layer, recordNum, sequenceNum):
    # In this demo the input value is simply the day number itself
    layer.input(recordNum, recordNum, sequenceNum)
        
        
if __name__ == '__main__':
    encoder = ScalarEncoder(
        n = 8,
        w = 3,
        radius = 0,
        minval = 1,
        maxval = 8,
        periodic = True,
        forced = True,
        resolution = 0
    )
    
    sp = SpatialPooler(
        inputDimensions = (8,),
        columnDimensions = (20,),
        potentialRadius = 12,
        potentialPct = 0.5,
        globalInhibition = True,
        localAreaDensity = -1.0,
        numActiveColumnsPerInhArea = 5.0,
        stimulusThreshold = 1.0,
        synPermInactiveDec = 0.0005,
        synPermActiveInc = 0.0015,
        synPermConnected = 0.1,
        minPctOverlapDutyCycle = 0.1,
        minPctActiveDutyCycle = 0.1,
        dutyCyclePeriod = 10, 
        maxBoost = 10.0,
        seed = 42,
        spVerbosity = 0
    )
    
    tm = TemporalMemory(
        columnDimensions = (20,),  # must match the SP's columnDimensions
        cellsPerColumn = 6,
        initialPermanence = 0.2,
        connectedPermanence = 0.8,
        minThreshold = 5,
        maxNewSynapseCount = 6,
        permanenceDecrement = 0.1,
        permanenceIncrement = 0.1,
        activationThreshold = 4
    )
    
    classifier = CLAClassifier(
        steps = [1],
        alpha = 0.1,
        actValueAlpha = 0.3,
        verbosity = 0
    )
    
    sp.printParameters()
    print ""
    
    layer = Layer(encoder, sp, tm, classifier)
    
    i = 1
    for x in range(2000):
        if i == 1:
            tm.reset()
            
        runThroughLayer(layer, i, x)
        i = 1 if i == 7 else i + 1
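A note on reading the classifier output printed by the script above: the `retVal` returned by `classifier.compute` maps each step count to a per-bucket probability list, alongside an `'actualValues'` list giving the value each bucket represents. A minimal sketch of turning that into a concrete prediction (the `retVal` contents below are made-up illustration, not real output):

```python
import numpy as np

# Hypothetical classifier result: probabilities for the 1-step-ahead
# prediction, one entry per bucket, plus each bucket's actual value.
retVal = {
    'actualValues': [1, 2, 3, 4, 5, 6, 7, 8],
    1: [0.01, 0.02, 0.05, 0.80, 0.05, 0.03, 0.02, 0.02],
}
best = int(np.argmax(retVal[1]))          # index of the most probable bucket
predicted = retVal['actualValues'][best]  # the value that bucket stands for
print(predicted)  # -> 4
```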