Look at Temporal Delay Line Neural Networks

On 5/6/2015 1:16 PM, Valentin Puente wrote:


    Repetitions are significant in the sequence. Remember, we're not
    "calculating", we're simply activating columns and cells in a
    pattern; reinforcing affinities of connections - not doing
    operations which yield a "final result". We're modeling neural
    circuitry, not building an equivalent formula calculator. It takes
    some getting used to :-)

    Actually, the implementation is *totally* event-driven. If there
    are no inputs, nothing happens! :-)


Yes. The problem appears with repetitions of the same input value. I was thinking not of doing computations but of making predictions. To do that, we need to predict both the "amplitude" and the "duration" of each step in the signal. But when you try to put both in the "same place", something can go wrong. Intuitively, creating synapses to segments of cells in the same column (which is required to predict multiple repetitions of the same value) looks inefficient. Perhaps the "sense of time" should be "stored" somewhere else?
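One way to picture "storing the sense of time somewhere else" is to separate the two quantities before they ever reach the columns: run-length encode the input stream into (value, duration) pairs, so repetitions become an explicit duration to predict rather than chains of cells within the same column. This is just an illustrative sketch of the intuition; the function name is hypothetical and nothing here is NuPIC API.

```python
def run_length_encode(signal):
    """Collapse consecutive repetitions into (value, duration) pairs,
    so amplitude and duration can be predicted separately."""
    encoded = []
    for value in signal:
        if encoded and encoded[-1][0] == value:
            # Same value as the previous step: extend its duration.
            encoded[-1] = (value, encoded[-1][1] + 1)
        else:
            # New value: start a new step with duration 1.
            encoded.append((value, 1))
    return encoded

# "A A A B B C" becomes amplitude steps with explicit durations:
print(run_length_encode(["A", "A", "A", "B", "B", "C"]))
# [('A', 3), ('B', 2), ('C', 1)]
```

With this representation, predicting three repetitions of "A" is a prediction about one number (duration 3) instead of three transitions between cells of the same column.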

The implementation is very pythonic (simple and really elegant!) :-) Nevertheless, when I was talking about event-driven, I was thinking of "chaining" events in time the way hardware simulators usually do (e.g. if a cell is predicted active, schedule a callback in an event queue to do the learning at "t+1" for that particular cell... or better, when the input changes :-). Perhaps that way it would be possible to avoid some overhead from constructors, destructors, iterators, etc., certainly at the expense of code clarity.
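The hardware-simulator style I mean could be sketched like this: instead of visiting every cell on every tick, push a learning callback onto a time-ordered queue only for cells that became predictive, and pop events in time order. This is a toy sketch of the scheduling idea under my own assumptions, not how the actual implementation works; the class and function names are made up.

```python
import heapq

class EventQueue:
    """Minimal discrete-event queue, in the style of hardware simulators."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker for events scheduled at the same time

    def schedule(self, time, callback):
        heapq.heappush(self._heap, (time, self._counter, callback))
        self._counter += 1

    def run_until(self, end_time):
        # Process events in time order; idle periods cost nothing.
        while self._heap and self._heap[0][0] <= end_time:
            time, _, callback = heapq.heappop(self._heap)
            callback(time)

fired = []

def learn(cell):
    # Hypothetical learning step for one cell, deferred to t+1.
    def _cb(t):
        fired.append((t, cell))
    return _cb

q = EventQueue()
# Cells 3 and 7 become predictive at t=0, so learning is
# scheduled only for them, at t=1 - no other cell is touched.
q.schedule(1, learn(3))
q.schedule(1, learn(7))
q.run_until(5)
print(fired)  # [(1, 3), (1, 7)]
```

The point of the sketch is that work is proportional to the number of scheduled events, not to the total number of cells, which is where the overhead saving would come from.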

Thanks (and once again, sorry for my noobness :-)
--
vpuente


