+1 - This is correct and was previously implemented this way under the
title 'reconstruction.' The classifier is an aberration that should be
removed, or moved off the mainline path.

The classifier was implemented to improve product results and to provide a
fast way to do multistep prediction.
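To make that concrete, here is a toy sketch of the kind of pairing the classifier does (invented names, not the real CLAClassifier API): it learns which input buckets tend to follow each active SDR bit, then votes at inference time, which is what makes prediction cheap.

```python
from collections import defaultdict

class SketchClassifier:
    """Toy single-step classifier: learns which input buckets tend to
    follow each active SDR bit, then votes at inference time.
    Illustrative only -- not the actual CLAClassifier implementation."""

    def __init__(self):
        # bit index -> bucket -> vote count
        self.votes = defaultdict(lambda: defaultdict(int))

    def learn(self, active_bits, bucket):
        # strengthen the association between each active bit and the bucket
        for bit in active_bits:
            self.votes[bit][bucket] += 1

    def infer(self, active_bits):
        # sum per-bit votes and normalize to a likelihood per bucket
        totals = defaultdict(float)
        for bit in active_bits:
            counts = self.votes[bit]
            n = sum(counts.values())
            for bucket, c in counts.items():
                totals[bucket] += c / n
        s = sum(totals.values()) or 1.0
        return {b: v / s for b, v in totals.items()}

clf = SketchClassifier()
clf.learn({3, 7, 11}, 0)   # SDR bits -> bucket 0
clf.learn({3, 7, 12}, 1)   # overlapping SDR -> bucket 1
likelihoods = clf.infer({3, 7, 11})   # bucket 0 should dominate
```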

Ian


On Fri, Nov 22, 2013 at 9:41 AM, Marek Otahal <[email protected]> wrote:

> In the following text I'll describe what (I think) the Classifier does,
> why I consider it wrong, and what can be done to avoid it.
>
> 1/ What is a Classifier:
> The wiki doesn't say much; I gather it's the part of a Region that
> "translates SDRs back to input-space".
>
> Looking at the code in py/nupic/algorithms/CLAClassifier.py (and its C++
> sibling) I see there's a compute() function that basically pairs an SDR
> (from the lower layer) with the input that caused it, am I right?
>
> *) There's a KNNClassifier, which uses the k-nearest-neighbours
> algorithm, and CLAClassifier, which uses what? An SP? A feed-forward NN
> seems a good candidate for implementing such a classifier.
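To illustrate the pairing role of compute() described above, here is a toy sketch (names invented here, not the real API): it associates the SDR seen n steps ago with the input arriving now, which is also how multistep prediction falls out of the same mechanism.

```python
from collections import deque, defaultdict

class PairingSketch:
    """Toy illustration of the 'pairing' role: associate the SDR seen
    n_steps ago with the input arriving now. Hypothetical names --
    not the CLAClassifier implementation."""

    def __init__(self, n_steps=1):
        self.n_steps = n_steps
        self.history = deque(maxlen=n_steps)  # the most recent SDRs
        self.pairs = defaultdict(list)        # frozen SDR -> inputs it preceded

    def compute(self, sdr, actual_input):
        # pair the SDR from n_steps ago with the input arriving now
        if len(self.history) == self.n_steps:
            self.pairs[frozenset(self.history[0])].append(actual_input)
        self.history.append(frozenset(sdr))
        # "prediction": inputs previously seen n_steps after this SDR
        return list(self.pairs.get(frozenset(sdr), []))

p = PairingSketch(n_steps=1)
p.compute({1, 2}, "a")
p.compute({3, 4}, "b")              # records that SDR {1,2} preceded "b"
predicted = p.compute({1, 2}, "c")  # -> ["b"]
```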
>
>
>
> 2/ Why I consider it "wrong"
>
> 2.1/ The Classifier does not have a biological counterpart like the
> other parts of HTM/CLA do.
>
> The chain:
>  input --> encoder --> SP --> TP --> Classifier??!
>
> The input can be whatever we have sensors to perceive, e.g. a "sound
> wave"; it can be of any data type - this is the input-space.
>
> The encoder plays the role of the sensory organ - e.g. "the cochlea
> translates vibrations into nerve impulses on its cells"; it translates
> from the input-space to a bit-vector (not an SDR, however).
>
> SP+TP are combined in the brain in a (micro-)region; both accept and
> produce an SDR.
>
> The Classifier has no counterpart because the brain has no need to
> translate back to the input-space. We, however, do need it for CLAs to
> be useful in practical problems.
>
>
> 2.2/ Lack of top-down compute in SP and TP breaks modularity.
>
> The Encoder has encode() and decode()/topDownCompute() methods. SP and
> TP don't. To make these two useful building blocks, it would be
> necessary to have an inverse of compute() in them too.
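To sketch what that symmetry could look like (the interface name and methods are invented here; neither SP nor TP has this today), every building block would pair its forward compute() with an approximate inverse, the way the Encoder pairs encode() with decode():

```python
from abc import ABC, abstractmethod

class TwoWayModule(ABC):
    """Hypothetical symmetric interface: every block maps forward
    (compute) and back toward input-space (top_down_compute)."""

    @abstractmethod
    def compute(self, bottom_up):
        """Input-side representation -> output-side representation."""

    @abstractmethod
    def top_down_compute(self, top_down):
        """Approximate inverse: output-side -> input-side."""

class PassThrough(TwoWayModule):
    """Trivial concrete module so the interface is runnable:
    the inverse exactly undoes the forward pass."""

    def compute(self, bottom_up):
        return list(bottom_up)

    def top_down_compute(self, top_down):
        return list(top_down)
```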
>
> In nature, there are four types of connections between cells/neurons in
> the brain. Two vertical: feed-forward connections feeding input to the
> higher layer, and recurrent connections feeding more stable patterns
> back to lower layers. And two horizontal: predictive connections (used
> in the TP) and inhibitory connections (missing in NuPIC).
>
> The inhibitory connections are missing for performance reasons (I
> think); we use global/local n-best inhibition instead. This makes it
> impossible to reconstruct, from a list of active columns (an SDR), the
> input that caused it. If we had the inhibitory permanences, we could use
> them in reverse to boost the columns that had been silenced, and then,
> from these "active+" columns, turn ON the appropriate synapses according
> to their permanences.
>
>
>
> Such an implementation would be slower, but interesting for bio-inspired
> research; it would let the SDR be the messaging format between the parts
> of a CLA (even across different implementations) and reduce the room for
> errors in the Classifier.
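The reverse pass described above can be sketched minimally (all names and sizes invented here; a real SP has no such method, and this ignores the inhibitory-permanence boosting, running only the proximal connections backwards):

```python
import random

random.seed(42)
N_IN, N_COLS, N_ACTIVE = 32, 16, 4

# proximal "permanences", reduced to a connected-synapse set per column
connected = [
    {i for i in range(N_IN) if random.random() > 0.5}
    for _ in range(N_COLS)
]

def sp_compute(active_inputs):
    # feed-forward overlap, then global n-best "inhibition"
    overlaps = [len(connected[c] & active_inputs) for c in range(N_COLS)]
    ranked = sorted(range(N_COLS), key=overlaps.__getitem__)
    return set(ranked[-N_ACTIVE:])  # the winning columns (the SDR)

def reconstruct(active_columns):
    # reverse pass: each winning column turns ON the input bits it is
    # strongly connected to ("use the permanences in reverse")
    bits = set()
    for c in active_columns:
        bits |= connected[c]
    return bits

x = {2, 9, 17, 25}                    # some active input bits
approx = reconstruct(sp_compute(x))   # lossy estimate of x
```

The reconstruction is lossy: the winning columns' connected sets cover more bits than the original input, which is part of why a dedicated classifier was used instead.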
>
> Do my thoughts go wrong anywhere? What do you think?
>
> Cheers, Mark
>
>
> --
> Marek Otahal :o)
>
> _______________________________________________
> nupic mailing list
> [email protected]
> http://lists.numenta.org/mailman/listinfo/nupic_lists.numenta.org
>
>
