David, thank you for the detailed explanation.

On 2 February 2015 at 13:25, cogmission1 . <[email protected]>
wrote:

> Dinesh,
>
> First of all, when you say NuPIC, it helps to be clear about what exactly
> that is. NuPIC is made up of a set of components and can be accessed and
> used from a few different levels of abstraction. There is the OPF (Online
> Prediction Framework), which is the "highest" level to work with. Then
> there is the Network API, which lets you combine components for the
> specific job you want to get done. Finally, there is working with the
> individual components directly, which is the most fine-grained level and
> requires the most technical know-how (most people do not work at this
> level).
>
> NuPIC is a framework that, given a *stream* (or sequence) of inputs,
> delivers a prediction and a confidence level for that prediction. It
> consists of several lower-level components:
>
> 1. Swarming. A module implementing an algorithmic approach to converging
> on the best configuration parameters to use with NuPIC. (I believe it
> hasn't yet been pulled out so that it can be used independently of the
> OPF, but that is a current effort - I may be wrong or not up to date.)
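>
> The idea can be sketched in plain Python. This is a toy brute-force
> parameter search for illustration only - real swarming uses a
> particle-swarm-style search, not exhaustive enumeration, and the scoring
> function and parameter names here are made up:

```python
import itertools

def toy_swarm(score, grid):
    """Toy stand-in for swarming: score every parameter combination
    in a grid and keep the best one. (Real swarming explores the
    space incrementally rather than exhaustively.)"""
    best_params, best_score = None, float("-inf")
    for values in itertools.product(*grid.values()):
        params = dict(zip(grid.keys(), values))
        s = score(params)
        if s > best_score:
            best_params, best_score = params, s
    return best_params

# Hypothetical scoring function that peaks at n=400, w=21.
score = lambda p: -abs(p["n"] - 400) - abs(p["w"] - 21)
best = toy_swarm(score, {"n": [100, 400, 700], "w": [11, 21, 31]})
```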
>
> 2. Encoders. These take input of various forms and normalize it across
> all possible inputs of a given type to produce a representation suitable
> for input into other NuPIC components. This output adheres to a few
> constraints, some of which are configurable, such as how many bits to use
> for a representation, how many of those bits are to be "on" bits, etc.
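>
> A minimal sketch of the idea in plain Python - this is a toy scalar
> encoder for illustration, not NuPIC's actual encoder API:

```python
def encode_scalar(value, minval, maxval, n=100, w=11):
    """Toy scalar encoder: map a value in [minval, maxval] to an
    n-bit vector with a contiguous block of w 'on' bits. Nearby
    values share on-bits, so semantic similarity is preserved."""
    # Clamp into range, then pick where the block of on-bits starts.
    value = max(minval, min(maxval, value))
    start = int(round((value - minval) / (maxval - minval) * (n - w)))
    return [1 if start <= i < start + w else 0 for i in range(n)]

# Two nearby values produce encodings that overlap in many bits.
a = encode_scalar(20.0, 0.0, 100.0)
b = encode_scalar(21.0, 0.0, 100.0)
overlap = sum(x & y for x, y in zip(a, b))
```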
>
> 3. Spatial Pooler. This takes in an encoded bit vector and outputs an SDR
> (Sparse Distributed Representation). The SDR is guaranteed to have certain
> properties, and it models aspects of the biology - namely cortical
> columns, which contain neurons. SDRs have properties such as consistency
> and resistance to noise (slight differences in the resulting bit vectors
> don't have a large relative effect on the semantics).
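>
> The noise-resistance property can be demonstrated with plain Python sets
> standing in for SDRs (the sizes below are just typical-looking numbers,
> not taken from NuPIC's defaults):

```python
import random

def overlap(a, b):
    """Overlap score: number of positions active in both SDRs."""
    return len(a & b)

random.seed(42)
n, active = 2048, 40          # a sparse representation: ~2% of bits on
sdr = set(random.sample(range(n), active))

# Corrupt the SDR by moving a few of its active bits elsewhere (noise).
noisy = set(sdr)
for bit in random.sample(sorted(sdr), 4):
    noisy.discard(bit)
    noisy.add(random.randrange(n))

# An unrelated random SDR, for comparison.
other = set(random.sample(range(n), active))
```

Despite the corruption, `sdr` and `noisy` still overlap heavily, while the unrelated `other` barely overlaps at all - small differences don't destroy the semantics.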
>
> 4. Temporal Memory. In most cases this takes an SDR processed by a
> Spatial Pooler - though not always, depending on the task being
> accomplished. It provides the machinery for prediction and sequence
> learning, and it also outputs an SDR.
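>
> The flavor of sequence learning can be sketched with a toy first-order
> model over symbols - real Temporal Memory learns high-order sequences
> over SDRs, so this is an analogy only:

```python
from collections import defaultdict

class ToySequenceMemory:
    """Toy stand-in for Temporal Memory: counts first-order
    transitions between symbols and predicts the next symbol
    as the most frequently seen successor."""
    def __init__(self):
        self.transitions = defaultdict(lambda: defaultdict(int))
        self.prev = None

    def step(self, symbol):
        # Strengthen the link from the previous symbol to this one,
        # then predict the likeliest successor of the current symbol.
        if self.prev is not None:
            self.transitions[self.prev][symbol] += 1
        self.prev = symbol
        succ = self.transitions[symbol]
        return max(succ, key=succ.get) if succ else None

tm = ToySequenceMemory()
for s in "ABCABCABC":
    prediction = tm.step(s)
# After seeing the repeating sequence, the model predicts 'A' after 'C'.
```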
>
> 5. Classifier. This provides the statistical correlation between a
> prediction and the input that caused it. It is in most cases added as the
> final layer of processing (but doesn't always have to be used).
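>
> A toy sketch of the decoding idea - learn which input value tends to
> co-occur with each active bit, then vote. This only illustrates the
> concept; it is not NuPIC's classifier implementation:

```python
from collections import defaultdict

class ToyClassifier:
    """Toy stand-in for a classifier: tallies which input value
    co-occurs with each active bit, then decodes an SDR back to
    a value by summing those tallies as votes."""
    def __init__(self):
        self.votes = defaultdict(lambda: defaultdict(int))

    def learn(self, active_bits, value):
        for bit in active_bits:
            self.votes[bit][value] += 1

    def infer(self, active_bits):
        tally = defaultdict(int)
        for bit in active_bits:
            for value, count in self.votes[bit].items():
                tally[value] += count
        return max(tally, key=tally.get) if tally else None

clf = ToyClassifier()
clf.learn({1, 2, 3}, "cat")
clf.learn({3, 4, 5}, "dog")
result = clf.infer({1, 2})   # bits seen only with "cat"
```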
>
> The site at Numenta.org has wikis and links to videos[1][2][3] which
> explain each of these components. This is just a rough overview; it would
> help to peruse those resources and then come back with more questions,
> which are always welcome :)
>
> Hope this high level overview helps!
>
> Welcome aboard!
>
> David
>
> 1. http://www.numenta.org
> 2. https://github.com/numenta
> 3. https://github.com/numenta/nupic/wiki
>
> On Mon, Feb 2, 2015 at 1:15 AM, Dinesh Deshmukh <[email protected]>
> wrote:
>
>> I would like to know the different modules of NuPIC in depth. Can you
>> suggest any links that explain how the NuPIC code flows?
>> I know an abstract view of what HTM is, but I want to understand it at a
>> programming level.
>>
>> What makes NuPIC about 600 MB in size? I mean, what kind of features does
>> it have?
>>
>> Thank you all.
>>
>
>
>
> --
> *We find it hard to hear what another is saying because of how loudly "who
> one is", speaks...*
>
