Hi Rik,

Very stimulating and well-expressed thoughts there. While I agree that
there is no "clear blue water" separating "computers" and "memory systems"
(each defined in their traditional sense, importantly), I don't think the
HTM quite fits into that spectrum. Let me explain.

The classic memory-performance tradeoffs you describe involve only a
decision as to *when* the computation takes place, not the manner of
computation. So, for example, you can perform your sine calculation for
each required value either when you need it, or else do them all beforehand
and store them in a lookup table. In both cases the sine is calculated for
each value in an identical fashion, using some equivalent of a Turing
machine.
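To make that concrete, here is a minimal Python sketch of the two ends of
the spectrum (the table size of 1000 matches your example; the rest of
the names and choices are my own illustration):

```python
import math

# "Memory" end: precompute 1000 samples of sin(x) over [0, 2*pi).
N = 1000
STEP = 2 * math.pi / N
TABLE = [math.sin(i * STEP) for i in range(N)]

def sin_lookup(x):
    # All the computation happened beforehand; this is just a
    # nearest-neighbour table lookup.
    i = round((x % (2 * math.pi)) / STEP) % N
    return TABLE[i]

def sin_compute(x):
    # "Computer" end: the identical calculation, performed on demand.
    return math.sin(x)
```

In both cases the identical calculation is performed; the table merely
moves it earlier in time.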

In the HTM, however, there is *never* a calculation, only memory. The
only things that you could consider "calculations" are transformations
and combinations of memories and perceptions, but these do not resemble
the calculations or logic of a von Neumann computer at all. The only
"computing
elements" in HTM are the individual neurons, which produce activity states
and prediction states based on combining feedforward, local excitatory and
inhibitory influences in a manner described in the CLA.

The result is that, even in NuPIC's tiny fragment of neocortex, we have
hundreds of millions of connections in a non-linear circuit, with 60,000
"nano-computers" simultaneously processing the streaming data, using and
then updating the hundreds of millions of permanence values, in a
cyclical, complex, non-linear feedback system. While technically
possible, it is pointless to analyse this system at the level of all the
numbers in the model; it is far more fruitful to study it as a system
which produces stable recognition, emergent learning, and predictive and
anomaly-detection functionality.

Considering that NuPIC is a drastic simplification of the real
neocortex, the real brain as a "memory system" falls well outside the
range of your other examples. Interestingly, it is the failure of
"expert systems", agent-based systems, etc. in the field of Artificial
Intelligence that prompted Jeff's foray into investigating the brain as
a totally non-computing memory system.

A very simple example of how we (very expensively, unreliably and
slowly) actually do calculations: by learning, combining and generating
sequence memories.

To add 13 and 18:

Recall the sentence "Add the rightmost digits".
Read the numbers; learn (short-term sequence memory) the phrase "3 and 8".
Remember the (long-term memory) sentence "3 and 8 is 11".
Recall the sentence "Write down the rightmost digit" = ???1.
Recall the sentence "Carry the other digit if 10 or higher" (a 1).
Recall the sentence "Add the rightmost remaining digits (plus the carry)".
... Recall 1 + 1 is 2, + 1 (carry) is (remember) 3 [this involves even
more steps than the ones column].

Answer is 31.

The above involves millions of neurons and probably hundreds of sequence
memories. It also involves many, many levels of hierarchy. That's
ignoring all the visual system processing needed to even read the
question, the audio memory needed to verify that the times-table
recollection "sounds right", the "hearing yourself think" needed to
store the short-term items, the attention-directing circuitry needed to
orchestrate the learned and practised sequences for performing addition,
and so on.
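For what it's worth, the procedure above can be caricatured in Python as
a pure exercise in recall: a table of memorised addition "sentences"
plus the learned column-by-column routine. All the names here are my own
illustration, of course, not anything from NuPIC:

```python
# Long-term memory: the learned "sentences" such as "3 and 8 is 11",
# stored as a lookup table of single-digit addition facts. (Building the
# table with `a + b` is just shorthand for writing out all hundred
# memorised sentences by hand.)
ADDITION_FACTS = {(a, b): a + b for a in range(10) for b in range(10)}

def add_by_recall(x, y):
    # Rightmost digit first, as in the recalled procedure.
    xs = [int(d) for d in str(x)][::-1]
    ys = [int(d) for d in str(y)][::-1]
    carry, out = 0, []
    for i in range(max(len(xs), len(ys))):
        a = xs[i] if i < len(xs) else 0
        b = ys[i] if i < len(ys) else 0
        s = ADDITION_FACTS[(a, b)] + carry  # "Recall: a and b is s"
        out.append(s % 10)   # "Write down the rightmost digit"
        carry = s // 10      # "Carry the other digit if 10 or higher"
    if carry:
        out.append(carry)
    return int("".join(str(d) for d in reversed(out)))
```

Note that no arithmetic beyond lookup happens per column; all the
"addition" was baked into the memorised facts beforehand.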

Contrast the above with the performance, not of computers, but of
geniuses of mental arithmetic, capable of multiplying 15-digit numbers
in seconds. These savants do not follow the same program as the rest of
us; they have instead practised and learned entirely different methods
of manipulating long sequences of digits, usually by effectively
learning very large sets of "times tables" involving 3 or 4 digits at a
time, or by some other non-traditional method. The speed and accuracy
they demonstrate could only be achieved by using sequence memory, not
exhaustive computation.

Secondly, you mention the use of SDR-like representations. Indeed, many
applications are now looking at the advantages of more "holographic",
less dense, noise-tolerant representations. But the key difference with
the CLA is the use of sequence memory to reliably store and process
these representations. If sequences are stored, not directly, but using
the emergent predictive holographic storage of the CLA, then the SDRs
can not only be recognised with higher certainty (allowing for higher
confidence in their identity) but also be used to constrain, and thus
improve, the recognition of information which is distributed over time
as well as over individual SDRs.
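A rough Python sketch of that first property, higher-certainty
recognition: the dimensions of 2048 bits with 40 active are typical HTM
figures, but the rest is my own toy illustration, not the CLA itself:

```python
import random

N_BITS, N_ACTIVE = 2048, 40  # typical HTM SDR dimensions

def make_sdr(rng):
    # An SDR modelled as the set of its active bit indices.
    return frozenset(rng.sample(range(N_BITS), N_ACTIVE))

def overlap(a, b):
    # Recognition is by overlap count, not exact match.
    return len(a & b)

def corrupt(sdr, n_flips, rng):
    # Deactivate n_flips of the active bits and activate random others.
    kept = rng.sample(sorted(sdr), N_ACTIVE - n_flips)
    extras = rng.sample(sorted(set(range(N_BITS)) - sdr), n_flips)
    return frozenset(kept) | frozenset(extras)

rng = random.Random(42)
a, b = make_sdr(rng), make_sdr(rng)
noisy_a = corrupt(a, 10, rng)
# Even with a quarter of its bits flipped, noisy_a still overlaps a in
# 30 of 40 bits, while two unrelated SDRs overlap in under one bit on
# average -- so recognition stays essentially certain despite the noise.
```

The sparseness does the work: matching 30 of 40 bits out of 2048 by
chance is astronomically unlikely, which is what makes the recognition
so reliable.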

Any computation system which involves a) on-line learning of streaming
data, b) SDRs, and c) sequence memory is a "memory system" in the sense
of the HTM/CLA, and would have some equivalence in that class to the
neocortex, just as a von Neumann machine is equivalent to a universal
Turing machine.

Regards,

Fergal Byrne



On Sun, Oct 13, 2013 at 11:25 AM, Rik <[email protected]> wrote:

> Hi,
>
> HTM "is a memory system, not a computer" says Jeff. I have a problem
> with this purported division of computational systems (whatever the
> umbrella term) into these two categories.
>
> Computer science knows many examples that fall somewhere between the
> two, leading to the suspicion that there is a continuum, and where there
> is a continuum, maybe the distinction turns out to be an implementation
> detail, or entirely superficial? Looking into this may not just be an
> argument about semantics, it may help find areas of application for
> today's early-stage HTMs, an important task. But more on this later.
>
> Let's first look at examples such as:
> - To trade precision for performance, a 'computer' might refrain from
> computing the results of "sin x" for various values of x and instead
> look them up for the value nearest to each x in a table of 1000
> pre-computed values. Memory system or computer?
> - For much added precision at little performance cost, the stored values
> are looked up for the nearest higher and lower neighbors of x in the
> table and then interpolated between the two. Memory system or computer?
> - With memoization, computer programs compute the value of pure
> functions -- especially the non-arithmetic type -- once for each set of
> inputs, then store the outputs in a table for future lookup. Memory
> system or computer?
> - In "logic programming" (think Prolog) programs 'compute' results
> through a series of lookups of stored facts and rules which are then
> processed using some Boolean logic. Memory system or computer?
> - A geospatial database may find the stored latitude/longitude points
> closest to point x by computing the Euclidean distance of each to x and
> then selecting the n lowest distances. Memory system or computer?
> - Last but not least, retrieval of data from RAM may be seen as a
> massively parallel computation where each memory cell compares (a
> computation!) its own address to that on the address bus and puts its
> content onto the data bus if they match. This becomes more interesting
> if the 'address' is of a more complex structure (such as SDRs), e.g. see
> the book "Sparse Distributed Memory" (Pentti Kanerva) for this. This
> also turns the common wisdom of a "storage vs. computation trade-off"
> into a "serial vs. parallel computation" trade-off. Is RAM a memory
> system or a computer?
>
> Now let's look at another seeming distinction between HTM memory systems
> and computers. Computers seem to perform computations in steps one after
> another where a program counter is moving from one line of code to the
> next. An HTM on the other hand is a set of connected autonomous elements
> that pass messages between one another. This is assuming you really have
> a hierarchy which the current version of CLA does not implement --
> whether a CLA region in itself can be seen as a set of autonomous
> message-passing elements is t.b.d.
>
> So now at least at this higher 'architectural' level there is a
> distinction? But the program counter view of computer programming has
> been found to be a mistake now. It was probably never a good idea but
> various forces including having to resort to multi-core and distributed
> systems for ever more processing power and the need to express
> conceptually independent temporal sequences, even if later computed
> serially, have given rise to dataflow/reactive/actor-based/etc.
> programming models -- same concept under different names --, that
> implement the HTM view of autonomous communicating elements.
>
> Memory system or computer, still looking different?
>
> Last but not least: HTM 'computes' using SDRs, whereas computers use
> other data types such as integers, floats, strings and records
> (structs). But are computers justified in doing so? Should data in
> computer programs be replaced by SDRs? In at least some cases that's not
> so completely unreasonable. The paper "Hyperdimensional Computing"
> (Pentti Kanerva, again) shows how at least some ordinary types of data
> and their corresponding computations can be turned into distributed
> representations. This is usually trading off precision for operations in
> constant space and time which may be appropriate for many operations
> currently implemented using traditional data types.
>
> So seeing that there is a sliding scale from memory systems to
> computers, can we actually implement a metaphorical slider that I can
> slide all the way from one end of the spectrum to the other? The above
> mentioned principle of pre-computing values of say, sin x, for later
> lookup can be implemented on a sliding scale from all lookup and no
> computing (i.e. pre-computing *all* values for a desired precision) to
> no lookup and all computing.
>
> Can the other aspects of traditional computer programs as discussed
> above be "HTM-ified", too, in incremental steps, according to
> yet-to-be-found "HTM-ification recipes"? Such as adding memoization or
> logic-type lookup, then transforming, piece by piece, the 'imperative'
> program counter type code into a dataflow/reactive model, then replacing
> some of the traditional data types with SDRs?
>
> A historical precedent here is the arrival of digital signal processing
> in an electronic world which before had only been analog. However no one
> went and proclaimed an "analog-digital dichotomy" where the digital
> proponents ignored existing analog devices and only interfaced with the
> analog world at the edges. Instead, existing analog devices were
> digitized step by step by replacing analog components and concepts with
> digital ones as they were invented, analog and digital parts tightly
> integrated and concepts coexisting peacefully. The TV industry is only
> now throwing the last analog concepts out of TVs, such as "tuning" into
> a channel, as TV signals are going all-digital.
>
> The expectation of such an "HTM-ification" is that this would open up
> possibilities to include HTM-like qualities such as learning and
> prediction in traditional computer systems. Possibilities that were
> there all along but not evident. Let's say your hotel reservation
> system becomes an occupancy prediction system. Finding applications for
> today's toy-sized HTMs is a Hard Problem, both due to the counter
> intuitive nature of the HTM as a tool (intelligent but not human-like)
> and a suspected sort of Stockholm syndrome in software development where
> the toolset available has shaped the mindset. HTM-ification of computer
> programs would make these applications evident.
>
> All of this is speculation at this point but I work on projects on a
> day-to-day basis, mostly high-redundancy network applications, that are
> implemented as reactive hierarchies, prefer constant memory/computation
> cost over precision, work more on lookups than arithmetic computations,
> cache ("memorize") data etc. and as such seem to be easy targets for
> HTM-ification. This work is in progress.
>
> Comments?
>
> Rik
>
> _______________________________________________
> nupic mailing list
> [email protected]
> http://lists.numenta.org/mailman/listinfo/nupic_lists.numenta.org
>



-- 

Fergal Byrne

Brenter IT <http://www.examsupport.ie>
[email protected] +353 83 4214179
Formerly of Adnet [email protected] http://www.adnet.ie
