Jeff,

Ah, you're right, I missed that part. I will adjust
the graphs to match your description...

Scott, you're right too; I've fixed that error as well.
Thank you! I'm adding some annotations and a legend
to make the graphs clearer. I have time on the y-axis,
although typically that goes on the x-axis.
I'm open to ideas if anyone would like to see this
presented differently... I'll post a revision tomorrow.


To answer your question about the increase in
combinations, I understand it like this:

In the current scalar encoder (as I understand it),
we would only get 37 unique combinations, or
SDRs, from the input range of 0 dB to 114 dB.
Anything above or below that would be clipped.

In your proposed system, it becomes a combinatorics
problem that can be described with the combination
formula. The total number of unique ways to choose
r items from a set of n is given by nCr, which is
n! / (r!(n - r)!).

In my example, I have 41 bits with 5 always on,
so the total number of unique combinations of 5 on
bits in a set of 41 total bits, with n = 41 and r = 5, is

 41! / (5! 36!) = 749,398
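To sanity-check the arithmetic, here is a quick sketch in Python using the bit counts from my example (math.comb is in the standard library as of Python 3.8):

```python
import math

# Choose r = 5 "on" bits out of n = 41 total bits.
n, r = 41, 5

# math.comb computes n! / (r! * (n - r)!) exactly, using big integers.
total = math.comb(n, r)
print(total)  # → 749398
```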

Unless I'm not seeing this correctly, this is a huge
increase in the number of SDRs that can be made
with the same number of bits, and the range is
also increased dramatically.

My example is not 2% sparse, and running the
math on your numbers, 500! / (20! 480!), is beyond
my calculator, but it will no doubt show a large
increase. I'm sure you will get far more than 481
combinations of 20 on bits from a 500-bit pool.
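Python handles integers of arbitrary size, so the calculator problem goes away. A quick sketch, assuming your numbers are n = 500 and w = 20, and that the 481 figure is the sliding-window count n - w + 1:

```python
import math

n, w = 500, 20

# Sliding-window scalar encoder: contiguous runs of w on bits
# yield only n - w + 1 distinct SDRs.
sliding = n - w + 1
print(sliding)  # → 481

# Combinatorial encoding: any w of the n bits may be on.
combinatorial = math.comb(n, w)
print(combinatorial > sliding)  # → True, by many orders of magnitude
```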

Am I seeing this correctly? Is this huge increase in
combinations going to saturate the CLA? This is
the area where I got stuck in my own research: I found
that the number of spatial patterns exploded as I moved
up the hierarchy, and not having discovered the idea
of using sparseness to simplify things, I couldn't see
a solution. I'm trying to imagine the impact this new
encoder will have. Are you thinking of allowing the
encoder to generate all possible combinations or
just a subset of them? If so, how do you see this playing
out? If not, how many should it allow?

Thanks!  : )

Patrick

On Oct 31, 2013, at 3:24 PM, Jeff Hawkins wrote:

> Patrick,
> Yes, I think you have the idea captured in these very nice images.
> 
> As I understand the images, the x-axis is a list of encoding bits.  The
> y-axis shows encodings of successively increasing scalar input values.
> 
> In my original email, at the end, I suggested that in hindsight we could
> have just picked encoder bits randomly.  As you say, this would be cleaner
> and eliminate the strangeness of starting to assign coding bits one way and
> then switching to a different, random, method.  This would also reduce the
> possibility of having some edge condition as we switch from the first method
> to the second.
> 
> I don't see why this would increase the number of possible combinations.
> Jeff

_______________________________________________
nupic mailing list
[email protected]
http://lists.numenta.org/mailman/listinfo/nupic_lists.numenta.org
