I don't think it has anything to do with the size of the input data but
rather the size of the region(s) you're trying to serialize.  For example,
I had an SP region with 12288 columns encoding an SDR that was probably 40k
bits.  I think if your SP/TP is big enough, you can feed it just 1 record
and it will still fail when saving.


On Thu, Aug 21, 2014 at 9:50 AM, Marek Otahal <[email protected]> wrote:

> I think Python 3 is not realistic for us now.
>
> What they suggest is to pickle in smaller parts, and we do have code for
> that:
> "workaround is to pickle your array yourself in smaller parts, or use
> some other file format than Python pickles."
>
> If you can make a test case with large data, it would be helpful.
>
> Cheers,
>
> _______________________________________________
> nupic mailing list
> [email protected]
> http://lists.numenta.org/mailman/listinfo/nupic_lists.numenta.org
>
>
