Yes, I think so.

On Thu, Aug 21, 2014 at 1:42 PM, Matthew Taylor <[email protected]> wrote:

> Does this capture the issue well enough:
> https://github.com/numenta/nupic/issues/1231 ?
> ---------
> Matt Taylor
> OS Community Flag-Bearer
> Numenta
>
>
> On Thu, Aug 21, 2014 at 7:46 AM, Ryan Belcher <[email protected]> wrote:
> > I don't think it has anything to do with the size of the data (input),
> > but rather the size of the region(s) you're trying to serialize. For
> > example, I had an SP region with 12288 columns encoding an SDR that was
> > probably 40k bits. I think if your SP/TP is big enough, you can feed it
> > just one record and it will fail when saving.
> >
> >
> > On Thu, Aug 21, 2014 at 9:50 AM, Marek Otahal <[email protected]>
> wrote:
> >>
> >> I think Python 3 is not realistic for us right now.
> >>
> >> What they suggest is to pickle in smaller parts, and we do have code
> >> for that:
> >> "workaround is to pickle your array yourself in smaller parts, or use
> >> some other file format than Python pickles."
> >>
> >> If you can make a test case with large data, it would be helpful.
> >>
> >> Cheers,
> >>
> >> _______________________________________________
> >> nupic mailing list
> >> [email protected]
> >> http://lists.numenta.org/mailman/listinfo/nupic_lists.numenta.org
> >>
> >
> >
>
