Hi friends of NuPIC!

I have been running swarms on CSV data files around 3 MB in size, and I have
found that the process uses more than 6 GB of RAM.  If I run swarms on data
files larger than that, my computer (which has 8 GB of RAM) runs out of
memory and hangs.  In particular, I tried swarming on a 13 MB data file and
it froze very quickly.  Memory usage seems to climb monotonically during the
swarming process and is released all at once on completion.

I am wondering whether anyone has experience swarming large (>10 MB) CSV
files, and what memory consumption you observed.  Ideally I'd like to be
able to swarm over much larger datasets (on the order of a hundred
megabytes).
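As a stopgap, I have been considering swarming over a truncated subset of the data instead of the full file.  Here is a rough sketch of what I mean (the file paths and row count are just placeholders, and I am assuming the usual 3-row NuPIC CSV header of field names, types, and flags):

```python
import csv

def truncate_csv(src, dst, max_rows, header_rows=3):
    """Copy the header rows plus the first max_rows data rows of a CSV.

    NuPIC input files carry 3 header rows (field names, field types,
    special flags), hence the default header_rows=3.
    """
    with open(src, "r") as fin, open(dst, "w", newline="") as fout:
        reader = csv.reader(fin)
        writer = csv.writer(fout)
        # Copy the header rows through unchanged.
        for _ in range(header_rows):
            writer.writerow(next(reader))
        # Copy only the first max_rows data rows.
        for i, row in enumerate(reader):
            if i >= max_rows:
                break
            writer.writerow(row)

# Tiny demo with a synthetic file (real paths would differ):
with open("big.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["timestamp", "value"])   # field names
    w.writerow(["datetime", "float"])    # field types
    w.writerow(["T", ""])                # flags
    for i in range(100000):
        w.writerow([i, i * 0.5])

truncate_csv("big.csv", "small.csv", max_rows=5000)
```

Of course this only sidesteps the problem by keeping the input small enough to swarm, rather than fixing the memory growth itself.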

Thanks,

Ritchie Lee
Research Engineer
Carnegie Mellon University-Silicon Valley
NASA Ames Research Center
Bldg 23, Rm 115
Moffett Field, CA 94035
(650) 335-2847
_______________________________________________
nupic mailing list
[email protected]
http://lists.numenta.org/mailman/listinfo/nupic_lists.numenta.org
