sstein...@gmail.com wrote:
See this article for some more info about the reported sizes of things:
http://www.doughellmann.com/PyMOTW/sys/limits.html
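For illustration only, here is a quick way to poke at the kinds of numbers that article covers (sys.maxint, sys.maxsize, sys.getsizeof) on the Python 2.6 build mentioned later in this thread; the exact values will differ by platform:

import sys

print sys.maxint            # largest plain int before Python 2 switches to long
print sys.maxsize           # largest size a container can report / be indexed by
print sys.getsizeof([])     # bytes used by an empty list
print sys.getsizeof({})     # bytes used by an empty dict
print sys.getsizeof('abc')  # bytes used by a short string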
I posted this question on Stack Overflow. I now have a better appreciation
of ssteinerX's suggestions of the link above and of guppy.
The code is below. The files are about 5 MB and 230,000 rows each. I have 43
of them, and when I get to the 35th (reading it in) my system gets so slow
that it is nearly unusable. I am on a Mac, and Activity Monitor shows that
Python is using 2.99 GB of memory (of 4 GB). (Python 2.6, 64-bit.)
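The code itself did not survive in this copy of the thread, so purely as a sketch (assuming the files are plain CSV-like text and the rows are currently being accumulated in lists), one way to keep memory flat is to process rows as they are read rather than holding 43 x 230,000 of them at once:

import csv

def iter_rows(filenames):
    # Yield one parsed row at a time instead of building per-file lists.
    for fname in filenames:
        with open(fname, 'rb') as f:
            for row in csv.reader(f):
                yield row

# Hypothetical usage: count rows across all files without keeping them in memory.
filenames = ['data_%02d.csv' % i for i in range(1, 44)]
print sum(1 for row in iter_rows(filenames))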
On Sat, Feb 20, 2010 at 5:07 PM, Vincent Davis vinc...@vincentdavis.net wrote:
Here is a sample of the output. It almost instantly uses 2 GB and then starts
using virtual memory. This is probably the right suggestion, but it's another
thing to install.
It's probably also worth being aware of guppy's heapy stuff:
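Roughly, and assuming guppy is installed (it is a separate package), a minimal heapy session looks something like this; the data_01.csv load is just a placeholder for whatever is actually being read:

from guppy import hpy

hp = hpy()
hp.setrelheap()    # measure only objects allocated after this point

rows = [line.split(',') for line in open('data_01.csv')]   # placeholder load

print hp.heap()    # breakdown of live objects by type and total size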
On Sat, Feb 20, 2010 at 5:53 PM, Vincent Davis vinc...@vincentdavis.net wrote:
On Sat, Feb 20, 2010 at 6:44 PM, Jonathan Gardner jgard...@jonathangardner.net wrote:
With this kind of data set, you should start looking at BDBs or
PostgreSQL to hold your data. While processing files this large
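This is not Jonathan's exact setup spelled out, but the idea can be sketched with the stdlib sqlite3 module standing in for BDB/PostgreSQL: stream each file straight into a disk-backed table so the rows never all live in Python objects at once (the table layout and file names here are made up):

import csv
import glob
import sqlite3

conn = sqlite3.connect('rows.db')
conn.execute('CREATE TABLE IF NOT EXISTS rows (fname TEXT, col1 TEXT, col2 TEXT)')

for fname in glob.glob('data_*.csv'):
    with open(fname, 'rb') as f:
        conn.executemany('INSERT INTO rows VALUES (?, ?, ?)',
                         ((fname, r[0], r[1]) for r in csv.reader(f)))
    conn.commit()    # rows are on disk now; query back only the slices you need

Once the data is in a database, later passes can query just the pieces they need instead of rereading and re-holding everything, which is the point of the suggestion.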