sstein...@gmail.com <sstein...@gmail.com> wrote:
> See this article for some more info about the reported sizes of things:
> http://www.doughellmann.com/PyMOTW/sys/limits.html

I posted this question on Stack Overflow. I now have a better appreciation
of ssteinerX's suggestion of the above link and guppy; Pympler was also
suggested.

http://stackoverflow.com/questions/2306523/reading-text-files-into-list-then-storing-in-dictionay-fills-system-memory-a
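The small getsizeof() number quoted below is actually expected behavior: sys.getsizeof() reports only the container object's own footprint, not the objects it references, which is why a dict holding millions of rows can report just a few kilobytes. A minimal sketch of summing sizes recursively, using a hypothetical stand-in for the alldata structure (names are illustrative, and the code is written for current Python rather than 2.6):

```python
import sys

def total_size(obj, seen=None):
    """Recursively sum sys.getsizeof over an object and everything it references."""
    if seen is None:
        seen = set()
    if id(obj) in seen:          # avoid double-counting shared objects
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(total_size(k, seen) + total_size(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(total_size(item, seen) for item in obj)
    return size

# Hypothetical stand-in for `alldata`: one key per file, each holding many rows.
alldata = {"file%d" % i: [["col"] * 10 for _ in range(1000)] for i in range(5)}

shallow = sys.getsizeof(alldata)  # counts only the dict object itself
deep = total_size(alldata)        # also counts every list and string it holds

print(shallow, deep)              # deep is far larger than shallow
```

Tools like guppy/heapy and Pympler's asizeof do this kind of deep accounting for you, which is why they were suggested.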


Thanks again


Vincent Davis
720-301-3003
vinc...@vincentdavis.net
my blog <http://vincentdavis.net> | LinkedIn <http://www.linkedin.com/in/vincentdavis>


On Sat, Feb 20, 2010 at 7:15 PM, sstein...@gmail.com <sstein...@gmail.com>wrote:

>
> On Feb 20, 2010, at 8:05 PM, Vincent Davis wrote:
>
> Code is below. The files are about 5 MB and 230,000 rows each. There are 43
> of them, and when I get to reading in the 35th my system gets so slow that
> it is nearly unusable. I am on a Mac, and Activity Monitor shows that Python
> is using 2.99 GB of memory (of 4 GB) (Python 2.6, 64-bit). getsizeof()
> returns 6424 bytes for alldata, so I am not sure what is happening.
>
>
>
> See this article for some more info about the reported sizes of things:
> http://www.doughellmann.com/PyMOTW/sys/limits.html
>
> S
>
>
-- 
http://mail.python.org/mailman/listinfo/python-list
