On Aug 15, 12:55 pm, Chris Withers <ch...@simplistix.co.uk> wrote:
> Hi All,
>
> I thought this was fixed back in Python 2.5, but I guess not?
>
> So, I'm playing in an interactive session:
>
>  >>> from xlrd import open_workbook
>  >>> b = open_workbook('some.xls',pickleable=0,formatting_info=1)
>
> At this point, top shows the process usage for python to be about 500Mb.
> That's okay, I'd expect that, b is big ;-)
>
>  >>> del b
>
> However, top still shows about 500Mb; maybe the garbage collector
> needs a kick?
>
>  >>> import gc
>  >>> gc.collect()
> 702614
>
> Nope, still 500Mb. What gives? How can I make Python give the memory it's
> no longer using back to the OS?
> [...]

Can you get the same effect without using the xlrd module? I don't
have xlrd installed on my system (OS X 10.5/Intel), but I just tried
the following:

Python 2.6.2 (r262:71600, Jun 17 2009, 09:08:27)
[GCC 4.0.1 (Apple Inc. build 5490)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> b = 'x'*(10**9)
>>> f = open('testfile.txt', 'w')
>>> f.write(b)
>>> del b
>>> f = open('testfile.txt')
>>> b = f.read()
>>> del b

and got the expected memory usage for my Python process, as
displayed by top:  memory usage went up to nearly 1Gb after
each assignment to b, then dropped down to 19 Mb or so after
each 'del b'.  I get similar results under Python 2.5.
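
Incidentally, if you'd rather not eyeball top, the same experiment can
be scripted. A minimal sketch, assuming Linux (it reads VmRSS from
/proc/self/status, which doesn't exist on OS X; rss_kb is just a helper
name I made up):

import gc

def rss_kb():
    # Current resident set size in kB, parsed from /proc/self/status.
    # Linux-only: OS X has no /proc filesystem.
    with open('/proc/self/status') as f:
        for line in f:
            if line.startswith('VmRSS:'):
                return int(line.split()[1])

print 'baseline:            %d kB' % rss_kb()
b = 'x' * (10 ** 9)
print 'after allocation:    %d kB' % rss_kb()
del b
gc.collect()
print 'after del + collect: %d kB' % rss_kb()

One caveat on this test, though: a 1Gb string is a single huge
allocation that goes straight to the system allocator and comes back
on free, whereas the many small objects a module like xlrd creates
live in pymalloc arenas, which Python (since 2.5) can only return to
the OS once an arena is completely empty.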

So maybe there's something in xlrd that's hanging on to all
that memory?
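
If it is, the garbage collector itself can help track down the
culprits. Here's a rough sketch (count_instances is my own helper, and
the class names at the bottom are guesses at what xlrd creates rather
than anything I've checked):

import gc

def count_instances(typename):
    # Count live objects whose class name matches; a cache inside xlrd
    # would show up as a count that stays high after 'del b' and
    # gc.collect().  Uses __class__ so old-style instances count too.
    n = 0
    for obj in gc.get_objects():
        cls = getattr(obj, '__class__', None)
        if cls is not None and cls.__name__ == typename:
            n += 1
    return n

# Hypothetical class names -- substitute whatever xlrd actually creates:
for name in ('Book', 'Sheet', 'Cell'):
    print name, count_instances(name)

Comparing the counts before and after the 'del b' should show whether
any of those objects are being kept alive.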

Mark