On Thu, May 20, 2004 at 05:36:31PM +0900, Charlie Root wrote:
On Thu, May 20, 2004 at 01:42:00AM -0500, Dan Nelson wrote:
In the last episode (May 20), Till Plewe said:
My problem is essentially that freeing large numbers of small chunks
of memory can be very slow. I have run into this problem twice so far.
On Thu, May 20, 2004 at 09:28:12AM -0400, Chuck Swiger wrote:
Till Plewe wrote:
My problem is essentially that freeing large numbers of small chunks
of memory can be very slow. I have run into this problem twice so far.
[ ... ]
One solution would be to divide the memory into larger regions and to
tell malloc which chunk to use for the next few calls.
In the last episode (May 20), Till Plewe said:
My problem is essentially that freeing large numbers of small chunks
of memory can be very slow. I have run into this problem twice so
far.
Do you have a testcase? The attached program mallocs 1 million
128-byte blocks, then frees them.
My problem is essentially that freeing large numbers of small chunks
of memory can be very slow. I have run into this problem twice so far.
1) Shutting down python can take several minutes if I have used large
   dictionaries. The solution I use here is to exit python without
   freeing the allocated memory.
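One way to exit without freeing anything, as a sketch of the workaround
described above (the thread does not show the actual code): os._exit()
terminates the process immediately, skipping interpreter teardown and
object deallocation. The fast_exit helper name is illustrative.

```python
# Sketch: skip interpreter teardown (including deallocating millions of
# dict entries) by terminating the process directly. Note that
# os._exit() also bypasses atexit handlers and buffered-I/O flushing,
# so flush anything you still need first.
import os
import sys

def fast_exit(status=0):
    sys.stdout.flush()   # flush output we still want to see
    os._exit(status)     # exit now: no GC, no cleanup, no free()s
```

The cost is that any cleanup you do care about (flushing files, releasing
locks) must be done explicitly before the call.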