[issue26251] Use "Low-fragmentation Heap" memory allocator on Windows

2018-05-29 Thread STINNER Victor


STINNER Victor added the comment:

I haven't found the bandwidth to work on this issue in the past two years, so I'm 
abandoning the idea. Besides, the performance benefit is not obvious.

--
resolution:  -> out of date
stage:  -> resolved
status: open -> closed


[issue26251] Use "Low-fragmentation Heap" memory allocator on Windows

2017-07-03 Thread Steve Dower

Steve Dower added the comment:

I wouldn't be opposed to seeing it tried again, but I have no strong opinion. I 
don't think this is a major performance bottleneck right now.

--


[issue26251] Use "Low-fragmentation Heap" memory allocator on Windows

2017-07-03 Thread STINNER Victor

STINNER Victor added the comment:

Steve: "We tried it at one point, but it made very little difference (...)"

Ok. Can I close the issue?

--


[issue26251] Use "Low-fragmentation Heap" memory allocator on Windows

2017-06-28 Thread Steve Dower

Steve Dower added the comment:

We tried it at one point, but it made very little difference because we don't 
use the Windows heap for most allocations. IIRC, replacing Python's optimised 
allocator with the LFH was a slight performance regression, but I'm not sure 
the benchmarks were reliable enough back then to be trusted. I'm also not sure 
what optimisations have been performed in Windows 8/10.

Since the LFH is the default though, it really should just be a case of 
replacing Py_Malloc with a simple HeapAlloc shim and testing it. The APIs are 
nearly the same (the result of GetProcessHeap() will be stable for the lifetime 
of the process, and there's little value in creating specific heaps unless you 
intend to destroy them rather than free each allocation individually).
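
For illustration, a rough sketch of such a shim (not the patch that was 
actually tried; the function name _PyMem_UseWindowsHeap and the choice of the 
PYMEM_DOMAIN_OBJ domain are assumptions here) could plug into CPython's 
PyMem_SetAllocator() hook like this:

    /* Hypothetical sketch: route CPython's object allocator to the process
     * heap, which uses the Low-fragmentation Heap by default on Vista and
     * later.  Call _PyMem_UseWindowsHeap() very early, before anything has
     * been allocated from the object domain. */
    #include <Python.h>
    #include <windows.h>
    #include <stdint.h>

    static void *win_heap_malloc(void *ctx, size_t size)
    {
        /* HeapAlloc with size 0 returns a valid zero-length block, which
         * matches the malloc(0) behaviour CPython expects. */
        return HeapAlloc((HANDLE)ctx, 0, size);
    }

    static void *win_heap_calloc(void *ctx, size_t nelem, size_t elsize)
    {
        if (elsize != 0 && nelem > SIZE_MAX / elsize)
            return NULL;                       /* overflow check */
        return HeapAlloc((HANDLE)ctx, HEAP_ZERO_MEMORY, nelem * elsize);
    }

    static void *win_heap_realloc(void *ctx, void *ptr, size_t new_size)
    {
        if (ptr == NULL)                       /* realloc(NULL, n) == malloc(n) */
            return HeapAlloc((HANDLE)ctx, 0, new_size);
        return HeapReAlloc((HANDLE)ctx, 0, ptr, new_size);
    }

    static void win_heap_free(void *ctx, void *ptr)
    {
        if (ptr != NULL)
            HeapFree((HANDLE)ctx, 0, ptr);
    }

    void _PyMem_UseWindowsHeap(void)
    {
        /* GetProcessHeap() is stable for the lifetime of the process. */
        PyMemAllocatorEx alloc = {
            GetProcessHeap(),                  /* ctx */
            win_heap_malloc, win_heap_calloc,
            win_heap_realloc, win_heap_free
        };
        PyMem_SetAllocator(PYMEM_DOMAIN_OBJ, &alloc);
    }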

--


[issue26251] Use "Low-fragmentation Heap" memory allocator on Windows

2017-06-27 Thread STINNER Victor

STINNER Victor added the comment:

Is there anyone interested in experimenting with writing such a change and 
running benchmarks with it?

--


[issue26415] Fragmentation of the heap memory in the Python parser

2016-06-07 Thread A. Skrobov

A. Skrobov added the comment:

My current patch avoids the memory peak *and* doesn't add any memory 
fragmentation on top of whatever is already there.

In other words, it makes the parser better in this one aspect, and it doesn't 
make it worse in any aspect.

--


[issue26415] Fragmentation of the heap memory in the Python parser

2016-06-07 Thread STINNER Victor

STINNER Victor added the comment:

Benjamin Peterson: "It seems to me a simpler solution would be to allocate all 
nodes for a parse tree in an arena."

Exactly, that's the real fix. Just make sure that deallocating this arena does 
not punch holes in the "heap memory". For example, on Linux, it means using 
mmap() rather than sbrk() to allocate memory.
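
As a rough illustration of the mmap() point (a generic sketch, not the parser 
patch under discussion): an arena carved out of an anonymous mapping can be 
handed back to the OS in a single munmap() call, so freeing it leaves no holes 
in the brk/sbrk heap:

    /* Generic bump-pointer arena backed by an anonymous mmap() mapping.
     * Everything allocated from it is released at once by arena_free(). */
    #define _DEFAULT_SOURCE
    #include <stddef.h>
    #include <sys/mman.h>

    typedef struct {
        char  *base;    /* start of the mapping */
        size_t size;    /* total mapping size */
        size_t used;    /* bump pointer */
    } arena;

    static int arena_init(arena *a, size_t size)
    {
        void *p = mmap(NULL, size, PROT_READ | PROT_WRITE,
                       MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (p == MAP_FAILED)
            return -1;
        a->base = p;
        a->size = size;
        a->used = 0;
        return 0;
    }

    static void *arena_alloc(arena *a, size_t n)
    {
        size_t aligned = (n + 15) & ~(size_t)15;    /* keep 16-byte alignment */
        if (aligned < n || aligned > a->size - a->used)
            return NULL;                            /* overflow or exhausted */
        void *p = a->base + a->used;
        a->used += aligned;
        return p;
    }

    static void arena_free(arena *a)
    {
        /* The pages go straight back to the kernel; the C heap is untouched,
         * so no fragmentation is left behind. */
        munmap(a->base, a->size);
        a->base = NULL;
    }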


A. Skrobov: "An arena might help reclaim the memory once the parsing is 
complete, but it wouldn't reduce the peak memory consumption by the parser, and 
so it wouldn't prevent a MemoryError when parsing a 35MB source on a PC with 
2GB of RAM."

Parsing a 35 MB source doesn't seem like a good idea :-) I think that it's ok 
to have a memory peak, but it's not ok to not release the memory later.

Do you have a solution that avoids the memory peak *and* doesn't create memory 
fragmentation?

--


[issue26415] Fragmentation of the heap memory in the Python parser

2016-06-07 Thread A. Skrobov

A. Skrobov added the comment:

An arena might help reclaim the memory once the parsing is complete, but it 
wouldn't reduce the peak memory consumption by the parser, and so it wouldn't 
prevent a MemoryError when parsing a 35MB source on a PC with 2GB of RAM.

--


[issue26415] Fragmentation of the heap memory in the Python parser

2016-06-06 Thread Benjamin Peterson

Benjamin Peterson added the comment:

It seems to me a simpler solution would be to allocate all nodes for a parse 
tree in an arena.

--


[issue26415] Fragmentation of the heap memory in the Python parser

2016-06-06 Thread A. Skrobov

A. Skrobov added the comment:

Now that #26526 landed (thanks to everybody involved!), I'm requesting a review 
on an updated version of my patch, which addresses the excessive memory 
consumption by the parser.

The description of my original patch still applies:

> The attached patch for the parser reduces "Maximum resident set size 
> (kbytes)" threefold, for the degenerate example of 'import ast; 
> ast.parse("0,"*100, mode="eval")', by eliminating many CST nodes that 
> have a single child.
>
> According to the comment in Parser/node.c -- "89% of PyObject_REALLOC calls 
> in PyNode_AddChild passed 1 for the size" -- the memory saving should be 
> generally applicable, and not limited just to this degenerate case.

> I've now tried it with "perf.py -r -m", and the memory savings are as follows:
> ...
> on these benchmarks, the saving is not threefold, of course; but still quite 
> substantial (up to 30%).

My new patch updates Modules/parsermodule.c to accept such "compressed" nodes, 
so that everything still builds cleanly and passes the tests.

--
nosy: +benjamin.peterson, berker.peksag, brett.cannon, fdrake, giampaolo.rodola
Added file: http://bugs.python.org/file43261/patch


[issue26415] Fragmentation of the heap memory in the Python parser

2016-05-12 Thread A. Skrobov

A. Skrobov added the comment:

Ping? This patch is two months old now.

--


[issue26415] Fragmentation of the heap memory in the Python parser

2016-03-10 Thread A. Skrobov

A. Skrobov added the comment:

I've now tried it with "perf.py -r -m", and the memory savings are as follows:

### 2to3 ###
Mem max: 45976.000 -> 47440.000: 1.0318x larger

### chameleon_v2 ###
Mem max: 436968.000 -> 401088.000: 1.0895x smaller

### django_v3 ###
Mem max: 23808.000 -> 22584.000: 1.0542x smaller

### fastpickle ###
Mem max: 10768.000 -> 9248.000: 1.1644x smaller

### fastunpickle ###
Mem max: 10988.000 -> 9328.000: 1.1780x smaller

### json_dump_v2 ###
Mem max: 10892.000 -> 10612.000: 1.0264x smaller

### json_load ###
Mem max: 11012.000 -> 9908.000: 1.1114x smaller

### nbody ###
Mem max: 8696.000 -> 7944.000: 1.0947x smaller

### regex_v8 ###
Mem max: 12504.000 -> 9432.000: 1.3257x smaller

### tornado_http ###
Mem max: 27636.000 -> 27608.000: 1.0010x smaller


So, on these benchmarks, the saving is not threefold, of course; but still 
quite substantial (up to 30%).


The run time difference, on these benchmarks, is between "1.04x slower" and 
"1.06x faster", for reasons beyond my understanding (variability of background 
load, possibly?)

--


[issue26415] Fragmentation of the heap memory in the Python parser

2016-03-09 Thread Serhiy Storchaka

Changes by Serhiy Storchaka :


--
stage:  -> patch review


[issue26415] Fragmentation of the heap memory in the Python parser

2016-03-09 Thread A. Skrobov

A. Skrobov added the comment:

The attached patch for the parser reduces "Maximum resident set size (kbytes)" 
threefold, for the degenerate example of 'import ast; ast.parse("0,"*100, 
mode="eval")', by eliminating many CST nodes that have a single child.
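
One way to picture the idea (a generic sketch, not the attached patch itself; a 
real implementation would also have to preserve the information carried by the 
removed nodes) is a pass that collapses interior nodes having exactly one child 
into that child, so long unary chains cost a single node:

    /* Generic sketch of collapsing unary chains in a tree: any interior
     * node with exactly one child is dropped and replaced by that child. */
    #include <stdlib.h>

    typedef struct node {
        int           type;
        int           nchildren;
        struct node **children;
    } node;

    static node *compress(node *n)
    {
        for (int i = 0; i < n->nchildren; i++)
            n->children[i] = compress(n->children[i]);
        if (n->nchildren == 1) {
            node *only = n->children[0];
            free(n->children);
            free(n);
            return only;       /* the chain collapses to its leaf */
        }
        return n;
    }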

According to the comment in node.c -- "89% of PyObject_REALLOC calls in 
PyNode_AddChild passed 1 for the size" -- the memory saving should be generally 
applicable, and not limited just to this degenerate case.

Modules/parsermodule.c is not yet updated to match. Please tell me if you want 
me to do that, in case my proposed change to the parser is acceptable.

--
Added file: http://bugs.python.org/file42101/patch


[issue26415] Fragmentation of the heap memory in the Python parser

2016-03-08 Thread A. Skrobov

A. Skrobov added the comment:

@Serhiy: if your build is 32-bit, then every node is half the size, as it 
mostly consists of pointers.

The amount of heap fragmentation can also depend on gcc/glibc version.

--


[issue26415] Fragmentation of the heap memory in the Python parser

2016-03-08 Thread Serhiy Storchaka

Serhiy Storchaka added the comment:

On my computer peak memory usage in non-debug build is about 450 MB. Peak 
memory usage in debug build is about 730 MB. I'm not sure this is due to memory 
fragmentation.

--


[issue26415] Fragmentation of the heap memory in the Python parser

2016-03-08 Thread STINNER Victor

STINNER Victor added the comment:

Apache's libapr library is designed to group all memory allocations required 
to handle an HTTP request, so that *all* memory allocations of the request can 
be released at once.

It looks like the "pool" object (a rough usage sketch follows after the links):

* https://apr.apache.org/docs/apr/2.0/group__apr__pools.html
* https://apr.apache.org/docs/apr/2.0/group__apr__allocator.html
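
For readers unfamiliar with APR, the pattern looks roughly like this (a minimal 
sketch written against the documented apr_pools API; handle_request is just a 
made-up example function):

    /* Minimal APR pool usage: every allocation for one "request" comes
     * out of the pool, and apr_pool_destroy() releases them all at once. */
    #include <apr_general.h>
    #include <apr_pools.h>

    static int handle_request(void)
    {
        apr_pool_t *pool;

        if (apr_pool_create(&pool, NULL) != APR_SUCCESS)
            return -1;                         /* one pool per request */

        char *buf   = apr_palloc(pool, 4096);  /* no matching free needed */
        int  *table = apr_palloc(pool, 1000 * sizeof(int));
        (void)buf; (void)table;

        /* ... parse, build data structures, generate the response ... */

        apr_pool_destroy(pool);                /* frees every allocation above */
        return 0;
    }

    int main(void)
    {
        apr_initialize();                      /* library init/teardown */
        handle_request();
        apr_terminate();
        return 0;
    }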

--


[issue26415] Fragmentation of the heap memory in the Python parser

2016-03-08 Thread STINNER Victor

STINNER Victor added the comment:

My misc notes about memory fragmentation: 
https://haypo-notes.readthedocs.org/heap_fragmentation.html

--


[issue26415] Fragmentation of the heap memory in the Python parser

2016-03-08 Thread STINNER Victor

Changes by STINNER Victor <victor.stin...@gmail.com>:


--
title: Out of memory, trying to parse a 35MB dict -> Fragmentation of the heap 
memory in the Python parser


[issue26251] Use "Low-fragmentation Heap" memory allocator on Windows

2016-01-31 Thread STINNER Victor

STINNER Victor added the comment:

The issue #19246, "high fragmentation of the memory heap on Windows", was 
rejected but discussed the Windows Low-fragmentation Heap.

--


[issue26251] Use "Low-fragmentation Heap" memory allocator on Windows

2016-01-31 Thread STINNER Victor

STINNER Victor added the comment:

"Low-fragmentation Heap":
https://msdn.microsoft.com/en-us/library/windows/desktop/aa366750%28v=vs.85%29.aspx
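
For reference, the documented way to opt a heap into the LFH (where it is not 
already the default) is HeapSetInformation() with the 
HeapCompatibilityInformation class; a minimal sketch:

    /* Sketch: explicitly enable the Low-fragmentation Heap on the process
     * heap.  On Vista and later the LFH is already the default, so this
     * mostly matters for older systems or custom heaps. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        ULONG lfh = 2;   /* 2 == enable the Low-fragmentation Heap */
        HANDLE heap = GetProcessHeap();

        if (HeapSetInformation(heap, HeapCompatibilityInformation,
                               &lfh, sizeof(lfh))) {
            printf("LFH enabled on the process heap\n");
        } else {
            printf("HeapSetInformation failed: %lu\n", GetLastError());
        }
        return 0;
    }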

--


[issue26251] Use "Low-fragmentation Heap" memory allocator on Windows

2016-01-31 Thread STINNER Victor

New submission from STINNER Victor:

Python has a memory allocator optimized for allocations <= 512 bytes: 
PyObject_Malloc(). Replacing it with the native "Low-fragmentation Heap" memory 
allocator on Windows has been discussed.

I'm not aware of anyone who has tried that. It would be nice to try, especially 
to run benchmarks.

See also the issue #26249: "Change PyMem_Malloc to use PyObject_Malloc 
allocator?".

--
components: Windows
messages: 259293
nosy: haypo, paul.moore, steve.dower, tim.golden, zach.ware
priority: normal
severity: normal
status: open
title: Use "Low-fragmentation Heap" memory allocator on Windows
type: performance
versions: Python 3.6


Re: Heap Memory

2006-11-17 Thread Gregory Piñero
On 11/17/06, Dennis Lee Bieber [EMAIL PROTECTED] wrote:
 The default configuration for WinXP is 2GB shared OS, and 2GB
 process... I believe there is some registry setting that can change that
 to 1GB/3GB.

I did some research and it looks like it does apply to XP
(http://support.microsoft.com/kb/833721#)

However it does say:
A program must be designed to take advantage of the additional memory
address space.

I don't know what that means.  However this blog comment
(http://blogs.msdn.com/oldnewthing/archive/2004/08/05/208908.aspx#209332)
suggests:
Just flipping /3GB isn't enough for most programs. It has the effect
on the kernel, true, but unless your process's executable has the
Large Address Space Aware flag set, Windows won't actually give you
the full 3GB space. Link your executable with /LARGEADDRESSAWARE or
use EDITBIN. ...

Is this something that has to be done when building Python, was it
already done by default,  or can my Python script affect this somehow?
 I'd like to use the /3GB switch if possible.

Finally a side question:
How does Python use memory on a 64 bit OS?  Is there a lot more
available to it by default?

-Greg

Re: Heap Memory

2006-11-17 Thread Fredrik Lundh
Gregory Piñero wrote:

 How does Python use memory on a 64 bit OS?  Is there a lot more
 available to it by default?

as we've already said a couple of hundred times in this thread, Python 
uses *all* the memory it can get from the operating system.  no more, no 
less.

(the link I posted yesterday was about a *BIG* 64-bit machine, with lots 
of gigabytes of RAM, and a lot more swap space.  that doesn't help much 
when someone's set the per-process memory limit to 128 megabytes ;-)

/F


Heap Memory

2006-11-16 Thread Bugra Cakir

Hi my name is Bugra Cakir,

I have a question. How can we increase heap memory, or the total memory the 
Python interpreter will use, in order to avoid memory problems?

Re: Heap Memory

2006-11-16 Thread Gregory Piñero
On 11/16/06, Bugra Cakir [EMAIL PROTECTED] wrote:
 Hi my name is Bugra Cakir,

 I have a question. How can we increase heap memory or total memory Python
 interpreter
 will use in order to avoid memory problems ?

I've wondered the same thing myself.  Even if it turns out it's just
not possible I hope you get an answer.

My completely arbitrary guess is you'd have to recompile Python, but I
hope I'm wrong.

-Greg

Re: Heap Memory

2006-11-16 Thread Fredrik Lundh
Gregory Piñero wrote:

 I have a question. How can we increase heap memory or total memory Python
 interpreter
 will use in order to avoid memory problems ?
 
 I've wondered the same thing myself.  Even if it turns out it's just
 not possible I hope you get an answer.
 
 My completely arbitrary guess is you'd have to recompile Python, but I
 hope I'm wrong.

what are you guys talking about?  there are no artificial memory 
limitations in Python; a Python process simply uses all the memory it 
can get from the operating system.

/F


Re: Heap Memory

2006-11-16 Thread Gregory Piñero
On 11/16/06, Fredrik Lundh [EMAIL PROTECTED] wrote:
 what are you guys talking about?  there are no artificial memory
 limitations in Python; a Python process simply uses all the memory it
 can get from the operating system.

I wish I could easily reproduce one of these errors I'm thinking of.
The last time it happened to me, I was pulling about 400,000 records 
(10 fields) out of a database table and putting them into a list of 
lists.  I then attempted to pickle the resulting huge list to disk and 
I just got a big MemoryError.  A similar thing happened when trying 
to write the list to disk using the csv module.

This computer it was running on has 2GB of RAM and 6GB of virtual
memory so I really doubt I had used up all of that memory.  I didn't
watch it the whole time so I can't be sure though.  Any ideas what
could have been going on there?

-Greg

Re: Heap Memory

2006-11-16 Thread Fredrik Lundh
Gregory Piñero wrote:

 This computer it was running on has 2GB of RAM and 6GB of virtual
 memory so I really doubt I had used up all of that memory.  I didn't
 watch it the whole time so I can't be sure though.  Any ideas what
 could have been going on there?

bogus configuration?

 http://online.effbot.org/2006_10_01_archive.htm#20061004

/F


Re: Heap Memory

2006-11-16 Thread Marc 'BlackJack' Rintsch
In [EMAIL PROTECTED], Gregory Piñero
wrote:

 This computer it was running on has 2GB of RAM and 6GB of virtual
 memory so I really doubt I had used up all of that memory.

On 32 bit systems the per process limit is usually 2 GiB, no matter how
much physical and virtual memory you have.

And with 2 GiB and 400,000 objects you have about 5 KiB per object.  If 
you pickle that huge list there has to be enough memory to hold the 
pickled data in memory before it is written to disk.

Ciao,
Marc 'BlackJack' Rintsch

Re: Heap Memory

2006-11-16 Thread Gregory Piñero
On 11/16/06, Thinker [EMAIL PROTECTED] wrote:
 What is your OS? Maybe you should show the memory usage of your python 
 process. In FreeBSD, you can set the environment variable 'MALLOC_OPTIONS=P' 
 to print out the usage of malloc(). Maybe you can find a way, in your system, 
 to print out the usage of the heap.

It is Windows XP.  Is there a similar method for it?  I am curious.

-Greg

Re: Heap Memory

2006-11-16 Thread Thinker
Gregory Piñero wrote:
 This computer it was running on has 2GB of RAM and 6GB of virtual
 memory so I really doubt I had used up all of that memory.  I didn't
 watch it the whole time so I can't be sure though.  Any ideas what
 could have been going on there?

 -Greg
What is your OS? Maybe you should show the memory usage of your python 
process. In FreeBSD, you can set the environment variable 'MALLOC_OPTIONS=P' 
to print out the usage of malloc(). Maybe you can find a way, in your system, 
to print out the usage of the heap.

-- 
Thinker Li - [EMAIL PROTECTED] [EMAIL PROTECTED]
http://heaven.branda.to/~thinker/GinGin_CGI.py
