[issue26773] Shelve works inconsistently when carried over to child processes

2016-04-15 Thread Paul Ellenbogen
New submission from Paul Ellenbogen: If a shelve is opened and the process then forked, sometimes the shelve will appear to work in the child, and other times it will throw a KeyError. I suspect the order of element access may trigger the issue. I have included a Python script that will exhibit this behavior.
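The attached script is not in the archive, so the following is a minimal sketch of the safe pattern under assumed names (the file name "example_shelf" and key layout are hypothetical): have the child re-open the shelf rather than inherit the parent's open handle, since sharing one handle across fork is what seems to trigger the sporadic KeyError. Uses os.fork, so it is Unix-only.

```python
# Hedged sketch, not the reporter's script: child re-opens the shelf
# instead of reusing the handle inherited across fork.
import os
import shelve

PATH = "example_shelf"  # hypothetical filename

sh = shelve.open(PATH, "c")
for i in range(100):
    sh[str(i)] = i
sh.close()

pid = os.fork()
if pid == 0:
    # Child: open a fresh handle, not the inherited one.
    child_sh = shelve.open(PATH)
    ok = child_sh["42"] == 42
    child_sh.close()
    os._exit(0 if ok else 1)
else:
    _, status = os.waitpid(pid, 0)
```

A fresh handle per process avoids any shared dbm file offsets or caches between parent and child.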

[issue26773] Shelve works inconsistently when carried over to child processes

2016-04-17 Thread Paul Ellenbogen
Paul Ellenbogen added the comment: I think this behavior is due to the underlying dbm. The same code using dbm directly, rather than shelve, also throws KeyErrors:

    from multiprocessing import Process
    import dbm

    db = dbm.open("example.dbm", "c")
    for i in range(100):
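The loop body is cut off in the archive; the following completes the snippet under the assumption that the loop populates the database (the written values are a guess made only to have a runnable fragment). It also shows that dbm returns bytes, which is the layer shelve pickles on top of.

```python
# Hedged completion of the truncated snippet above; the loop body is
# an assumption, not the reporter's original code.
import dbm

db = dbm.open("example.dbm", "c")
for i in range(100):
    db[str(i)] = str(i)  # assumed: populate keys before forking
value = db["7"]          # dbm stores and returns bytes
db.close()
```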

[issue36694] Excessive memory use or memory fragmentation when unpickling many small objects

2019-04-21 Thread Paul Ellenbogen
Change by Paul Ellenbogen: Removed file: https://bugs.python.org/file48278/dump.py
Python tracker <https://bugs.python.org/issue36694>

[issue36694] Excessive memory use or memory fragmentation when unpickling many small objects

2019-04-21 Thread Paul Ellenbogen
Change by Paul Ellenbogen: Removed file: https://bugs.python.org/file48281/dump.py

[issue36694] Excessive memory use or memory fragmentation when unpickling many small objects

2019-04-21 Thread Paul Ellenbogen
Change by Paul Ellenbogen: Added file: https://bugs.python.org/file48279/load.py

[issue36694] Excessive memory use or memory fragmentation when unpickling many small objects

2019-04-21 Thread Paul Ellenbogen
New submission from Paul Ellenbogen: Python encounters significant memory fragmentation when unpickling many small objects. I have attached two scripts that I believe demonstrate the issue. When you run "dump.py" it will generate a large list of namedtuples, then write that list to disk.
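The attached dump.py/load.py pair is not reproduced in the archive, so this is only a sketch of their likely shape under assumed names (Point, the field names, the element count, and "points.pkl" are all hypothetical): pickle a large list of small namedtuples to disk, then load it back in a separate step.

```python
# Hedged sketch of a dump/load pair like the one described above;
# all names and sizes are assumptions.
import collections
import pickle

Point = collections.namedtuple("Point", ["x", "y"])

def dump(path, n):
    # Build many small namedtuples and pickle them as one list.
    data = [Point(i, i + 1) for i in range(n)]
    with open(path, "wb") as f:
        pickle.dump(data, f)

def load(path):
    # Unpickling recreates every small object individually, which is
    # where the reported fragmentation would show up at large n.
    with open(path, "rb") as f:
        return pickle.load(f)

dump("points.pkl", 1000)
points = load("points.pkl")
```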

[issue36694] Excessive memory use or memory fragmentation when unpickling many small objects

2019-04-21 Thread Paul Ellenbogen
Change by Paul Ellenbogen: Added file: https://bugs.python.org/file48280/common.py

[issue36694] Excessive memory use or memory fragmentation when unpickling many small objects

2019-04-21 Thread Paul Ellenbogen
Change by Paul Ellenbogen: Added file: https://bugs.python.org/file48282/dump.py

[issue36694] Excessive memory use or memory fragmentation when unpickling many small objects

2019-04-21 Thread Paul Ellenbogen
Paul Ellenbogen added the comment: Good point. I have created a new version of dump that uses random() instead. Float reuse explains the getsizeof difference, but there is still a significant difference in memory usage. This makes sense to me because the original code in which I saw this issue