> From: tim.pet...@gmail.com
> Date: Sun, 10 Jan 2016 10:54:10 -0600
> Subject: Re: [Tutor] Question about the memory manager
> To: sjeik_ap...@hotmail.com
> CC: tutor@python.org
>
> [Albert-Jan Roskam <sjeik_ap...@hotmail.com>]
> > I just found a neat trick to free up an emergency stash of memory in
> > a function that overrides sys.excepthook. The rationale is that all
> > exceptions, including MemoryErrors, will be logged.
> > The code is below. My question: is that memory *guaranteed* to be
> > freed right after the 'del' statement? Or should one call gc.collect to
> > be really sure?
> >
> > rainydayfund = [[] for x in xrange(16*1024)]  # or however much you need
> > def handle_exception(e):
> >     global rainydayfund
> >     del rainydayfund
> >     ... etc, etc ...
> > http://stackoverflow.com/questions/1235349/python-how-can-i-handle-any-unhandled-exception-in-an-alternative-way
>
> This works fine in all versions of CPython (the C implementation of
> Python distributed by python.org) to date. That's because:
>
> 1. All versions of CPython rely primarily on reference counting (`gc`
> is only needed to reclaim garbage containing reference cycles). An
> object is released immediately when its reference count falls to 0.
>
> 2. There is only one reference to the big list there (via the global
> `rainydayfund`), so the memory becomes garbage immediately upon
> executing the `del`.
>
> 3. Similarly, that giant list holds the only references to the masses
> of distinct empty lists it contains, so they also become garbage
> immediately upon the giant list becoming garbage.
>
> 4. CPython doesn't happen to stick garbage lists in, e.g., some
> internal free list reusable only for new list objects - it actually
> releases the memory for garbage lists. Kinda ;-)
>
> #2 and #3 are necessarily true. #1 is true in CPython, but not in all
> implementations of Python.
>
> #4 is where things _might_ change even in CPython, but it's very
> unlikely to change. As is, it would take a small book to flesh out
> what "Kinda ;-)" means, exactly. Memory management is complex, with
> many layers, involving many details.
>
> If you can live with all that, I'd suggest a more straightforward way
> of setting it up, like:
>
>     rainydayfund = b"x" * N
>
> where `N` is the number of bytes you want to reserve. That is, create
> a giant bytestring containing the number of "emergency bytes" you
> need. If N is large enough, that will avoid CPython's "small object
> allocator" and CPython's "arena allocator", getting the memory
> directly from (and returning the memory directly to) the OS. The
> fewer layers that get involved, the fewer layers that _may_ surprise
> you by changing behavior in the future.
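For reference, here is one way the pieces above fit together: a minimal
sketch that combines the sys.excepthook override from the Stack Overflow
link with the b"x" * N reservation suggested at the end of the quoted
message. The handler name, the 16 MiB size, and the use of
traceback.print_exception are illustrative choices only, not anything
prescribed in the thread.

    import sys
    import traceback

    # Reserve roughly 16 MiB up front as one large bytes object. Per the
    # explanation above, a block this large avoids CPython's small-object
    # and arena allocators, so releasing it hands memory back to the OS.
    RESERVE_BYTES = 16 * 1024 * 1024   # illustrative size; tune as needed
    rainydayfund = b"x" * RESERVE_BYTES

    def handle_exception(exc_type, exc_value, exc_traceback):
        # Replacement for sys.excepthook: free the reserve first so that
        # even a MemoryError can still be formatted and reported.
        global rainydayfund
        del rainydayfund   # last reference gone -> freed immediately
        traceback.print_exception(exc_type, exc_value, exc_traceback)

    sys.excepthook = handle_exception

Note that nothing here calls gc.collect(): dropping the only reference is
enough in CPython, for the reasons given in points 1-3. The handler is
one-shot; after the first call the global name no longer exists.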
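As for points 1-3 themselves, the following toy experiment (the Reserve
class is invented purely for illustration, it is not from the thread)
shows that CPython reclaims an object the instant its last reference
disappears, even with the cycle collector switched off:

    import gc

    class Reserve(object):
        # Toy stand-in for the rainy-day fund: announce when it is freed.
        def __del__(self):
            print("reserve released")

    gc.disable()        # rule out the cycle collector entirely
    fund = Reserve()
    print("about to drop the only reference")
    del fund            # CPython runs __del__ here, before the next line
    print("already released by the time we get here")

On CPython the "reserve released" line appears between the other two
prints. On implementations without reference counting (PyPy, for
example) it may appear later, which is exactly the caveat about #1 not
holding everywhere.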
Hi Tim,

Thank you! Can you recommend a document or website that describes
CPython's memory manager? It might be interesting, and perhaps useful,
to know a bit more about the details. Perhaps this knowledge might
sometimes help with writing faster code?

Best wishes,
Albert-Jan

PS: albertjan@debian:~$ python -c "import this"
The Zen of Python, by Tim Peters
...
...
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you're Dutch.
--> Nope, not even then. Perhaps if my name were Guido? :-)

_______________________________________________
Tutor maillist  -  Tutor@python.org
To unsubscribe or change subscription options:
https://mail.python.org/mailman/listinfo/tutor