#12313: Fix yet another memory leak caused by caching of coercion data
--------------------------------------------------+-------------------------
Reporter: SimonKing | Owner:
Type: defect | Status: needs_review
Priority: major | Milestone: sage-5.3
Component: memleak | Resolution:
Keywords: coercion weak dictionary | Work issues:
Report Upstream: N/A | Reviewers: Simon King,
Jean-Pierre Flori, John Perry
Authors: Simon King, Jean-Pierre Flori | Merged in:
Dependencies: #11521, #11599, #12969, #12215 | Stopgaps:
--------------------------------------------------+-------------------------
Comment (by nbruin):
Replying to [comment:177 SimonKing]:
> {{{
> sage: 8./399
> 0.0200501253132832
> sage: 3.5/79
> 0.0443037974683544
> }}}
Not very significant in my opinion, but because this is fundamental
infrastructure it would be worthwhile if someone investigated at some
point. I did some initial tests equipping `UniqueRepresentation` and
`UniqueFactory` with a permanent store for anything they produced and the
effect wasn't very big for `sha_tate`. I might be overlooking some other
construction that's contributing. Finding a smoking gun for this
particular regression would be reassuring, but it might not exist.
Perhaps it's best to just profile some of these examples properly and see
where the cost lies (and perhaps compare profiles?). If it's just the
garbage collector doing its thing, we cannot do much about it. If it's
recreating collected parents all the time, that piece of code should be
redesigned to keep the parents alive.
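One cheap way to test the "it's just the garbage collector" hypothesis is to time the same workload with the cyclic collector disabled. A rough sketch (the `workload` function is a made-up stand-in for a doctest-sized computation, not anything from `sha_tate.py`):

```python
import gc
import timeit

def workload():
    # Stand-in for a computation that allocates many short-lived objects.
    return sum(len(str(n)) for n in range(10000))

# Time with the cyclic garbage collector running normally...
with_gc = timeit.timeit(workload, number=50)

# ...and with it disabled; restore it afterwards no matter what.
gc.disable()
try:
    without_gc = timeit.timeit(workload, number=50)
finally:
    gc.enable()
```

If `without_gc` is substantially smaller, collection overhead is the culprit; if the times are close, the regression lies elsewhere.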
However, if we're finding that actions take much longer to perform now
because they have to look up weakrefs rather than follow normal refs (or
something similar), perhaps actions need a different way of operating (or
at least an alternative, unsafe path that can be used in time-critical
parts).
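For a sense of the per-access overhead being discussed, here is a micro-benchmark comparing a direct reference with dereferencing a `weakref` (the `Parent` class is just a stand-in; absolute numbers vary by machine and are only meaningful relative to each other):

```python
import timeit
import weakref

class Parent:  # stand-in for a Sage parent
    pass

p = Parent()
direct = p            # ordinary strong reference
wr = weakref.ref(p)   # weak reference; must be called to dereference

# Each weakref access is a call that checks whether the referent is alive,
# so a weakref-based lookup path does strictly more work than a direct one.
t_direct = timeit.timeit(lambda: direct, number=200000)
t_weak = timeit.timeit(wr, number=200000)
```

In a hot path executed millions of times, even a small constant like this can add up, which is the motivation for an "unsafe" strong-reference fast path.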
If we're finding that the system is now spending significantly more time
deleting things because so many more deletions involve swaths of weakref
callbacks, we might want to think of more efficient mechanisms there, or
just accept that this is part of the reasonable cost for memory
management.
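The callback cost being described can be seen directly: every weakref with a callback adds work to the moment its referent dies. A small sketch, assuming CPython's immediate refcount-based deallocation (on other implementations the callbacks fire at collection time instead):

```python
import weakref

class Parent:  # stand-in for a cached object with many weak referrers
    pass

fired = []
p = Parent()

# 1000 weakrefs, each with a callback that runs when the referent dies.
# Deleting one object now triggers 1000 callback invocations.
refs = [weakref.ref(p, fired.append) for _ in range(1000)]

del p  # on CPython this deallocates immediately and fires every callback
assert len(fired) == 1000
```

So a deletion that used to be a simple deallocation can now involve a swath of callback invocations, which is the overhead to either optimize or accept.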
I doubt Python-level profiling will cut it here. Anybody with good
lower-level profiling skills? Can we extract the commands run through Sage
for doctesting `sha_tate.py` and compare profiles between 5.3b2 and 5.3b2
+ #715 + #11521 + #12313? Both Python-level and C-level profiles would be
useful.
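For the Python-level half of that comparison, the standard-library `cProfile`/`pstats` pair would do; a sketch of the harness (the `run_doctests` function is a placeholder, not the actual extracted `sha_tate.py` commands):

```python
import cProfile
import io
import pstats

def run_doctests():
    # Placeholder for the commands exercised by doctesting sha_tate.py.
    return [i ** 2 for i in range(1000)]

# Profile the workload and capture the report into a string buffer,
# so two such reports (one per Sage build) can be diffed side by side.
pr = cProfile.Profile()
pr.enable()
run_doctests()
pr.disable()

buf = io.StringIO()
pstats.Stats(pr, stream=buf).sort_stats("cumulative").print_stats(5)
report = buf.getvalue()
```

The C-level side would need an external tool (e.g. a sampling profiler on the Sage process); that part is outside what the standard library offers.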
Again, none of this will hold back a positive review from me; these are
issues for follow-up tickets.
--
Ticket URL: <http://trac.sagemath.org/sage_trac/ticket/12313#comment:181>
Sage <http://www.sagemath.org>
Sage: Creating a Viable Open Source Alternative to Magma, Maple, Mathematica,
and MATLAB