#14711: Memleak when creating QuadraticField
-------------------------------------------------+-------------------------
       Reporter:  jpflori                        |  Owner:  davidloeffler
           Type:  defect                         |  Status:  new
       Priority:  critical                       |  Milestone:  sage-5.13
      Component:  number fields                  |  Resolution:
       Keywords:  memleak, number field,         |  Merged in:
                  QuadraticField                 |
        Authors:                                 |  Reviewers:
Report Upstream:  N/A                            |  Work issues:
         Branch:  u/SimonKing/ticket/14711       |  Commit:
   Dependencies:                                 |  Stopgaps:
-------------------------------------------------+-------------------------
Comment (by SimonKing):
I have pushed my branch. I wonder why this did not show up as a post on
trac. Anyway, I checked that when clicking on "Commits", everything is
there.
I hesitated to push it before, because it is not yet ready for review.
However, it is better that we all agree on what we are talking about.
Replying to [comment:61 nbruin]:
> Too bad. The idea as suggested doesn't actually solve the memory leak;
> it just makes it less severe (by a constant factor).
I think this is not the case with the branch that I have just uploaded. I
did
{{{
sage: for D in xrange(2,2**30):
....: print get_memory_usage()
....: Q = QuadraticField(-D)
}}}
At first, the memory consumption very slowly rose from 213.60546875 to
214.10546875 and then remained steady for several minutes, until I
interrupted. I rather think that even this small increase was just due to
the increasing size of D.
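To double-check that the fields really become collectable (and that the
steady figure is not just an artifact of `get_memory_usage()`), one could
watch a single field through a weak reference. This is only a sketch of the
check I have in mind, not part of the branch:
{{{
sage: import gc, weakref
sage: Q = QuadraticField(-5)
sage: r = weakref.ref(Q)    # does not keep the field alive
sage: del Q
sage: _ = gc.collect()
sage: r() is None           # expected to be True once the leak is fixed
}}}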
> The problem is: The weakened maps don't prevent their domain from being
> GCed, but after that happens they linger (now defunct) in `_coerce_from`.
> You'll see that even with your patch in, the example in the ticket
> description will still eat memory--just a little less quickly. You'll find
> that `CDF._coerce_from_hash` will contain a LOT of entries.
This is clearly not the case with the current commit. After running
thousands of iterations of the above "for" loop, I get
{{{
sage: CDF._introspect_coerce()
{'_action_hash': <sage.structure.coerce_dict.TripleDict at 0xa26b25c>,
'_action_list': [],
'_coerce_from_hash': <sage.structure.coerce_dict.MonoDict at 0xa26b3e4>,
'_coerce_from_list': [],
'_convert_from_hash': <sage.structure.coerce_dict.MonoDict at 0xa26b41c>,
'_convert_from_list': [],
'_element_init_pass_parent': False,
'_embedding': None,
'_initial_action_list': [],
'_initial_coerce_list': [],
'_initial_convert_list': []}
sage: gc.collect()
2213
sage: for k,v in CDF._introspect_coerce().iteritems():
....:     if v:
....:         print k, len(v)
....:
_coerce_from_hash 4
_convert_from_hash 3
}}}
> If we were to use a mix of `_coerce_from` and `_coerce_to` (somehow
> choosing which one to use) you wouldn't see this problem.
In any case, I don't see this problem.
> If we really want/need to, we could probably salvage the "weakened map"
> solution:
> - we could install a callback on the weakrefs.
What weakrefs are you talking about? Those to domain and codomain?
And what would the callback be supposed to do?
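If I understand the idea correctly, it would amount to something like the
following toy sketch (plain Python, not Sage's actual `MonoDict` or map
classes; all names here are made up purely for illustration):
{{{
import weakref

class ToyCoerceCache(object):
    """Toy illustration only: coercion maps are keyed by the id of a
    weakly referenced domain, and a callback on the weakref scrubs the
    entry as soon as the domain is garbage collected."""
    def __init__(self):
        self._data = {}

    def set(self, domain, coerce_map):
        key = id(domain)
        # The callback fires when `domain` dies and removes the stale entry.
        callback = lambda ref, key=key, data=self._data: data.pop(key, None)
        self._data[key] = (weakref.ref(domain, callback), coerce_map)

    def get(self, domain):
        entry = self._data.get(id(domain))
        return None if entry is None else entry[1]
}}}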
> - defunct maps are easy to recognize: they have a dead weakref in their
>   domain. We could just periodically scrub `_coerce_from` for defunct maps.
Well, with the current commit, if a map becomes defunct during some garbage
collection, then it is removed from `_coerce_from_hash` by ''the same''
garbage collection; hence it does not linger around.
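That is the same behaviour one gets from a plain
`weakref.WeakKeyDictionary`, which I use here only as a crude stand-in for
`MonoDict` to illustrate the point:
{{{
sage: import gc, weakref
sage: class Dummy(object): pass
sage: cache = weakref.WeakKeyDictionary()   # stand-in, not the real MonoDict
sage: k = Dummy()
sage: cache[k] = "a coercion map"
sage: del k
sage: _ = gc.collect()
sage: len(cache)                            # the entry vanished with its key
0
}}}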
`_coerce_from_list` is a different story, since it is a plain list. Here we
might need to take care of defunct maps ourselves. Perhaps the maps there
should not be weakened in the first place (`_coerce_from_list` is only
filled by `register_coercion()`, and it might make sense to keep these maps
strong), but I am not sure whether that would re-introduce the
`QuadraticField` leak I have just fixed.
> One possible strategy would be to keep a "reference size" for
> `_coerce_from` and every time we add an entry we check if it is now double
> the reference size. If it is, we trigger gc, scrub, and reset the
> reference size.
As stated above, I don't think there is a problem with `_coerce_from_hash`
accumulating garbage. But `_coerce_from_list` may contain garbage, though
only a limited amount, because entries are only added by explicit calls to
`register_coercion()`.
I think a better strategy would be to make `discover_coercion` check
whether it encounters a defunct map while performing its backtracking
search.
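Concretely, the check could look roughly like this (`ToyWeakenedMap` and
`scrub_defunct` are made-up names for illustration, not Sage classes):
{{{
import weakref

class ToyWeakenedMap(object):
    """Toy stand-in for a weakened coercion map: it only holds a weak
    reference to its domain (this is not Sage's actual Map class)."""
    def __init__(self, domain):
        self._domain_ref = weakref.ref(domain)

    def is_defunct(self):
        # The map is defunct once its domain has been garbage collected.
        return self._domain_ref() is None

def scrub_defunct(coerce_from_list):
    """Drop defunct maps, e.g. before or during the backtracking search."""
    return [phi for phi in coerce_from_list if not phi.is_defunct()]
}}}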
--
Ticket URL: <http://trac.sagemath.org/ticket/14711#comment:62>