#14711: Memleak when creating QuadraticField
-------------------------------------------------+-------------------------
       Reporter:  jpflori                        |        Owner:
           Type:  defect                         |  davidloeffler
       Priority:  critical                       |       Status:  new
      Component:  number fields                  |    Milestone:  sage-5.12
       Keywords:  memleak, number field,         |   Resolution:
  QuadraticField                                 |    Merged in:
        Authors:                                 |    Reviewers:
Report Upstream:  N/A                            |  Work issues:
         Branch:  u/SimonKing/ticket/14711       |       Commit:
   Dependencies:                                 |     Stopgaps:
-------------------------------------------------+-------------------------

Comment (by nbruin):

 Too bad. The idea as suggested doesn't actually solve the memory leak; it
 just makes it less severe (by a constant factor). The problem is that the
 weakened maps don't prevent their domain from being GCed, but after that
 happens they linger (now defunct) in `_coerce_from`. You'll see that even
 with your patch in, the example in the ticket description will still eat
 memory--just a little less quickly. You'll find that
 `CDF._coerce_from_hash` will contain a LOT of entries.
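
 To make the failure mode concrete, here is a hypothetical way to watch it
 in a Sage session (an illustration only, not the exact example from the
 ticket description; the loop bound and the use of `coerce_map_from` are my
 choices):

 {{{#!python
 import gc
 for p in prime_range(2, 2000):
     K = QuadraticField(p)
     # Asking for the coercion caches the answer, keyed by K,
     # in CDF._coerce_from_hash.
     CDF.coerce_map_from(K)
 gc.collect()                 # the fields K themselves can be collected...
 len(CDF._coerce_from_hash)   # ...but defunct entries linger in the codomain
 }}}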

 With the `_coerce_to` alternative also in place, this problem would not
 occur: maps stored on the domain side die together with their domain.

 If we really want/need to, we could probably salvage the "weakened map"
 solution:
  - we could install a callback on the weakrefs. Note that the callback
 would have to be delivered to the codomain, so we wouldn't really have
 "weakened maps" as objects on their own: the callback on the weakref would
 make them specific to the data structure in which they reside. There is
 also the usual problem that weakref callbacks are very delicate, and we'd
 have to be very careful what we do there (see the first sketch after this
 list).
  - defunct maps are easy to recognize: they hold a dead weakref to their
 domain. We could just periodically scrub `_coerce_from` for defunct maps.
 One possible strategy would be to keep a "reference size" for
 `_coerce_from`, and every time we add an entry, check whether the cache
 has grown to double the reference size. If it has, we trigger gc, scrub,
 and reset the reference size. This would at least keep the list bounded in
 size (relative to the number of live entries) and hopefully limit the
 number of expensive scrubs we have to do: growing the bound exponentially
 should lead to small amortized costs (see the second sketch below).
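
 For the first option, a minimal sketch of what the codomain-side callback
 could look like (the names `CoercionCache` and `register` are made up for
 illustration; the real data structure is `CDF._coerce_from_hash`):

 {{{#!python
 import weakref

 class CoercionCache(object):
     # Sketch of a codomain-side cache whose entries remove themselves
     # when their domain is garbage collected.
     def __init__(self):
         self._coerce_from = {}   # weakref-to-domain -> weakened map

     def register(self, domain, weakened_map):
         # The callback closes over self, so the weakref is tied to this
         # particular cache -- the "weakened map" is no longer an object
         # on its own.  Keep the callback trivial: it runs during GC.
         def _expire(wr):
             self._coerce_from.pop(wr, None)
         wr = weakref.ref(domain, _expire)
         self._coerce_from[wr] = weakened_map
 }}}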
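
 And for the second option, a sketch of the scrub-on-doubling bookkeeping
 (again with hypothetical names; `gc.collect()` stands in for "trigger gc"):

 {{{#!python
 import gc

 class ScrubbedCache(object):
     # Sketch: scrub defunct maps whenever the cache has doubled in size
     # since the last scrub.  Doubling the threshold each time keeps the
     # amortized cost of scrubbing small.
     def __init__(self):
         self._coerce_from = {}     # weakref-to-domain -> weakened map
         self._reference_size = 8   # arbitrary starting threshold

     def add(self, domain_ref, weakened_map):
         self._coerce_from[domain_ref] = weakened_map
         if len(self._coerce_from) >= 2 * self._reference_size:
             gc.collect()           # let unreachable domains actually die
             # A defunct map is recognized by its dead domain weakref.
             for wr in list(self._coerce_from):
                 if wr() is None:
                     del self._coerce_from[wr]
             self._reference_size = max(8, len(self._coerce_from))
 }}}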

--
Ticket URL: <http://trac.sagemath.org/ticket/14711#comment:61>
Sage <http://www.sagemath.org>
Sage: Creating a Viable Open Source Alternative to Magma, Maple, Mathematica, 
and MATLAB
