#10963: More functorial constructions
-------------------------------------+-------------------------------------
       Reporter:  nthiery            |        Owner:  stumpc5
           Type:  enhancement        |       Status:  needs_work
       Priority:  major              |    Milestone:  sage-5.13
      Component:  categories         |   Resolution:
       Keywords:                     |    Merged in:
        Authors:  Nicolas M. Thiéry  |    Reviewers:  Simon King, Frédéric
Report Upstream:  N/A                |  Chapoton
         Branch:                     |  Work issues:
   Dependencies:  #11224, #8327,     |       Commit:
  #10193, #12895, #14516, #14722,    |     Stopgaps:
  #13589, #14471, #15069, #15094,    |
  #11688, #13394                     |
-------------------------------------+-------------------------------------

Comment (by nbruin):

 Replying to [comment:159 jdemeyer]:
 > This happens sometimes:
 > {{{
 >     Exception RuntimeError: 'maximum recursion depth exceeded while
 > calling a Python object' in <sage.structure.coerce_dict.TripleDictEraser
 > object at 0x17814b0> ignored
 >     Exception RuntimeError: 'maximum recursion depth exceeded while
 > calling a Python object' in <cyfunction
 > WeakValueDictionary.__init__.<locals>.callback at 0x2421410> ignored
 > }}}
 Looks similar to #15069, so the most likely scenario is that a very
 complicated data structure gets garbage collected and the decref of one
 object initiates a chain of subsequent decrefs that is more than 1000
 deep.

 It seems there are unresolved issues in Python with this stuff. See
 [http://bugs.python.org/issue483469] for an even worse (segmentation-fault-
 inducing!) problem with `__del__`. It looks like the Python "maximum
 recursion depth" limit is avoided there by a trick similar to the one in
 #15069, leading to a C-stack overflow instead (hence the harder crash).
 Indeed:
 {{{
 sage: class A: pass                 # old-style class
 sage: a = A(); prev = a
 sage: from sage.structure.coerce_dict import MonoDict
 sage: M = MonoDict(11)
 sage: # keys are weak, values strong: each entry holds the only strong
 sage: # reference to the next instance in the chain
 sage: for i in range(10^5): newA = A(); M[prev] = newA; prev = newA
 sage: del a    # the eraser callbacks cascade down the whole chain
 Segmentation fault
 }}}
 (the value `10^5` may need adjustment, depending on your C-stack size),
 showing that with the fix from #15069 we only postpone the problem by a
 few orders of magnitude, and get a worse failure mode instead.
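
 One rough way (an aside, not from the ticket) to see how much C stack is
 available, and hence how long the chain needs to be before it crashes, is
 Python's `resource` module:
 {{{
 sage: import resource
 sage: # (soft, hard) limit on the C-stack size, in bytes; the cascade
 sage: # overflows once it needs more stack than the soft limit allows
 sage: resource.getrlimit(resource.RLIMIT_STACK)
 }}}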

 I suspect we are hitting the same problem here (note that for a
 `WeakValueDictionary` we have to chain in the other direction):
 {{{
 sage: from sage.misc.weak_dict import WeakValueDictionary  # Sage's, cf. #13394
 sage: a = A(); prev = a
 sage: M = WeakValueDictionary()
 sage: # keys are strong, values weak: when an instance dies, the callback
 sage: # deletes the entry whose value it was, dropping the only strong
 sage: # reference to that entry's key
 sage: for i in range(10^3+10): newA = A(); M[newA] = prev; prev = newA
 sage: del a
 Exception RuntimeError: 'maximum recursion depth exceeded while calling a
 Python object' in <cyfunction
 WeakValueDictionary.__init__.<locals>.callback at 0x6a527d0> ignored
 }}}
 This problem goes away if we instead define
 {{{
 sage: class A(object): pass
 }}}
 Probably, old-style instances do not participate in the "trashcan"
 mechanism while new-style instances do (see #13901 and
 [http://trac.cython.org/cython_trac/ticket/797 cython ticket #797]; we
 need this for cython classes too), which flattens the call stack during
 deallocation.
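
 To see that flattening in isolation (a sketch, not an experiment from the
 ticket): an analogous chain of plain new-style instances, with no
 dictionary or weakref callbacks involved, is torn down without any error:
 {{{
 sage: class Link(object): pass   # new-style: deallocation uses the trashcan
 sage: a = Link(); prev = a
 sage: for i in range(10^5): newL = Link(); prev.next = newL; prev = newL
 sage: del a    # the long decref cascade is flattened, no crash or warning
 }}}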

 The problem also does not occur with `weakref.WeakValueDictionary`,
 probably again because sufficiently many general Python structures are
 involved to let the trashcan kick in.
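
 For comparison (merely restating that observation as a sketch, not a new
 experiment from the ticket), the analogous chain through the standard
 library dictionary, presumably still with the old-style `A`, finishes
 cleanly:
 {{{
 sage: import weakref
 sage: class A: pass              # old-style, as in the failing example
 sage: a = A(); prev = a
 sage: W = weakref.WeakValueDictionary()
 sage: for i in range(10^3+10): newA = A(); W[newA] = prev; prev = newA
 sage: del a    # no "maximum recursion depth exceeded" message here
 }}}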

 Oddly enough, replacing `object` above by `SageObject` or `Parent` also
 seems to work, so the scenario we are running into is probably not
 exactly the one described here.

--
Ticket URL: <http://trac.sagemath.org/ticket/10963#comment:162>