#13991: Mitigate speed regressions in symmetric function related code due to 
#12313
---------------------------------+------------------------------------------
       Reporter:  nbruin         |         Owner:  sage-combinat
           Type:  enhancement    |        Status:  new          
       Priority:  major          |     Milestone:  sage-5.7     
      Component:  combinatorics  |    Resolution:               
       Keywords:                 |   Work issues:               
Report Upstream:  N/A            |     Reviewers:               
        Authors:                 |     Merged in:               
   Dependencies:  #13605         |      Stopgaps:               
---------------------------------+------------------------------------------
Description changed by nbruin:

Old description:

> As was found in #12313, there is some code, especially `k_dual.py`
> introduced in #13762, whose performance seems to rely heavily on parents
> being immortal:
> {{{
> sage: from sage.combinat.sf.k_dual import DualkSchurFunctions
> sage: Sym = SymmetricFunctions(QQ['t'].fraction_field())
> sage: dks4 = DualkSchurFunctions(Sym.kBoundedQuotient(4))
> sage: X = dks4[0] + 2*dks4[1] + 3*dks4[2]
> sage: X*X  # takes surprisingly long with #12313 applied
> sage: (X*X)*X  # takes surprisingly long with #12313 applied
> }}}
> Profiling gives some indication what is happening:
> {{{
> sage: import cProfile, pstats
> sage: cmd = "X*X"
> sage: s = pstats.Stats(cProfile.Profile().runctx(cmd, globals(), {}))
> sage: S = s.sort_stats('cumulative')
> sage: S.print_callers()  # get call-graph info
> }}}
> It seems that certain parents are created again and again and that the
> coercion discovery needs to be redone every time.
>
> Probably the most straightforward solution is to equip the classes with
> appropriate caches, so that they themselves ensure the lifetime of the
> parents they use rather than relying on Sage to keep them around.

New description:

 As was found in #12313, there is some code, especially `k_dual.py`
 introduced in #13762, whose performance seems to rely heavily on parents
 being immortal. Actually, as it turned out, the issue is not immortality:
 the code was relying on non-unique parents comparing as equal, whereas
 parents should be unique and hence comparable by identity:
 {{{
 sage: from sage.combinat.sf.k_dual import DualkSchurFunctions
 sage: Sym = SymmetricFunctions(QQ['t'].fraction_field())
 sage: dks4 = DualkSchurFunctions(Sym.kBoundedQuotient(4))
 sage: X = dks4[0] + 2*dks4[1] + 3*dks4[2]
 sage: X*X  # takes surprisingly long with #12313 applied
 sage: (X*X)*X  # takes surprisingly long with #12313 applied
 }}}
 Profiling gives some indication what is happening:
 {{{
 sage: import cProfile, pstats
 sage: cmd = "X*X"
 sage: s = pstats.Stats(cProfile.Profile().runctx(cmd, globals(), {}))
 sage: S = s.sort_stats('cumulative')
 sage: S.print_callers()  # get call-graph info
 }}}
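 For a quicker overview than the full caller graph, the same `Stats` object
 can also restrict the report, e.g. to the 20 most expensive entries by
 cumulative time (standard `pstats` functionality, not specific to this
 ticket):
 {{{
 sage: S.print_stats(20)  # top 20 entries, sorted by cumulative time
 }}}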
 From the profile it seems that certain parents are created again and again
 (indeed they are; the problem turns out to be that they are created anew
 rather than the existing parent being reused) and that the coercion
 discovery needs to be redone every time.
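
 To make the identity-versus-equality point concrete, here is a quick check
 (a hedged diagnostic, not taken from the ticket; its output depends on
 whether the parents involved are unique): if the k-bounded quotient were a
 unique parent, constructing it twice would return the identical object, and
 the coercion maps already discovered for it could be reused.
 {{{
 sage: A = Sym.kBoundedQuotient(4)
 sage: B = Sym.kBoundedQuotient(4)
 sage: A == B  # the two constructions compare as equal
 sage: A is B  # ...but only a unique parent is literally the same object
 }}}
 Sage's usual mechanism for making identity comparison work is
 `UniqueRepresentation` (or `CachedRepresentation`), which caches a parent
 by its construction arguments so that repeated constructions return the
 cached instance.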

 Probably the most straightforward solution is to equip the classes with
 appropriate caches, so that they themselves ensure the lifetime of the
 parents they use rather than relying on Sage to keep them around.
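
 As a rough illustration of that idea (a minimal sketch, not the actual
 `k_dual.py` code; the class and method names below are made up), a
 `cached_method` on the factory that builds an auxiliary parent constructs
 it only once and hands back the cached object on every later call:
 {{{
 sage: from sage.misc.cachefunc import cached_method
 sage: class DualkSchurLike(object):
 ....:     # toy stand-in for a basis class that repeatedly needs a helper parent
 ....:     def __init__(self, Sym):
 ....:         self._sym = Sym
 ....:     @cached_method
 ....:     def _helper_basis(self):
 ....:         # constructed on the first call only; later calls reuse the cache
 ....:         return self._sym.monomial()
 sage: D = DualkSchurLike(SymmetricFunctions(QQ['t'].fraction_field()))
 sage: D._helper_basis() is D._helper_basis()  # the same parent every time
 True
 }}}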

--

-- 
Ticket URL: <http://trac.sagemath.org/sage_trac/ticket/13991#comment:27>
Sage <http://www.sagemath.org>
