>>>> I do not know what exactly these are.
>>>> I read that AC unification can be NP-complete, which sounds like a
>>>> potential performance bottleneck; do we really need it?
>>>>
>>>
>>> Given expression (like sin(2*x) + cos(2*x)) we need to match against many
>>> possible patterns (like, every known trig identity).
>>>
>>
> I'm assuming that this is what was meant by "multi-pattern unification".
> It's essentially a search in a potentially infinite decision tree. I think
> that's already exponential, possibly even undecidable.


The large decision tree search problem comes after you determine which
patterns might possibly match.  That is also a fun problem.  To be
explicit: given an expression and a knowledgebase of patterns, first see
which patterns are valid, then choose which patterns to apply.  Both of
these operations are interesting.
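As a rough illustration of the baseline, here is a minimal sketch of the naive linear scan using SymPy's `Wild`/`match` (the `rules` list and `naive_rewrite` helper are made up for this example, not SymPy API):

```python
from sympy import Wild, symbols, sin, cos, tan

x = symbols('x')
a = Wild('a')

# A tiny stand-in knowledgebase: pattern -> rewrite callback.
rules = [
    (sin(a)/cos(a), lambda m: tan(m[a])),
    (sin(a)**2 + cos(a)**2, lambda m: 1),
]

def naive_rewrite(expr):
    # Naive linear scan: try every pattern until one matches.
    # This is the slow baseline that a smarter shared index would avoid.
    for pattern, rewrite in rules:
        m = expr.match(pattern)
        if m is not None:
            return rewrite(m)
    return expr
```

With this, `naive_rewrite(sin(2*x)/cos(2*x))` returns `tan(2*x)` by binding `a` to `2*x`; the point is just that every rule in the knowledgebase is tried in turn.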

>>> It's possible to store all of the patterns in a good data structure so
>>> that we can check them all simultaneously (see Trie for something
>>> similar).  We need to do this matching in a way that is aware of
>>> commutative operators (like +).  Naive solutions to this are very slow.
>>> There exist sophisticated solutions.
>>>
> Can't we deal with commutativity by normalizing the tree?
> E.g. for polynomials, it's already commonplace to put the higher exponents
> to the left; set up a total order over expressions that includes this and
> similar conventions.
> Similarly for associative operators; there, we turn A+(B+C) into
> plus(A,B,C) and don't have to worry about burdening the unification
> algorithm with semantic subtleties.


For commutativity, yes; for associativity combined with commutativity, no.
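To make the commutative half concrete, here is a minimal sketch of that normalization, assuming a toy tuple representation rather than SymPy's actual internals: flatten nested associative ops and sort the arguments of commutative ops into a canonical order.

```python
ASSOC_COMM = {"+", "*"}  # toy set of associative-commutative operators

def normalize(expr):
    if not isinstance(expr, tuple):
        return expr  # atom: a symbol or number
    op, *args = expr
    args = [normalize(arg) for arg in args]
    if op in ASSOC_COMM:
        flat = []
        for arg in args:
            # flatten A + (B + C) into (+ A B C)
            if isinstance(arg, tuple) and arg[0] == op:
                flat.extend(arg[1:])
            else:
                flat.append(arg)
        # a total order makes plain syntactic matching handle commutativity
        args = sorted(flat, key=repr)
    return (op, *args)

print(normalize(("+", "A", ("+", "B", "C"))))  # ('+', 'A', 'B', 'C')
```

After this pass, two expressions that differ only in grouping and argument order compare equal syntactically, which is exactly what makes the purely commutative case easy.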

Match x + B to A + B + C.  This problem is typically solved for a single
pattern by performing a bipartite matching.  It's more complex when there
exist several patterns and you don't want to check each of them linearly.
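A hedged sketch of that single-pattern case, with plain strings standing in for terms (the `ac_match` helper is illustrative; with nested subpatterns each pairing would be a recursive match attempt, and a genuine bipartite matching is needed once a pattern term can match several different subject terms):

```python
def ac_match(pattern_terms, subject_terms, wild):
    """Match a commutative pattern like  x_ + B  against  A + B + C.

    pattern_terms: concrete terms plus one wildcard; subject_terms: list.
    Returns a binding {wild: leftover terms} or None on failure.
    """
    concrete = [t for t in pattern_terms if t != wild]
    remaining = list(subject_terms)
    # With atomic terms the bipartite matching degenerates into multiset
    # containment: pair each concrete pattern term with one subject term.
    for term in concrete:
        if term not in remaining:
            return None
        remaining.remove(term)
    # The wildcard absorbs whatever subject terms are left over.
    return {wild: remaining}

print(ac_match(['x_', 'B'], ['A', 'B', 'C'], 'x_'))
# binds the wildcard to the leftover terms: {'x_': ['A', 'C']}
```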

>
> For trig simplification, there is a paper that I think uses some
> systematic way to simplify it.  So it might be that for that you don't
> need to check all the trig identities.
>
Yeah, so this is actually in SymPy.  Chris Smith already implemented this
in fu.py.  Note that this doesn't use pattern matching, but it does search
through a tree of small transformations.

    >>> fu(sin(x)/cos(x))  # default objective function
    tan(x)
    >>> fu(sin(x)/cos(x), measure=lambda x: -x.count_ops())  # maximize op count
    sin(x)/cos(x)

I used the strategies module to pull out the tree search stuff.  See either
sympy.strategies or github.com/logpy/strategies/
It is used in fu.py here:

https://github.com/sympy/sympy/blob/821fc389bcb7285555f6ec46c32afa38faceeab1/sympy/simplify/fu.py#L1597
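In the same spirit, the core idea of strategy combinators can be sketched in a few lines (illustrative re-implementations operating on toy integer "expressions", not sympy.strategies' exact API):

```python
def exhaust(rule):
    """Apply `rule` repeatedly until it stops changing the expression."""
    def run(expr):
        new = rule(expr)
        while new != expr:
            expr, new = new, rule(new)
        return expr
    return run

def chain(*rules):
    """Apply each rule once, in sequence."""
    def run(expr):
        for rule in rules:
            expr = rule(expr)
        return expr
    return run

# Toy rules on integers standing in for transformations on expression trees:
halve_even = lambda n: n // 2 if n % 2 == 0 else n
dec_odd = lambda n: n - 1 if n % 2 == 1 else n

simplify = exhaust(chain(halve_even, dec_odd))
print(simplify(40))  # 0
```

Composing small, total transformations like this is what lets the tree-search logic live apart from the mathematical rules themselves.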

> I think that's the general approach.  Do not exhaust the decision tree,
> follow a search strategy.
> Applies to trig, integrals, polynomial simplification, and probably almost
> all areas of mathematics.
>
> Can this be captured as a simple priority value on each substitution rule?


You can get pretty far with a simple priority value, but greedy solutions
often don't produce optimal results.  My approach is to follow the greedy
path first and then allow backtracking, so the user can ask for extra
results.
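That greedy-with-backtracking idea can be sketched as a generator over a best-first frontier (the names and the toy rewrite rules here are illustrative, not SymPy code):

```python
import heapq
from itertools import islice

def greedy_search(root, children, measure):
    """Yield candidates best-first by `measure`; the caller can stop at the
    first (greedy) result or keep pulling to backtrack into the frontier."""
    seen = {root}
    frontier = [(measure(root), 0, root)]
    tiebreak = 1  # keeps the heap totally ordered without comparing nodes
    while frontier:
        _, _, node = heapq.heappop(frontier)
        yield node
        for child in children(node):
            if child not in seen:
                seen.add(child)
                heapq.heappush(frontier, (measure(child), tiebreak, child))
                tiebreak += 1

# Toy rules on integers standing in for expression rewrites,
# preferring small absolute values:
rules = lambda n: [n - 1, n // 2] if n > 0 else []
results = list(islice(greedy_search(10, rules, abs), 5))
# results == [10, 5, 2, 1, 0]
```

Because the frontier is kept around, asking for one result is cheap and greedy, while asking for more amounts to backtracking into the next-best unexplored branches.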

-- 
You received this message because you are subscribed to the Google Groups 
"sympy" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/sympy/CAJ8oX-GKTNEkdakDCJf-eSW70cRC4xPD_dnyCQjAuTrLtgEFKw%40mail.gmail.com.
