The problem is called the "proviso" problem, as in things like
1/x provided x != 0
I did thesis work in this area. One attack is to use
cylindrical algebraic decomposition (CAD). A second approach
is to build a tree of expressions split on interval assumptions,
such as
[1/x where x < 0] [1/x where x = 0] [1/x where x > 0]
The computation proceeds in three branches, each under its own
interval assumption. If a new proviso arises, the tree forks again:
[1/x where x < 0 && 1/y where y < 0]
[1/x where x < 0 && 1/y where y = 0]
[1/x where x < 0 && 1/y where y > 0]
[1/x where x = 0 && 1/y where y < 0]
[1/x where x = 0 && 1/y where y = 0]
[1/x where x = 0 && 1/y where y > 0]
[1/x where x > 0 && 1/y where y < 0]
[1/x where x > 0 && 1/y where y = 0]
[1/x where x > 0 && 1/y where y > 0]
Note that these can all be run in parallel and the results
combined after the computation. This approach lends itself
to massively parallel computations.
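The forking idea above can be sketched in a few lines of Python. This is
only an illustration under simplifying assumptions: the function names are
hypothetical, and each sign case is evaluated at a representative value
rather than symbolically, as a real CAS would do.

```python
from itertools import product

SIGN_CASES = ("< 0", "= 0", "> 0")

def fork_over(variables):
    """Build the full tree of sign-case combinations: 3**n branches."""
    return [dict(zip(variables, combo))
            for combo in product(SIGN_CASES, repeat=len(variables))]

def evaluate_branch(assignments):
    """Evaluate 1/x + 1/y under one combination of sign assumptions,
    using a representative value per case (a stand-in for symbolic
    evaluation). Returns (label, result); result is None when the
    proviso is violated (division by zero)."""
    rep = {"< 0": -1, "= 0": 0, "> 0": 1}
    label = " && ".join(f"1/{v} where {v} {c}"
                        for v, c in assignments.items())
    try:
        result = sum(1 / rep[c] for c in assignments.values())
    except ZeroDivisionError:
        result = None  # this branch's assumption kills the expression
    return label, result

branches = fork_over(["x", "y"])   # the 9 branches listed above
# map() runs serially here; swapping in a process pool's map would
# run the branches in parallel, since they are fully independent.
results = dict(map(evaluate_branch, branches))
```

Because no branch depends on any other, the combine step at the end is a
plain merge of per-branch results, which is what makes the approach suit
massively parallel hardware.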
I did a literature analysis of a couple hundred textbooks
and found that approximately 80 percent of all equation
provisos could be rewritten into interval constraints.
A simple "assume" facility does not begin to address the
problem, for several reasons; in particular, provisos can
arise during intermediate computations.
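To see why a static "assume" falls short, consider a sketch (hypothetical
names, assumed example) where a divisor such as y - x only shows up at an
intermediate step: no up-front assumption could have anticipated it, so the
branch list must re-fork on the fly.

```python
def fork(branches, divisor):
    """Split every live branch into three sign cases the moment a
    new divisor (proviso) appears in the computation."""
    return [b + [(divisor, case)] for b in branches
            for case in ("< 0", "= 0", "> 0")]

branches = [[]]                     # one unconstrained branch to start
branches = fork(branches, "x")      # step 1: 1/x appears     -> 3 branches
branches = fork(branches, "y - x")  # step 2: 1/(y - x) appears -> 9 branches
```

Each branch carries the conjunction of every assumption made so far, so the
tree grows exactly as in the nine-case list above, but driven by the
computation itself rather than declared in advance.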
Tim Daly
On 9/23/2010 8:36 AM, Burcin Erocal wrote:
On Wed, 22 Sep 2010 11:40:44 -0700 (PDT)
rjf <fate...@gmail.com> wrote:
Many features in Maxima do not use the "assume" features at all.
If Macsyma were to be redesigned from the ground up, the issues
related to assume etc would probably be addressed at a foundational
level.
To the extent that other computer algebra systems claim to be a fresh
look at issues, it appears that they have all failed to address this
one.
Instead, they ignore "assumptions" and later patch them on in
peculiar ways, providing access to this information only from some
specific programs, e.g. Mathematica's Integrate, Reduce, and
Simplify. But probably not much else.
So this known problem (at least since 1974) was off the radar of
the brainiacs who designed all those subsequent systems, including,
I suppose, Sage.
I think it would be a huge overstatement to say that the symbolics
subsystem in Sage was "designed" in any way. IMHO, it was mostly
patched together to support educational use, then acquired more cruft
through several rewrite attempts and cramped schedules.
I am definitely not an expert in this field and have no idea how the
assumptions should work. If you can provide some references, perhaps
these could be used as starting points when/if somebody decides to
work on this.
Here is the only reference I found on this topic:
http://portal.acm.org/citation.cfm?id=680466
The article is available for download here (for those with access):
http://www.springerlink.com/content/p77364025wh6j7h5/
Burcin
--
To post to this group, send an email to sage-devel@googlegroups.com