On Mon, 04 Dec 2006 16:03:45 -0800, Robert Bradshaw  
<[EMAIL PROTECTED]> wrote:
> On Mon, 4 Dec 2006, Joel B. Mohler wrote:
>> On Mon, Dec 04, 2006 at 09:46:18AM -0800, Robert Bradshaw wrote:
>>> I think we are thinking of these indeterminates wrongly--they are
>>> dummy variables and should really have no context outside of the
>>> function definition. Another problem is that, if we say f = sin(a)+cos
>>> (x), then f(1,2) is ambiguous.

Because the order of arguments is not specified?   I disagree to some
extent, since you could likewise say that if f = a*x in Q[a,x], then
f(1,2) is ambiguous.  In the above example, just use the alphabetical
order of the variable names.
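One possible disambiguation rule could be sketched like this, using SymPy
as a stand-in for SAGE's symbolic machinery (SymPy here is an assumption
for illustration, not SAGE's actual API):

```python
# Sketch of the "alphabetical order" rule, using SymPy as a stand-in
# for SAGE's symbolic machinery (an assumption for illustration).
from sympy import symbols, sin, cos

a, x = symbols('a x')
f = sin(a) + cos(x)

# f(1, 2): substitute in alphabetical order of the variable names.
ordered = sorted(f.free_symbols, key=lambda s: s.name)   # [a, x]
result = f.subs(dict(zip(ordered, (1, 2))))              # sin(1) + cos(2)
```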

>>> f = sin(x) should be illegal (though I'm not sure how to throw an
>>> informative error here).

That means writing "sin(x)" is illegal, so

   show(plot(sin(x)))

is illegal, as are lots of other things like

   integrate(sin(x), x)

In Mathematica right now I can type

   Integrate[Sin[x], x]

and it works, which is a good thing.  In Maxima, I can type

   integrate(sin(x), x)

and it works fine.   *This* is the main design goal.  I.e., to
be able to do computations like above and have them work directly
in SAGE (without having to type maxima('integrate(sin(x), x)')).
Thus making "f = sin(x)" illegal violates our basic design goal.

Also, the following should work:

    integrate(sin(x*y) + z, x)

and give 'x*z - cos(x*y)/y' as output.
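This desired behavior can be sketched with SymPy as a stand-in (again an
assumption, not SAGE itself); the check below verifies the antiderivative
by differentiating it back:

```python
# Hedged sketch: SymPy as a stand-in for the desired SAGE behavior.
# positive=True sidesteps SymPy's case analysis for y == 0.
from sympy import symbols, sin, integrate, diff, simplify

x, y, z = symbols('x y z', positive=True)
expr = integrate(sin(x*y) + z, x)
# Expected, up to form: x*z - cos(x*y)/y
check = simplify(diff(expr, x) - (sin(x*y) + z))   # 0 if correct
```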

Examples like above are what I meant when I requested a list
of samples of what our proposed calculus, etc., functionality
for SAGE should do.    In fact, an excellent way to decide
what we want is to take a bunch of calculus problems from
a calculus book and try to make up how you would type
them into SAGE.   If the way isn't fairly straightforward
and natural for undergraduates, we have more work to do.

In math, single-letter variable names are totally standard,
so we're OK with that.   A possible problem is what to do
about things like
    sin(x_0 + x_1 + x_2^3)
How do we make it possible to specify subscripted variables?
Maybe x[0] x[1] x[2], etc., and make it so our formal
variables a,...,z, A,...,Z have their __getitem__ method overloaded
to return new variables?
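A minimal, purely hypothetical sketch of that __getitem__ idea (the class
name and printing details are invented for illustration):

```python
# Hypothetical sketch: formal single-letter variables whose
# __getitem__ mints new subscripted variables. All names invented.
class FormalVariable:
    def __init__(self, name):
        self.name = name

    def __getitem__(self, i):
        # x[0] yields a new formal variable printed as "x_0"
        return FormalVariable('%s_%d' % (self.name, i))

    def __repr__(self):
        return self.name


x = FormalVariable('x')
subscripted = [x[0], x[1], x[2]]   # variables x_0, x_1, x_2
```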

Moreover, and I can tell you this from lots of experience,
any use of the preparser must be avoided if at all possible.
Anything that involves the preparser is bad, because it
is very painful and error prone to move code that was written
with the preparser to library .py code.   I always regret that
the preparser is used at all, but there is no escaping that it
is needed a little.

>>> Instead one should have to type f(x) = sin
>>> (x). This would be pre-parsed to something like
>>>
>>> with inject_indeterminates('x'):
>>>      f = sin(x)
>>>
>>> Here inject_indeterminates would assign x to an indeterminate that
>>> knows both its name (for printing) and its position (in the tuple if
>>> f is called with multiple arguments, and x is recursively called down
>>> the tree). It would be a morphism of the category of sets and always
>>> act as the identity function on __call__. It could, say, know how to
>>> differentiate itself (return 1 if the variable with respect to
>>> differentiation matches self.name, 0 (or dx/dother) otherwise). It
>>> would support addition, exponentiation, etc. via a generic "sum" class
>>> that takes two functions and returns a function. Coercible into R[x],
>>> etc.
>>>
>>> (I'm not sure how contexts work; perhaps all the indeterminates
>>> in the block would have to be wrapped... this is still doable via
>>> regular expressions...)

Pain!
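For concreteness, the quoted indeterminate proposal might look roughly
like the following in plain Python (all class names and details are
hypothetical, not an actual implementation):

```python
# Rough, hypothetical sketch of the quoted proposal; all names invented.
class Indeterminate:
    """A dummy variable that knows its print name and argument position."""
    def __init__(self, name, position):
        self.name = name
        self.position = position

    def __call__(self, *args):
        # Acts as the identity on its own slot: picks out its argument.
        return args[self.position]

    def diff(self, var):
        # d(self)/d(var): 1 with respect to itself, 0 otherwise.
        return 1 if var.name == self.name else 0

    def __add__(self, other):
        return SumFunction(self, other)


class SumFunction:
    """Generic sum of two callables, itself callable as a function."""
    def __init__(self, f, g):
        self.f, self.g = f, g

    def __call__(self, *args):
        return self.f(*args) + self.g(*args)


x = Indeterminate('x', 0)
y = Indeterminate('y', 1)
f = x + y
value = f(2, 3)   # 5
```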

>>>
>>> Any comments?
>>
>> I like this proposal a great deal.  It fixes all of my complaints that
>> I've been bringing up about polynomial rings.  It is however very
>> difficult to actually implement this without pre-parser logic.  I'm
>> not certain how much of a downside that is.
>
> Yes, though at least here the preparser rule is stateless and simple. The
> "pythonic" way of defining functions is
>
> def f(x):
>      return x - sin(x)
>
> or even
>
> f = lambda x: x - sin(x)
>
> which doesn't allow us to do, say, (symbolic) calculus on the returned
> object--we'd probably want the returned result to be an instance of
> ElementaryFunction. It seems reasonable to use the preparser to create a
> simple, intuitive way of defining _mathematical_ functions.
>
> Also, this could co-exist with whatever (predefined or not) values are
> assigned to x.
>
>> One other downside is that these indeterminates would not be
>> elements in the polynomial ring, and so it seems that they would not
>> benefit from the speed of the polynomial arithmetic architecture.  That
>> is to say, you could have polynomials in SAGE which don't know they
>> are polynomials, so I guess all the computation would use the most
>> naive methods and not utilize the heavily optimized
>> polynomial code.  This would be rather sad.
>
> This is true, but I think that if someone wants the full speed and
> functionality of polynomials one should create the object as a polynomial
> (or at least cast it into such a ring--this should be fully supported).
> Also, polynomials (currently) are always stored in their expanded  
> version,
> which may or may not be desired (I'm thinking of calculus students here).
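The "cast it into such a ring" step can be sketched like so, using
SymPy's Poly as a stand-in (an assumption for illustration; SAGE's
actual rings and coercion differ):

```python
# Sketch of casting a symbolic expression into a polynomial ring,
# using SymPy's Poly as a stand-in (assumption, not SAGE's API).
from sympy import symbols, Poly

x = symbols('x')
expr = (x + 1)**3        # symbolic: stays unexpanded
p = Poly(expr, x)        # cast into the ring QQ[x]: expanded, fast arithmetic
coeffs = p.all_coeffs()  # [1, 3, 3, 1]
```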

