>> However, while we can generate Fortran code, compile it and use
>> that for plotting, in many cases it is overkill, and a simple approach
>> 1) or 2) provides enough speed to get the job done.
>
>
> I'm not sure that this is the right kind of overkill we're looking at here.

There is one more thing. If a user is **not** relying at least on
numpy for evaluating arrays, or on autowrap for evaluating many
unpredictable points, they are simply doing it wrong.
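
To make this concrete, here is roughly what the numpy route looks like
(just a sketch, the expression and sizes are made up); autowrap would
instead give you a compiled scalar callable, at the cost of needing a
compiler at run time:

    import numpy as np
    from sympy import symbols, sin, exp, lambdify

    x = symbols('x')
    expr = sin(x) * exp(-x**2)
    points = np.linspace(0, 10, 100000)

    # the slow way: one evalf call per point
    # values = [expr.evalf(subs={x: p}) for p in points]

    # the numpy way: one vectorized call for all points
    f = lambdify(x, expr, 'numpy')
    values = f(points)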

The one exception is code inside sympy itself, which it was agreed at
some point should not require more than a Python interpreter. However,
if you are using nsolve with lambdify you are again doing it wrong,
because the only advantage our nsolve has over scipy is that it can
work in arbitrary precision. If you don't need that, just use scipy;
do not kill precision for the sake of performance inside sympy. The
other example is plotting, but because of matplotlib that already
requires numpy anyway.
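
For illustration (again just a sketch: the equation and the precision
are arbitrary, and the prec keyword assumes a sympy version recent
enough to accept it on nsolve):

    import math
    from scipy.optimize import brentq
    from sympy import symbols, cos, nsolve

    x = symbols('x')

    # mpmath-backed nsolve: arbitrary precision, but slower
    root_mp = nsolve(cos(x) - x, x, 0.7, prec=50)

    # scipy: machine precision only, and fast -- enough in most cases
    root_fp = brentq(lambda t: math.cos(t) - t, 0, 1)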

Hence, with all the other stuff that needs refactoring in sympy, it
does not seem wise IMO to spend time on a complicated AST-based
module when we already have the low-performance evalf and the
easy-to-implement, high-performance eval_to_numpy idea.
