Hi everyone,

I was using SymPy to derive an analytic expression for the Jacobian matrix of 
a least-squares system.  It worked very well, and I simplified the system 
down to an equation that is very easy to evaluate, but which involves 
a vector of residuals.

The final SymPy expression is this:

Sum((-10.0*exp(-0.1*t) + 10.0)*d, (t, 1, 50))


Here t is a symbol and d is an IndexedBase.  How can I numerically replace 
d with a vector containing 50 residuals, and actually compute the sum?  It 
would be trivial to code this up in NumPy using a loop, but since I was 
preparing an IPython notebook, I thought it would be nice to do as much as 
possible within SymPy.
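One possible approach (a sketch, not something settled in this post): since the Sum has concrete finite limits, `doit()` expands it into 50 explicit terms, after which each `d[i]` can be substituted with a number via `subs`. The `residuals` array below is a placeholder standing in for the real residual vector.

```python
import numpy as np
import sympy as sp

# Rebuild the expression from the post.
t = sp.symbols('t', integer=True)
d = sp.IndexedBase('d')
expr = sp.Sum((-10.0*sp.exp(-0.1*t) + 10.0)*d[t], (t, 1, 50))

# Placeholder residual vector (an assumption; use your real residuals here).
residuals = np.linspace(0.5, 1.5, 50)

# Expand the finite Sum into explicit terms d[1], d[2], ..., d[50],
# then substitute a number for each indexed entry.  The SymPy index
# runs 1..50, so shift by one into the 0-based NumPy array.
expanded = expr.doit()
total = float(expanded.subs({d[i]: residuals[i - 1] for i in range(1, 51)}))
```

For repeated evaluation with different residual vectors, building a numeric function once (e.g. with `sp.lambdify` on the expanded expression) would likely be faster than calling `subs` each time, though the 1-based indexing needs care.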


The equivalent numpy code would be this:


import numpy as np

tot = 0.0
for t in range(1, 51):  # t runs 1..50, matching the Sum's limits
    tot += (-10*np.exp(-0.1*t) + 10)*d[t - 1]

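On the NumPy side, the loop can also be written in vectorized form (a sketch; `d` here is a placeholder for the hypothetical length-50 residual vector):

```python
import numpy as np

# Placeholder residuals (an assumption; substitute the real vector).
d = np.linspace(0.5, 1.5, 50)

t = np.arange(1, 51)  # t = 1..50, matching the Sum's limits
tot = np.sum((-10*np.exp(-0.1*t) + 10) * d)
```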


-- 
You received this message because you are subscribed to the Google Groups 
"sympy" group.