*******************************************************************
Python code:

from sympy import Symbol, sympify, factor, simplify

a = Symbol('a')
b = Symbol('b')
c = Symbol('c')

x = a*b + a*c
print("x=", x)
y = factor(x)
print("factor(x)=", y)
y = simplify(x)
print("simplify(x)=", y)

# replace the symbol a with the Float 1.2
a = sympify(1.2)

print(" ")
x = a*b + a*c
print("x=", x)
y = factor(x)
print("factor(x)=", y)
y = simplify(x)
print("simplify(x)=", y)

y = simplify(factor(x))
print("simplify(factor(x))=", y)
*********************************************************************

Output from the code using sympy 0.7.1 and also bleeding-edge sympy:

x= a*b + a*c
factor(x)= a*(b + c)
simplify(x)= a*(b + c)
 
x= 1.2*b + 1.2*c
factor(x)= 1.2*(b + c)
simplify(x)= 1.2*b + 1.2*c
simplify(factor(x))= 1.2*b + 1.2*c

*************************************************************

Why does simplify prefer:

1.2*b + 1.2*c

over:

1.2*(b + c)

especially since it prefers a*(b + c) over a*b + a*c when a is a symbol?
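One workaround (just a sketch, not an explanation of simplify's internals) is to convert the Float coefficient to an exact Rational with nsimplify(..., rational=True) before simplifying, so the expression is handled the same way as the exact symbolic case:

```python
from sympy import symbols, nsimplify, factor, simplify, Rational, expand

b, c = symbols('b c')
x = 1.2*b + 1.2*c

# Replace the Float 1.2 with the exact Rational 6/5, then work over
# the rationals instead of floats.
xr = nsimplify(x, rational=True)
print("rationalized:", xr)
print("factor:      ", factor(xr))
print("simplify:    ", simplify(xr))
```

If a floating-point coefficient is wanted in the final result, the factored expression can be evaluated numerically again afterwards with evalf().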

   Larry Wigton

-- 
You received this message because you are subscribed to the Google Groups 
"sympy" group.
To view this discussion on the web visit 
https://groups.google.com/d/msg/sympy/-/4EJsX9lTiNQJ.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to 
[email protected].
For more options, visit this group at 
http://groups.google.com/group/sympy?hl=en.
