Joseph Turian <joseph@...> writes:

> 
> >> Can anyone compare Theano and openopt for automatic differentiation?
> >
> > I guess I could, but it will hardly be objective, being done by its developer.
> 
> I don't mind a biased perspective; please offer it and give your
> feedback on what you'd emphasize.
 

I don't know what you mean to compare -
its speed, capabilities, convenience, etc.

1st, FuncDesigner is 100% Python + numpy code (without C, Fortran, etc.),
which simplifies its development, installation and debugging.

2nd, the syntax of Theano AD seems somewhat complicated in comparison to FD's.

Consider the example of getting the derivative of x**2 from the Theano docs:

http://deeplearning.net/software/theano/tutorial/gradients.html

>>> import theano.tensor as T
>>> from theano import function, pp
>>> x = T.dscalar('x')
>>> y = x ** 2
>>> gy = T.grad(y, x)
>>> pp(gy)  # print out the gradient prior to optimization
'((fill((x ** 2), 1.0) * 2) * (x ** (2 - 1)))'
>>> f = function([x], gy)
>>> f(4)
array(8.0)
>>> f(94.2)
array(188.40000000000001)

In FD:
from FuncDesigner import *
a = oovar('a')
f = a**2
point1 = {a:94.2}
point2 = {a:[1,2,3,4]}
print(f.D(point1))
# {a: 188.4}
print(f.D(point2))
# {a: array([[ 2.,  0.,  0.,  0.],
#            [ 0.,  4.,  0.,  0.],
#            [ 0.,  0.,  6.,  0.],
#            [ 0.,  0.,  0.,  8.]])}

In FD, all function definitions look like ordinary numpy code,
so if you have an ordinary Python file with numpy functions / variables,
all you have to do to involve FD is replace
"from numpy import sin, cos, ..." with
"from FuncDesigner import sin, cos, ..."

(see also "Passing oovars through ordinary Python functions":
http://openopt.org/FuncDesignerDoc#Passing_oovars_through_ordinary_Python_functions )
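
For instance, a minimal sketch of that import swap (my_model is a
hypothetical name; this assumes FD exports sin and cos, as its docs
describe):

from FuncDesigner import oovar, sin, cos

# an "ordinary" numpy-style function; with the FD import above
# it accepts oovars as well as plain numbers
def my_model(x):
    return sin(x) + cos(x)**2

x = oovar('x')
f = my_model(x)
print(f.D({x: 1.5}))  # derivative of sin(x) + cos(x)**2 at x = 1.5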

3rd, you can involve DerApproximator
(finite-difference derivative approximation,
http://openopt.org/DerApproximator )

to differentiate those parts of code that are not FD-created,
e.g. functions connected from C or Fortran, like fft

(see the example at http://openopt.org/FuncDesignerDoc#Creating_special_oofuns )
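
As a standalone illustration, a minimal sketch of DerApproximator's
basic usage (assuming the get_d1 entry point from its docs):

from numpy import sin
from DerApproximator import get_d1

# finite-difference approximation of d/dx sin(x) at x = 1.0;
# the exact value is cos(1.0) ~ 0.5403
print(get_d1(sin, 1.0))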

4th, unlike Theano, FD can perform sparse AD.

5th, FD and its AD are well integrated with an essential optimization
framework (OpenOpt), so there is no need to connect the gradients for
each constraint and objective one by one; see the sketch below.
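
A minimal sketch of what that looks like (the toy objective and
constraint are made up; it assumes the NLP constructor and the ralg
solver as shown in the OpenOpt docs):

from FuncDesigner import *
from openopt import NLP

a, b = oovars('a', 'b')
f = (a - 1)**2 + (b - 2)**2   # objective
startPoint = {a: 0, b: 0}

# gradients of the objective and the constraint are obtained
# automatically via FD's AD - nothing has to be wired up by hand
p = NLP(f, startPoint, constraints = [a + b < 1.5])
r = p.solve('ralg')  # r holds the optimal point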

You could take a look at the FD doc ( http://openopt.org/FuncDesignerDoc )
for further differences;

for AD you should also pay attention to
http://openopt.org/FuncDesignerDoc#Translator .

Of course, Theano has some pros over FD, e.g. it can already handle 2nd
derivatives.
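
For instance, a minimal sketch of a 2nd derivative in Theano,
continuing the scalar example above (for scalars, grad can simply be
applied twice):

>>> gy2 = T.grad(gy, x)   # derivative of 2*x, i.e. the constant 2
>>> f2 = function([x], gy2)
>>> f2(94.2)
array(2.0)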

