While reading the documentation on creating a new Op, I can't understand the
grad() method in the example at
http://deeplearning.net/software/theano/extending/extending_theano.html#example-op-definition.
 
Why does it return output_grads[0] * 2 rather than just 2, and what does
output_grads[0] represent?
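
My guess is that output_grads[0] is the gradient of the final cost with
respect to this Op's output, so grad() has to apply the chain rule. For the
doubling Op y = 2*x that would give

    d(cost)/dx = d(cost)/dy * dy/dx = output_grads[0] * 2

which matches the example.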
If that is right, then in the next example at
http://deeplearning.net/software/theano/extending/extending_theano.html#example-props-definition,
why does grad() return

    a * output_grads[0] + b

rather than a * output_grads[0]?
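
For y = a*x + b the chain rule says the additive constant should vanish when
differentiating, so I would have expected something like the following (just
a sketch of my reading, assuming the Op stores the scalars as self.a and
self.b):

    def grad(self, inputs, output_grads):
        # d(cost)/dx = d(cost)/dy * dy/dx, and dy/dx = a for y = a*x + b,
        # so the constant b should not appear in the gradient at all.
        return [self.a * output_grads[0]]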


And how about a more complicated custom Op, say
y = exp(x1) / (a*x1**3 + log(x2))? How should its grad() be written?
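
Here is my attempt, just applying the quotient rule by hand. It is only a
sketch: it assumes the Op takes two tensor inputs, stores the scalar as
self.a, and that the gradient expressions can be built with theano.tensor.

    import theano.tensor as T

    def grad(self, inputs, output_grads):
        x1, x2 = inputs
        gz = output_grads[0]              # d(cost)/dy from the chain rule
        v = self.a * x1**3 + T.log(x2)    # the denominator of y
        # quotient rule: dy/dx1 = exp(x1) * (v - 3*a*x1**2) / v**2
        dy_dx1 = T.exp(x1) * (v - 3 * self.a * x1**2) / v**2
        # dy/dx2 = -exp(x1) / (x2 * v**2)
        dy_dx2 = -T.exp(x1) / (x2 * v**2)
        # one gradient per input, in the same order as inputs
        return [gz * dy_dx1, gz * dy_dx2]

Is that the right pattern, i.e. multiply output_grads[0] by the partial
derivative with respect to each input and return them in input order?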
