Hello,

In Theano 0.8.2, T.nnet.sigmoid implemented a grad method:


<https://lh3.googleusercontent.com/-HbNMPmAXf6w/WIuT-0QDh0I/AAAAAAAAFK4/LON8RRdnG0QaS_xoyKI6bCj3zJNgnCzFgCLcB/s1600/Screen%2BShot%2B2017-01-27%2Bat%2B10.21.55%2BAM.png>
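In text form (paraphrasing the screenshot rather than quoting the exact source), that method scales the incoming gradient by the usual sigmoid derivative, something like:

    import numpy as np

    # Paraphrase of the 0.8.2 sigmoid grad, not the verbatim source:
    # the incoming gradient gz is scaled by d(sigmoid)/dx = s * (1 - s).
    def sigmoid_grad(x, gz):
        s = 1.0 / (1.0 + np.exp(-x))  # sigmoid(x)
        return gz * s * (1.0 - s)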


However, in the bleeding edge version of Theano, this method doesn't exist:
<https://lh3.googleusercontent.com/-OydJN6Fq_ho/WIuUdA-zaPI/AAAAAAAAFLA/YpvhqdTtFfUbM_mJiGtPBSlXW3LD1CIPwCLcB/s1600/Screen%2BShot%2B2017-01-27%2Bat%2B10.41.31%2BAM.png>

And yet, if I use the sigmoid function to compute something and call T.grad 
on the result, it works:
<https://lh3.googleusercontent.com/-CI3QuTfWltE/WIuW_QbTAeI/AAAAAAAAFLY/0lSzHFPibG4GKSU6uxvPdcRDBUShum7vgCLcB/s1600/Screen%2BShot%2B2017-01-27%2Bat%2B10.52.13%2BAM.png>
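
In text form, what I'm doing is roughly this (a minimal sketch; the variable names are just for illustration):

    import theano
    import theano.tensor as T

    x = T.dscalar('x')
    s = T.nnet.sigmoid(x)
    g = T.grad(s, x)           # differentiates through sigmoid just fine
    f = theano.function([x], g)
    print(f(0.0))              # 0.25 == sigmoid(0) * (1 - sigmoid(0))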

I looked at a number of other ops, and it appears they still retain their 
grad methods. Why was the grad method deprecated for Elemwise, and is 
there an easy way to access the sigmoid gradient operation in the current 
version of Theano?
