Hi all,

I would like to calculate the gradient of a cost with respect to an updated parameter, like this:

import theano
import theano.tensor as T
x = T.dscalar('x')
cost = x ** 2
grad = T.grad(cost, x)
#update parameter
x=x+1
g_new = T.grad(cost, x)

The first gradient is computed fine, but the second call raises a 
DisconnectedInputError:

DisconnectedInputError                    Traceback (most recent call last)
<ipython-input-11-fbd941f2d325> in <module>()

      7 #update parameter
      8 x=x+1
----> 9 g_new = T.grad(cost, x)
     10 pp(grad)  # print out the gradient prior to optimization
     11 pp(g_new)

I don't understand this error, since the same gradient calculation works the 
first time. What do I need to change in the update so that the second 
gradient can be computed? Many thanks in advance.
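
In case it clarifies what I am aiming for: I suspect the intended workflow uses 
a shared variable with an update in theano.function, roughly like the sketch 
below (the names and the initial value are just mine; I am not certain this is 
the right way to get the gradient at the updated value):

import numpy as np
import theano
import theano.tensor as T

# parameter as a shared variable so its value can actually be updated
x = theano.shared(np.asarray(1.0), name='x')
cost = x ** 2
grad = T.grad(cost, x)

# the gradient is returned for the current value of x,
# then the update x <- x + 1 is applied
step = theano.function([], grad, updates=[(x, x + 1)])

print(step())  # gradient at x = 1
print(step())  # gradient at x = 2, i.e. after the update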
