Typical neural network models compute the error from their output, comparing it 
with the ground truth via some custom function. I'm trying to build a model in 
which the output vector *O* of a CNN is used to extract a feature vector *F* 
from the original input images *I*; that is, I want to compute the error and 
gradients of the model using *F*, not *O* directly. My problem is that Theano 
doesn't "understand" that *F* is part of the computational graph of the cost, 
and it raises the following error:



    File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 532, in handle_disconnected
        raise DisconnectedInputError(message)
    theano.gradient.DisconnectedInputError: grad method was asked to compute
    the gradient with respect to a variable that is not part of the
    computational graph of the cost, or is used only by a non-differentiable
    operator: convolution2d_1_W

I tried passing *disconnected_inputs='ignore'* to the *T.grad()* function and 
my program could run, but that just means the partial derivative of my loss 
function with respect to *F* is treated as 0 (zero), which doesn't solve my 
problem. How can I make Theano understand *F* as part of the computational 
graph of the cost?
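For reference, the usual cause of this error is that *F* is computed outside 
the graph (for example, by evaluating *O* and running a NumPy routine on the 
result), so Theano sees no symbolic path from the cost back to the weights. 
Below is a minimal sketch of the fix under that assumption; the attention-style 
feature extractor and every name beyond *I*, *O*, *F*, and the weight *W* are 
made up for illustration:

    import numpy as np
    import theano
    import theano.tensor as T

    # Hypothetical stand-in for the CNN; names (I, W, O, F) follow the post.
    I = T.tensor4('I')  # input images, shape (batch, channels, height, width)
    W = theano.shared(
        np.random.randn(8, 3, 3, 3).astype(theano.config.floatX), name='W')
    O = T.nnet.sigmoid(
        T.nnet.conv2d(I, W, border_mode='half'))  # CNN output, same H x W as I

    # WRONG: leaving the symbolic graph. If F is computed by evaluating O and
    # running NumPy code on the result, T.grad cannot trace F back to W and
    # raises DisconnectedInputError:
    #   O_np = O.eval({I: batch})
    #   F_np = my_numpy_feature_extractor(batch, O_np)  # graph is broken here

    # RIGHT: express the feature extraction itself with Theano ops, so F stays
    # a symbolic function of O and I. As an illustrative (made-up) extractor,
    # use O as a per-pixel weight map and pool the weighted images:
    att = T.mean(O, axis=1, keepdims=True)   # (batch, 1, H, W) weight map
    F = T.mean(I * att, axis=[2, 3])         # feature vector, (batch, channels)

    target = T.matrix('target')
    cost = T.mean((F - target) ** 2)

    # Gradients now flow cost -> F -> O -> W; no disconnected_inputs needed.
    gW = T.grad(cost, W)
    train = theano.function([I, target], cost, updates=[(W, W - 0.01 * gW)])

If the extraction genuinely cannot be written with existing Theano ops, the 
alternative is to wrap it in a custom theano.Op that implements a grad() 
method, but keeping *F* symbolic as above is the simpler route.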
