I figured out the error: there was a padding operation in my code that was not differentiable.
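
For anyone who hits the same thing: the padding broke the graph between the cost and w_conv3d_l1, so T.grad had no differentiable path to follow. Below is a minimal sketch of zero-padding that stays differentiable, written with T.zeros and T.set_subtensor. This is not my original code; the pad_3d name and the (batch, depth, height, width, channels) layout are just assumptions for illustration.

import theano.tensor as T

def pad_3d(x, pad):
    # Zero-pad the three spatial axes of a 5D tensor laid out as
    # (batch, depth, height, width, channels), by `pad` voxels per side.
    shp = x.shape
    out = T.zeros((shp[0],
                   shp[1] + 2 * pad,
                   shp[2] + 2 * pad,
                   shp[3] + 2 * pad,
                   shp[4]), dtype=x.dtype)
    # set_subtensor compiles to IncSubtensor, which has a gradient,
    # so the padded result stays connected to whatever produced x.
    return T.set_subtensor(
        out[:, pad:shp[1] + pad, pad:shp[2] + pad, pad:shp[3] + pad, :],
        x)

Because set_subtensor has a gradient, the padded output stays part of the computational graph of the cost, and T.grad no longer sees a dead end.
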
On Monday, February 27, 2017 at 7:52:37 PM UTC-8, Asghar Inanlou Asl wrote:
>
> Hi all,
> When I run the following code, I get what is probably a very basic error:
>
> theano.gradient.DisconnectedInputError: grad method was asked to compute
> the gradient with respect to a variable that is not part of the
> computational graph of the cost, or is used only by a non-differentiable
> operator: w_conv3d_l1
>
> Any ideas?
> Thanks!
>
> import numpy as np
> import theano
> import theano.tensor as T
>
>
> def conv_3d(inpt, filter_shape, stride=(1, 1, 1), layer_name='', mode='valid'):
>     w = theano.shared(np.asarray(np.random.normal(loc=0, scale=np.sqrt(1. / np.prod(filter_shape)),
>                                                   size=filter_shape),
>                                  dtype=theano.config.floatX),
>                       name='w_conv3d_' + layer_name, borrow=True)
>     b = theano.shared(np.asarray(np.random.normal(loc=0.0, scale=1.0, size=[filter_shape[0]]),
>                                  dtype=theano.config.floatX),
>                       name='b_conv3d_' + layer_name, borrow=True)
>     return T.nnet.conv3D(inpt, w, b, stride), [w, b]
>
>
> if __name__ == "__main__":
>     X = T.TensorType(theano.config.floatX, (False,) * 5)('x')
>
>     L1, l1_params = conv_3d(X, (1, 5, 5, 5, 1), mode='same', layer_name='l1')
>     L4, l4_params = conv_3d(L1, (1, 5, 5, 5, 1), mode='same', layer_name='l2')
>
>     cost = T.sum((X - L4) ** 2)
>
>     params = l4_params
>     params += l1_params
>     grads = T.grad(cost, params)
>
>     mode = theano.compile.get_default_mode()
>     mode = mode.including('conv3d_fft', 'convtransp3d_fft', 'convgrad3d_fft')
>
>     x = np.random.rand(1, 10, 10, 10, 1)
>
>     updates = [(param, param - grad) for param, grad in zip(params, grads)]
>     get_cost = theano.function([], cost, updates=updates,
>                                givens={X: x.astype(theano.config.floatX)},
>                                allow_input_downcast=True, mode=mode)
>
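
For future readers: the error itself is easy to reproduce in isolation by asking T.grad for the gradient with respect to a shared variable the cost never touches. A minimal sketch (the variable names here are made up):

import numpy as np
import theano
import theano.tensor as T

w = theano.shared(np.ones(3, dtype=theano.config.floatX), name='w')
x = T.vector('x')
cost = T.sum(x ** 2)   # the cost never uses w
# Raises theano.gradient.DisconnectedInputError with the same message
# as above, because w is not part of the graph of cost.
g = T.grad(cost, w)

T.grad's disconnected_inputs argument ('raise', 'warn', or 'ignore') controls whether this is fatal; the default is 'raise', which is what produced the message above.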