I'm working on the LeNet network from the deep learning tutorials, and I 
applied ReLU to the convolution output. I'm not sure whether, with all the 
parameters included, the gradient will take every function in the graph 
into account. 

The convolution is: 

conv_out = conv.conv2d(
    input=input,
    filters=self.W,
    filter_shape=filter_shape,
    image_shape=image_shape
)

rectified_conv_out = T.nnet.relu(conv_out + self.b.dimshuffle('x', 0, 'x', 'x'))

pooled_out = downsample.max_pool_2d(
    input=rectified_conv_out,
    ds=poolsize,
    ignore_border=True
)
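To illustrate what I mean by the gradient passing through every function: here is a toy NumPy sketch (made-up 2x2 input, chain rule done by hand, not the actual Theano graph) of how a gradient routes through ReLU followed by max-pooling:

```python
import numpy as np

# Toy forward pass mirroring the layer: ReLU, then max-pooling.
# The point: the ReLU mask and the pooling argmax both enter the gradient.
x = np.array([[1.0, -2.0],
              [3.0, -0.5]])

relu_out = np.maximum(x, 0.0)   # ReLU zeroes the negative entries
pooled = relu_out.max()         # max-pool over the whole 2x2 window

# Backward pass for d(pooled)/dx: the gradient flows only through the
# pooled maximum, and only where the ReLU input was positive.
grad = np.zeros_like(x)
idx = np.unravel_index(np.argmax(relu_out), relu_out.shape)
grad[idx] = 1.0 * (x[idx] > 0)

print(pooled)   # 3.0
print(grad)     # [[0. 0.], [1. 0.]]
```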

while the gradient update is: 


updates = [
    (param_i, param_i - learning_rate * grad_i)
    for param_i, grad_i in zip(classifier.params, grads)
]

And I did save all of the layer parameters in classifier.params. 
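For reference, the update list above is plain SGD applied pairwise. A toy NumPy sketch (with made-up values standing in for classifier.params and grads) of what that comprehension computes:

```python
import numpy as np

# Hypothetical stand-ins for classifier.params and grads; the update
# mirrors the list comprehension: param_i <- param_i - learning_rate * grad_i
# for every (param_i, grad_i) pair produced by zip(...).
learning_rate = 0.1
params = [np.array([1.0, 2.0]), np.array([0.5])]
grads = [np.array([0.2, -0.4]), np.array([1.0])]

updated = [p - learning_rate * g for p, g in zip(params, grads)]
# updated[0] is approximately [0.98, 2.04], updated[1] is [0.4]
```

Note that zip stops at the shorter sequence, so any parameter missing from either list is silently left out of the updates.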

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"theano-users" group.