I'm facing a similar problem.

I'm trying to implement a VAE for audio.

The code that is causing the problem is, in brief:

z_mean = Dense(self.z_dim, init=initialization, activation='linear')(H2)
z_mean = LeakyReLU(alpha=.001)(z_mean)

z_log_var = Dense(self.z_dim, init=initialization, activation='linear')(H2)
z_log_var = LeakyReLU(alpha=.001)(z_log_var)

# sample z from N(z_mean, exp(z_log_var)) via self.sampling
z = Lambda(self.sampling, output_shape=K.int_shape(z_mean))([z_mean, z_log_var])

# this line is causing all the trouble
H3 = Dense(input_dim - 1, init=initialization, activation='linear')(z)
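
For context, self.sampling is the standard reparameterization trick, along these lines:

def sampling(self, args):
    # z = z_mean + sigma * epsilon, with epsilon ~ N(0, I)
    z_mean, z_log_var = args
    epsilon = K.random_normal(shape=K.shape(z_mean), mean=0., std=1.)
    return z_mean + K.exp(z_log_var / 2) * epsilon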

The gradients fail to propagate back to z_log_var and z_mean. When I run

grads = K.gradients(cost, trainable_vars)

the "Backtrace when that variable is created" in the error message points to 
the Dense line marked above.




On Saturday, January 16, 2016 at 9:02:06 AM UTC-6, Yang Xiang wrote:
>
> Hi all,
>
> I encountered theano.gradient.DisconnectedInputError when I wrote my code 
> for an end-to-end process. I have a series of parameters to update. In 
> order to check which parameter caused the disconnected-input error, I 
> removed them from the function's parameters one by one. But even after I 
> removed all the parameters (params=[]), the error was still there. What 
> does this mean?
>
> The error report stated: theano.gradient.DisconnectedInputError: grad 
> method was asked to compute the gradient with respect to a variable that is 
> not part of the computational graph of the cost, or is used only by a 
> non-differentiable operator: <TensorType(float64, 4D)>
>
> Could anyone help?
>
> Thanks.
>
> Yang
>
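
For anyone running into the same error: a minimal plain-Theano sketch that 
reproduces it, using a variable the cost never touches:

import theano
import theano.tensor as T

x = T.vector('x')
w = T.vector('w')       # declared but never used in the cost
cost = (x ** 2).sum()   # cost does not depend on w

# Raises theano.gradient.DisconnectedInputError, because w is not
# part of the computational graph of cost.
g = theano.grad(cost, wrt=w)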
