Hi Lijun,

I have been away from Keras for a while, but I think your problem should 
be easy to debug. You can draw the graph yourself by plotting all the 
nodes and arcs, or you can use the visualization function provided by 
Keras: https://keras.io/visualization/.
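
For example, something like this (a minimal sketch; it needs pydot and 
graphviz installed, and the exact import depends on your Keras version: 
newer releases expose keras.utils.plot_model, older ones 
keras.utils.visualize_util.plot):

from keras.models import Sequential
from keras.layers import Dense
from keras.utils import plot_model

model = Sequential()
model.add(Dense(8, input_dim=4, activation='relu'))
model.add(Dense(1))

# Write the graph to a PNG; show_shapes labels each arc with its tensor shape
plot_model(model, to_file='model.png', show_shapes=True)
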
Hope this helps! 

Good luck!
Yang

On Friday, December 2, 2016 at 4:29:47 PM UTC+8, Lijun Wu wrote:
>
> Hi Yang,
>
> can you share more about how to draw the graph and label the parameters 
> for debugging? I ran into a similar problem, but my usage is different: 
> I first trained a modelA, then used this modelA as part of a function 
> while training a modelB; modelA should stay fixed. But when I try to 
> compile modelB, it raises this error: DisconnectedInputError: xxxx by a 
> non-differentiable operator: b.
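>
> Concretely, the setup looks roughly like this (a minimal sketch; the 
> sizes and layer names are placeholders):
>
> from keras.models import Model, Sequential
> from keras.layers import Input, Dense
>
> # stand-in for the pre-trained modelA, which in reality is trained earlier
> modelA = Sequential()
> modelA.add(Dense(10, input_dim=100, activation='relu'))
>
> # modelA should stay fixed while modelB trains
> for layer in modelA.layers:
>     layer.trainable = False
>
> inp = Input(shape=(100,))
> features = modelA(inp)      # reuse modelA as a component of modelB
> out = Dense(1)(features)
> modelB = Model(inp, out)
> modelB.compile(optimizer='sgd', loss='mse')   # the error is raised here
>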
> But I don't even know where this b comes from. Could you give some 
> advice? Thanks. 
>
> On Wednesday, January 20, 2016 at 9:24:21 AM UTC+8, Yang Xiang wrote:
>>
>> Thanks Daniel,
>>
>> I tried drawing out the whole graph and labeling all the parameters 
>> along the path, and finally fixed the problem. The name of one parameter 
>> (copied from the forward RNN to the backward one) was misspelled. 
>>
>> I still have no idea why passing an empty parameter list to grad would 
>> also cause this problem, and I suggest that this exception be thrown 
>> with the name of the disconnected parameter rather than only its type 
>> information. One more thing I learned: drawing the graph and labeling 
>> the parameters is a good way to debug this disconnect error.
>>
>> Yang
>>
>> On Sunday, January 17, 2016 at 9:53:53 PM UTC+8, Daniel Renshaw wrote:
>>>
>>> Are you saying you have code of this form:
>>>
>>> import theano.tensor as tt
>>>
>>> x = tt.matrix()
>>> c = tt.sum(2 * x)    # some scalar cost
>>> gs = tt.grad(c, [])  # gradient of the cost w.r.t. an empty parameter list
>>>
>>> i.e. an attempt to compute the gradient of some cost c with respect 
>>> to... nothing, is generating the exception whose details you posted?
>>>
>>> If so, we'll probably need to see what the cost computation is; can you 
>>> share more code? Have you been able to reproduce the problem with simple 
>>> code that can be executed without any external dependencies?
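>>>
>>> In the meantime, one thing that may help you locate the offender 
>>> (assuming your Theano version supports it): grad accepts a 
>>> disconnected_inputs argument, and passing 'warn' makes it warn about 
>>> each disconnected variable instead of raising. A sketch, where cost and 
>>> params stand in for your own cost and parameter list:
>>>
>>> # warn (with each variable's repr) instead of raising DisconnectedInputError
>>> gs = tt.grad(cost, params, disconnected_inputs='warn')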
>>>
>>> Daniel
>>>
>>>
>>>
>>> On 16 January 2016 at 15:02, Yang Xiang <[email protected]> wrote:
>>>
>>>> Hi all,
>>>>
>>>> I encountered theano.gradient.DisconnectedInputError when writing code 
>>>> for an end-to-end process. I have a series of parameters to update. To 
>>>> check which parameter caused the disconnect error, I removed them from 
>>>> the function's parameters one by one. But even after I had removed all 
>>>> the parameters (params=[]), the error was still there. What does this 
>>>> case mean?
>>>>
>>>> The error report stated: theano.gradient.DisconnectedInputError: grad 
>>>> method was asked to compute the gradient with respect to a variable 
>>>> that is not part of the computational graph of the cost, or is used 
>>>> only by a non-differentiable operator: <TensorType(float64, 4D)>
>>>>
>>>> Could anyone help?
>>>>
>>>> Thanks.
>>>>
>>>> Yang
>>>>
