We haven't implemented the grad for the diff op when the input isn't a vector.

Do you want to implement it for your case?
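In the meantime, a first-order diff along an axis can be expressed with plain slicing and subtraction, ops whose gradients Theano already implements, so `T.grad` works on the result. The sketch below checks the equivalence in NumPy; the same slice expression (`a[1:] - a[:-1]`) applies unchanged to a Theano shared variable, and this is a workaround, not the official fix:

```python
import numpy as np

# Same array as in the original question.
a = np.arange(3 * 8).reshape(8, 3).astype('float32')

# diff along axis 0 rewritten as a slice difference; built only from
# ops that Theano knows how to differentiate.
d = a[1:] - a[:-1]

# It matches numpy.diff along the same axis.
assert np.allclose(d, np.diff(a, axis=0))
```

With a Theano shared variable, `b = a[1:] - a[:-1]` followed by `T.grad(b.norm(2), a)` should go through, since no `Diff` op appears in the graph.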

Fred

On Thu, Jul 21, 2016 at 11:54 AM, Aditya Gudimella <
[email protected]> wrote:

> a = shared(np.arange(3*8).reshape(8,3).astype('float32'))
> b = diff(a, axis=0)
> T.grad(b.norm(2), a)
>
>
> This gives me an error. It's unable to find the gradient of the diff
> operator when it has been used on a 2-dimensional tensor. Is this how it
> was meant to be, or is this a bug?
>
> --
>
> ---
> You received this message because you are subscribed to the Google Groups
> "theano-users" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> For more options, visit https://groups.google.com/d/optout.
>
