This can happen in some corner cases when the output corresponds to a shared
variable. Add this line before compiling the function to force the outputs
onto the CPU:

grads = [g.transfer('cpu') for g in grads]
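
For reference, here is a minimal, self-contained sketch of the fix (the toy
shared variable w and the cost are just illustrations, not taken from your
code; it assumes the new gpuarray backend):

import numpy as np
import theano
import theano.tensor as T

# A toy shared variable and a scalar cost that depends on it.
w = theano.shared(np.random.randn(3).astype(theano.config.floatX), name='w')
cost = (w ** 2).sum()

# Gradients w.r.t. the shared variable; on the gpuarray backend
# these graph outputs can live on the device.
grads = T.grad(cost, [w])

# Insert explicit device-to-host transfers so the compiled function
# returns numpy ndarrays instead of pygpu GpuArrays.
grads = [g.transfer('cpu') for g in grads]

f = theano.function(inputs=[], outputs=grads)
print(type(f()[0]))  # -> <class 'numpy.ndarray'>

If the function is already compiled, I believe you can also convert the
returned values on the host side with np.asarray(g).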

Fred

On Mon, Mar 27, 2017 at 4:52 AM duy kien nguyen <[email protected]>
wrote:

>
> I tried to build a function that returns the gradients of shared variables
> in Theano. However, the function returned outputs that were not numpy
> arrays but pygpu.gpuarray.GpuArray objects. How can I deal with this problem?
>
> Here is my function:
>
> --------------------------------------------------------------------------------------------------------------------------
> cost  = classifier.negative_log_likelihood(answer, answer_mask)
> grads = T.grad(cost, itemlist(classifier.tparams))
>
> optim = optimizer()
> updates = optim(itemlist(classifier.tparams), grads)
>
> # out_grads = [grad for grad in grads]
>
> grads_check = theano.function(
>     inputs=[],
>     outputs=grads,
>     givens={
>         img: shared_img, ques: shared_ques_casted, ans: shared_ans_casted[:-1],
>         m_ques: shared_m_ques, m_ans: shared_m_ans[:-1],
>         answer: shared_ans_casted[1:], answer_mask: shared_m_ans[1:]
>     }
> )
>
> ----------------------------------------------------------------------------------------------------------------------------
>