Hi,

It seems we don't infer the broadcast pattern the same way in both cases. Just
call T.patternbroadcast on the output of uniform with the broadcastable
pattern of the parameter, if you are sure the shapes are the same.

Something like:

    noise = ...
    noise = T.patternbroadcast(noise, params[0].broadcastable)
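Applied to the snippet from your message, a minimal sketch could look like
the following (assuming, as in your code, that loss, params, get_rng and
RandomStreams are already defined/imported; the broadcastable pattern of
params[0] is whatever the shared variable was created with, e.g.
(False, True) for a col):

    import theano.tensor as T

    # grads, rng and noise exactly as in your snippet
    grads = T.grad(loss, params)
    rng = RandomStreams(get_rng().randint(1, 2147462579))
    noise = rng.uniform(grads[0].shape, dtype=grads[0].dtype)

    # Force the noise to carry the same broadcastable pattern as the
    # parameter, so that grads[0] + noise keeps that pattern and matches
    # the shared variable it will update.
    noise = T.patternbroadcast(noise, params[0].broadcastable)
    grads[0] = grads[0] + noise

You can compare noise.broadcastable with params[0].broadcastable to check
that the patterns now agree. Note that patternbroadcast is only safe if the
shapes really match, hence the caveat above.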

On 24 Sep 2016 at 18:47, "Kiuhnm Mnhuik" <[email protected]> wrote:

> Hello,
>
> I don't understand why
>
>     grads = T.grad(loss, params)
>     rng = RandomStreams(get_rng().randint(1, 2147462579))
>     noise = rng.normal(grads[0].shape, dtype=grads[0].dtype)
>     grads[0] = grads[0] + noise
>
> works, but
>
>     grads = T.grad(loss, params)
>     rng = RandomStreams(get_rng().randint(1, 2147462579))
>     noise = rng.uniform(grads[0].shape, dtype=grads[0].dtype)    # only difference: normal -> uniform
>     grads[0] = grads[0] + noise
>
> doesn't. The error is
>
> TypeError: ('An update must have the same type as the original shared
> variable (shared_var=W, shared_var.type=TensorType(float32, col),
> update_val=Elemwise{sub,no_inplace}.0, update_val.type=TensorType(float32,
> matrix)).', 'If the difference is related to the broadcast pattern, you can
> call the tensor.unbroadcast(var, axis_to_unbroadcast[, ...]) function to
> remove broadcastable dimensions.')
>
