I see, thanks Pascal! Shame map_variables doesn't do the trick in this
case. I think I'll go with the manual approach you recommended, as it seems
the most efficient and relatively straightforward in my case.
On Friday, October 14, 2016 at 5:13:53 PM UTC-7, Pascal Lamblin wrote:
On Sat, Oct 15, 2016, Pascal Lamblin wrote:
> Another option, still experimental, may be the `map_variables` function
> in scan_modules/scan_utils.
There seem to be some challenges regarding scalar constants with that
function, but I was able to do the following:
>>> theano.tensor.basic.constant.
Hi,
Yes, it is an actual problem that we never managed to fix in a
satisfactory way. The current behaviour is inconsistent.
Doing the substitutions one at a time is a workaround (I think Blocks
does that for dropout), but it can be cumbersome to have everything
cloned over and over again.
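To make the one-at-a-time workaround concrete, here is a minimal sketch of the pattern on toy expression graphs built from nested tuples. This is a hypothetical stand-in, not Theano's actual clone machinery; `substitute_one` and `substitute_all` are illustrative helpers, and the point is that each pair triggers a full re-clone of the graph (the "cloned over and over" caveat above).

```python
def substitute_one(expr, old, new):
    """Clone `expr`, replacing each occurrence of `old` with `new`.

    Within a single pass the replacement value is returned as-is and not
    re-scanned, so self-referential replacements such as
    old -> ('*', 'mask', old) terminate.
    """
    if expr == old:
        return new
    if isinstance(expr, tuple):
        return tuple(substitute_one(a, old, new) for a in expr)
    return expr

def substitute_all(expr, replacements):
    """Apply (old, new) pairs sequentially: one full clone per pair."""
    for old, new in replacements:
        expr = substitute_one(expr, old, new)
    return expr

# Toy two-layer network output; 'h1' and 'h2' stand in for layer outputs.
net = ('+', 'h1', 'h2')
dropped = substitute_all(net, [('h1', ('*', 'mask', 'h1')),
                               ('h2', ('*', 'mask', 'h2'))])
# dropped is ('+', ('*', 'mask', 'h1'), ('*', 'mask', 'h2'))
```

With real Theano graphs, the analogous step would be calling `theano.clone` once per replacement pair; the cost is that the whole graph is traversed and copied on every call.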
Another option, still experimental, may be the `map_variables` function
in scan_modules/scan_utils.
Hello,
I'm trying to use theano.clone to implement dropout in my MLP network.
Because I want to apply dropout at multiple layers, I pass the clone call
multiple key-value pairs in its `replace` parameter:
replace={layer1: mask*layer1, layer2: mask*layer2, etc.}; however, the graph
that's returned
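In toy form, the multi-pair call described above amounts to a single simultaneous substitution. The sketch below uses nested tuples as a hypothetical stand-in for Theano graphs (it is not Theano's implementation, whose behaviour in this self-referential case the thread reports as inconsistent): a matched node is swapped for its replacement, and the replacement is not re-scanned, so layer1 inside mask*layer1 is left alone.

```python
def clone_graph(expr, replace):
    """One-shot clone over toy tuple expressions with a dict of replacements.

    A node that matches a key is swapped for its replacement value; the
    replacement itself is not re-scanned, so mappings like
    'h1' -> ('*', 'mask', 'h1') do not recurse.
    """
    if expr in replace:
        return replace[expr]
    if isinstance(expr, tuple):
        return tuple(clone_graph(a, replace) for a in expr)
    return expr

# 'h1' and 'h2' stand in for layer1 and layer2; 'mask' for the dropout mask.
net = ('+', 'h1', 'h2')
dropped = clone_graph(net, {'h1': ('*', 'mask', 'h1'),
                            'h2': ('*', 'mask', 'h2')})
# dropped is ('+', ('*', 'mask', 'h1'), ('*', 'mask', 'h2'))
```

Whether a real graph-cloning call treats such a replacement dict simultaneously (as here) or keeps re-substituting into the replacement values is exactly the inconsistency Pascal describes.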