[theano-users] Split Op (OpFromGraph) to save intermediate results for grad

2017-07-31 Thread nicolas . granger . m
I am trying to build an Op with a custom/optimized gradient formula. To override the automatic differentiation, I'm trying to use OpFromGraph. The gradient formula can reuse intermediate results from the feed-forward pass, so I have tried to split the Op in two: Op1 computes the intermediate
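For readers landing on this thread: a minimal sketch of the grad_overrides mechanism it relies on, assuming a Theano version where OpFromGraph supports grad_overrides (present in Theano 1.0). The formula and names are illustrative, not the poster's code:

    import theano
    import theano.tensor as T
    from theano.compile.builders import OpFromGraph

    x = T.vector('x')
    y = T.exp(x)  # forward formula

    def custom_grad(inputs, output_grads):
        # Hand-written gradient: d/dx exp(x) = exp(x).
        x_, = inputs
        g_y, = output_grads
        return [g_y * T.exp(x_)]

    # OpFromGraph replaces the automatic gradient with custom_grad.
    op = OpFromGraph([x], [y], grad_overrides=custom_grad)

    out = op(x)
    gx = theano.grad(out.sum(), x)  # uses custom_grad, not autodiff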

[theano-users] Re: Unused input error with chained OpFromGraph ops

2017-07-17 Thread nicolas . granger . m
Hello, I still haven't managed to trace the error down. Below is a shorter example that triggers it. It seems Theano tries to create a variable for the output gradient of a node through which I do not backpropagate. At some point it hits a DisconnectedType instance and raises an
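For context, a small illustration (my own, not the poster's example) of how theano.grad treats inputs that are not on the backpropagation path:

    import theano
    import theano.tensor as T

    a = T.vector('a')
    b = T.vector('b')
    cost = a.sum()  # `b` does not influence the cost

    # Default behaviour: theano.grad raises DisconnectedInputError for `b`.
    # g = theano.grad(cost, b)

    # Explicitly tolerating the disconnection returns zeros instead.
    g = theano.grad(cost, b, disconnected_inputs='ignore')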

[theano-users] Unused input error with chained OpFromGraph ops

2017-07-11 Thread nicolas . granger . m
Hi, I am trying to split a computation over two ops in order to avoid spurious computations when computing the gradient. My current attempt uses a first op which returns the desired result for the forward part along with extra intermediate results. The second op just forwards the desired result, but
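Roughly the split described here, as a hedged sketch; softmax is chosen only as an example of a computation with a reusable intermediate:

    import theano.tensor as T
    from theano.compile.builders import OpFromGraph

    x = T.matrix('x')

    # Op1: the real computation, also emitting an intermediate result.
    h = T.exp(x)                           # intermediate worth caching
    y = h / h.sum(axis=1, keepdims=True)   # forward result (softmax here)
    op1 = OpFromGraph([x], [y, h])

    # Op2: merely forwards the result; the intermediate rides along so a
    # gradient override could reuse it instead of recomputing it.
    y_in = T.matrix('y_in')
    h_in = T.matrix('h_in')
    op2 = OpFromGraph([y_in, h_in], [y_in])

    out, inter = op1(x)
    final = op2(out, inter)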

[theano-users] Re: Unused input error with chained OpFromGraph ops

2017-07-13 Thread nicolas . granger . m
Hi, Thank you for the suggestion; inlining actually makes more sense for what I am trying to do. However, a casting issue arises when trying to compute the derivative with respect to the continuous input. If I understood correctly, DisconnectedInput should be returned as the gradient for integral
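A sketch combining the two points in this reply: an inlined OpFromGraph whose gradient override declares the integral input disconnected. This is an illustrative reconstruction, not the original code:

    import theano.tensor as T
    from theano.compile.builders import OpFromGraph
    from theano.gradient import DisconnectedType

    v = T.vector('v')   # continuous input
    i = T.ivector('i')  # integral input: no meaningful gradient
    out = v[i]          # some indexing-style computation

    def grad_override(inputs, output_grads):
        v_, i_ = inputs
        g, = output_grads
        gv = T.inc_subtensor(T.zeros_like(v_)[i_], g)  # grad wrt `v`
        return [gv, DisconnectedType()()]              # `i` is disconnected

    op = OpFromGraph([v, i], [out], inline=True, grad_overrides=grad_override)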

Re: [theano-users] Split Op (OpFromGraph) to save intermediate results for grad

2017-08-09 Thread nicolas . granger . m
"forward the precomputed output" means that Op1 already computed the final output, therefore Op2 just has to behaves as identity in the forward pass The intermediate value is already an output of Op1 as shown in the example code, sorry if that wasn't clear. Nicolas Le mardi 8 août 2017

Re: [theano-users] Numpy error during optimization phase

2018-02-04 Thread nicolas . granger . m
I'm using numpy 1.14.0 from the popular conda-forge repository. BTW, I was wrong about the GPU/CPU distinction: the error is triggered in either case.

Re: [theano-users] Numpy error during optimization phase

2018-02-13 Thread nicolas . granger . m
Sorry for the delay, I just re-ran it in a clean conda environment. Here are my system specs:

OS: archlinux
nvidia: 390.25
cuda: 9.1.85
numpy: 1.14.0
pygpu: 0.7.5
theano: git master

.theanorc:

[global]
device = cuda
floatX = float32
warn_float64 = warn
on_opt_error = raise

[nvcc]
fastmath =

[theano-users] Numpy error during optimization phase

2018-01-24 Thread nicolas . granger . m
Hi everyone, While using an OpFromGraph involving some operations with binary values, there is an optimization error:

theano.gof.opt: ERROR: Optimization failure due to: local_add_canonizer
theano.gof.opt: ERROR: node: Elemwise{add,no_inplace}(InplaceDimShuffle{0,1,x}.0,
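A guess at the shape of graph that can trip this rewrite: an OpFromGraph producing binary (comparison) values whose dimshuffled output feeds an addition, matching the Elemwise{add} node in the log. This is illustrative, not the original reproduction:

    import theano
    import theano.tensor as T
    from theano.compile.builders import OpFromGraph

    x = T.matrix('x')
    mask = T.gt(x, 0)               # binary-valued output (int8)
    op = OpFromGraph([x], [mask])

    m = op(x)
    # Addition on a dimshuffled output, as in the failing node above.
    cost = (m.dimshuffle(0, 1, 'x') + 1.0).sum()
    f = theano.function([x], cost)  # optimization runs at compile time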