I am trying to build an Op with a custom/optimized gradient formula. To
override the automatic differentiation, I'm trying to use OpFromGraph.
The gradient formula can reuse intermediate results from the feed-forward
pass, so I have tried to split the Op in two: Op1 computes the intermediate
results along with the final output, and Op2 forwards that output while its
gradient reuses the intermediates.
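Independent of Theano's API, the idea of reusing intermediates from the forward pass inside a hand-written gradient can be sketched in plain NumPy (names here are illustrative, not Theano's):

```python
import numpy as np

# The forward pass returns the result plus the intermediate values it
# produced; the hand-written gradient consumes those intermediates
# instead of recomputing them.

def forward(x):
    s = 1.0 / (1.0 + np.exp(-x))   # sigmoid; also the only value the
    return s, (s,)                 # gradient needs, so cache it

def backward(g, intermediates):
    (s,) = intermediates
    return g * s * (1.0 - s)       # d(sigmoid)/dx = s * (1 - s), no re-eval

x = np.array([0.0, 1.0, -2.0])
y, cache = forward(x)
dx = backward(np.ones_like(x), cache)
```

With OpFromGraph, the analogous move is to make the intermediates explicit outputs of the first graph so the gradient graph can refer to them.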
Hello,
I still haven't managed to trace the error down. Below is a shorter example
that triggers it. It seems Theano tries to create a variable for the output
gradient of a node through which I do not backpropagate. At some point it
hits a DisconnectedType instance and raises an exception.
Hi,
I am trying to split a computation across two ops in order to avoid spurious
computations when computing the gradient.
My current attempt uses a first op which returns the desired result for the
forward pass plus extra intermediate results. The second op just forwards
the desired result, but its gradient is expressed in terms of the extra
intermediates.
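A minimal NumPy sketch of this two-op split (again outside Theano, with illustrative names): op1 does the real work and also emits the intermediate the gradient needs; op2 is the identity on the forward value, but its backward consumes the cached intermediate.

```python
import numpy as np

def op1(x):
    n = np.sqrt(np.sum(x * x))     # desired result: the L2 norm
    return n, x / n                # extra intermediate: the unit vector

def op2_forward(n, unit):
    return n                       # just forward the precomputed output

def op2_backward(g, unit):
    return g * unit                # d||x||/dx = x / ||x||, read from the cache

x = np.array([3.0, 4.0])
n, unit = op1(x)
y = op2_forward(n, unit)
dx = op2_backward(1.0, unit)
```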
Hi,
Thank you for the suggestion; inlining actually makes more sense for what I
am trying to do.
However, a casting issue arises when trying to compute the derivative with
respect to the continuous input. If I understood correctly, DisconnectedType
should be returned as the gradient for integral inputs.
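The convention can be sketched outside Theano like this: integral inputs carry no gradient, so the grad routine returns a "disconnected" marker for them and a real array (kept in the input's dtype, to avoid accidental upcasting) only for floating-point inputs. The `Disconnected` class below is a stand-in for Theano's DisconnectedType, not its actual API.

```python
import numpy as np

class Disconnected:
    """Placeholder meaning 'no gradient flows to this input'."""

def grads_for(inputs, g):
    out = []
    for inp in inputs:
        if np.issubdtype(inp.dtype, np.integer):
            out.append(Disconnected())                  # e.g. an index input
        else:
            out.append(np.asarray(g, dtype=inp.dtype))  # preserve float32 etc.
    return out

idx = np.array([0, 2, 1], dtype=np.int64)
x = np.array([1.0, 2.0, 3.0], dtype=np.float32)
gi, gx = grads_for([idx, x], np.ones(3))
```

Keeping the gradient in the input's dtype is exactly where a float64 default can trip the `warn_float64` setting shown in the config below.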
"forward the precomputed output" means that Op1 already computed the final
output, therefore Op2 just has to behaves as identity in the forward pass
The intermediate value is already an output of Op1 as shown in the example
code, sorry if that wasn't clear.
Nicolas
On Tuesday, August 8, 2017
I'm using numpy 1.14.0 from the popular conda-forge repository. By the way,
I was wrong about the GPU/CPU distinction: the error is triggered in either
case.
You received this message because you are subscribed to the Google Groups
"theano-users" group.
Sorry for the delay, I just re-ran it in a clean conda environment; here
are my system specs:
OS: archlinux
nvidia: 390.25
cuda: 9.1.85
numpy: 1.14.0
pygpu: 0.7.5
theano: git master
.theanorc:
[global]
device = cuda
floatX = float32
warn_float64 = warn
on_opt_error = raise
[nvcc]
fastmath =
Hi everyone,
While using an OpFromGraph involving some operations on binary values, I get
an optimization error:
theano.gof.opt: ERROR: Optimization failure due to: local_add_canonizer
> theano.gof.opt: ERROR: node:
> Elemwise{add,no_inplace}(InplaceDimShuffle{0,1,x}.0,
>