Sorry, but I'm not able to answer this grad question. Hopefully someone
else who understands that part better can answer.
Fred
On Mon, Jul 31, 2017 at 9:43 AM wrote:
> I am trying to build an Op with a custom/optimized gradient formula. To
> override the automatic differentiation, I'm trying to
"forward the precomputed output" means that Op1 already computed the final
output, therefore Op2 just has to behaves as identity in the forward pass
The intermediate value is already an output of Op1 as shown in the example
code, sorry if that wasn't clear.
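To illustrate, here is a rough sketch of the pattern I have in mind (Theano; the op name and the toy math, y = x**2 with intermediate 2*x, are made up for illustration). Op2 takes the original input plus Op1's two outputs, passes the precomputed output through unchanged, and supplies the custom gradient formula w.r.t. x, so autodiff never walks back into Op1's graph:

import theano
import theano.tensor as T


class ForwardWithCustomGrad(theano.Op):
    # "Op2": identity on the precomputed output in the forward pass,
    # custom gradient formula in the backward pass.
    __props__ = ()

    # The single output is just a view of input 1 (the precomputed output).
    view_map = {0: [1]}

    def make_node(self, x, y_precomputed, intermediate):
        x = T.as_tensor_variable(x)
        y = T.as_tensor_variable(y_precomputed)
        intermediate = T.as_tensor_variable(intermediate)
        return theano.Apply(self, [x, y, intermediate], [y.type()])

    def perform(self, node, inputs, output_storage):
        # Forward pass: just forward the precomputed output.
        output_storage[0][0] = inputs[1]

    def connection_pattern(self, node):
        # For gradient purposes the output depends only on x, so the
        # other two inputs are declared disconnected.
        return [[True], [False], [False]]

    def grad(self, inputs, output_grads):
        x, y, intermediate = inputs
        # Custom/optimized gradient formula built from the precomputed
        # intermediate (here it plays the role of d(x**2)/dx).
        return [output_grads[0] * intermediate,
                theano.gradient.DisconnectedType()(),
                theano.gradient.DisconnectedType()()]


# Plain tensor expressions stand in for Op1's two outputs here:
x = T.dvector("x")
y1 = x ** 2      # final output, already computed by "Op1"
inter = 2 * x    # intermediate value, also an output of "Op1"
y = ForwardWithCustomGrad()(x, y1, inter)
g = T.grad(y.sum(), x)  # equals inter, i.e. the custom formula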
Nicolas
On Tuesday, August 8, 2017 at 20:56:
I don't understand what you mean by "forward the precomputed output".
What I would recommend is to make one op for the forward pass. The
intermediate values that can be reused for the gradient, make them extra
outputs of that op. Don't use them in the forward computation, but reuse
them in your grad override.
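Something like this rough sketch (the op and the toy math, y = x**2 with intermediate 2*x, are just placeholders for your real computation):

import numpy
import theano
import theano.tensor as T


class SquareWithIntermediate(theano.Op):
    # One op for the forward. It also outputs the intermediate value
    # so the grad override can reuse it instead of recomputing it.
    __props__ = ()

    def make_node(self, x):
        x = T.as_tensor_variable(x)
        # Two outputs: the forward result and the intermediate.
        return theano.Apply(self, [x], [x.type(), x.type()])

    def perform(self, node, inputs, output_storage):
        x, = inputs
        output_storage[0][0] = numpy.asarray(x * x)  # forward result
        output_storage[1][0] = numpy.asarray(2 * x)  # intermediate

    def grad(self, inputs, output_grads):
        x, = inputs
        # Re-apply the op to get the intermediate output; Theano's
        # merge optimization makes sure it is only computed once.
        _, intermediate = self(x)
        # Only the gradient of the first (real) output is used.
        return [output_grads[0] * intermediate]


x = T.dscalar("x")
y, _ = SquareWithIntermediate()(x)
g = T.grad(y, x)  # 2 * x, read off the intermediate output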
Frédéric
On Mon, Ju