larroy commented on a change in pull request #15120: [bug] fix higher grad log
URL: https://github.com/apache/incubator-mxnet/pull/15120#discussion_r290526536
##########
File path: src/operator/tensor/elemwise_unary_op_basic.cc
##########
@@ -1074,16 +1074,19 @@
 MXNET_OPERATOR_REGISTER_BINARY_WITH_SPARSE_CPU_DR(_backward_log,
   [](const nnvm::NodePtr& n, const std::vector<nnvm::NodeEntry>& ograds) {
     // For f(x) -> f = log
     // f''(x) = -1 * (f'(x) * f'(x))
-    auto gx = nnvm::NodeEntry{n};
+    auto gx_mul_head_grads = nnvm::NodeEntry{n};  // f'(x) * head_grads
+    auto head_grads = nnvm::NodeEntry{n->inputs[0]};
+    auto g_lx = MakeNode("reciprocal", n->attrs.name + "_backward_log_grad",
+                         {n->inputs[1]}, nullptr, &n);
     auto ggx_mid = MakeNode("elemwise_mul", n->attrs.name +
                             "_backward_mid_grad_grad",
-                            {gx, gx}, nullptr, &n);
+                            {gx_mul_head_grads, nnvm::NodeEntry{g_lx}},
+                            nullptr, &n);
     auto ggx = MakeNode("negative", n->attrs.name + "_backward_grad_grad",
                         {nnvm::NodeEntry{ggx_mid}}, nullptr, &n);
     std::vector<nnvm::NodeEntry> ret;
     ret.emplace_back(MakeNode("elemwise_mul", n->attrs.name +
                               "_backward_grad_grad",
-                              {ograds[0], gx}, nullptr, &n));
+                              {ograds[0], nnvm::NodeEntry{g_lx}}, nullptr, &n));
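
For reference, the identity this graph encodes: with f(x) = log(x), f'(x) = 1/x, so f''(x) = -(f'(x))^2 = -1/x^2. The `reciprocal` node (`g_lx`) recomputes f'(x) = 1/x from `n->inputs[1]` (presumably x), `ggx_mid` multiplies it with `gx_mul_head_grads` = f'(x) * head_grads, and `negative` supplies the leading -1; the first returned entry multiplies the incoming ograds by f'(x).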
Review comment:
Hi. What do you mean by head_grads.grad? NodeEntry doesn't have a grad
field, so could you clarify? Are you referring to the Python code below? The
gradient is always zero right after attach_grad() is called; the value is only
updated after running backward on an output, or by using autograd.grad.
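
On the attach_grad() point, a minimal sketch of that behavior (not part of this PR; it assumes MXNet's Python autograd API, and the input values are made up):

```python
from mxnet import autograd, nd

x = nd.array([1.0, 2.0, 4.0])  # hypothetical inputs
x.attach_grad()
print(x.grad)  # all zeros: attach_grad() only allocates the gradient buffer

with autograd.record():
    y = nd.log(x)
    # First derivative dy/dx = 1/x, kept in the recorded graph so it can be
    # differentiated a second time (create_graph=True).
    dydx = autograd.grad(y, x, create_graph=True, retain_graph=True)[0]
dydx.backward()
print(x.grad)  # now holds the second derivative: -1/x^2
```

Until backward() or autograd.grad runs, x.grad keeps its zero-initialized value.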