larroy commented on a change in pull request #14613: [MXNET-978] Higher order
gradient support for some unary operators
URL: https://github.com/apache/incubator-mxnet/pull/14613#discussion_r285800871
##########
File path: src/imperative/imperative.cc
##########
@@ -347,8 +347,9 @@ std::vector<NDArray*> Imperative::Backward(
x_reqs.push_back(info.grad_req);
info.fresh_out_grad = true;
}
- CHECK_GT(xs.size(), 0)
- << "There are no inputs in computation graph that require gradients.";
+ if (xs.empty()) {
Review comment:
I think we should dive deeper into this one. Does it produce the warning
(or, previously, the hard failure) for any of the test cases?
In the original code, I think the intention is to detect whether there are any
input nodes with a gradient attached. I understand your explanation, but what I
don't see is where we would store the gradient for such constants. Is it because
the grad_req of a constant is kNullOp? A constant is just another node, right?
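To make the question concrete, here is a minimal sketch (not MXNet's actual
implementation; `GradReq`, `Node`, and `collect_grad_inputs` are hypothetical
names) of the logic being discussed: inputs whose `grad_req` is `kNullOp` are
treated as constants with no gradient buffer, and when no input requires a
gradient the original `CHECK_GT` would fail hard, while the patch softens this
to a warning:

```python
from enum import Enum
import warnings


class GradReq(Enum):
    # Assumed simplification of MXNet's OpReqType for illustration only.
    kNullOp = 0   # node is a constant: no gradient buffer is attached
    kWriteTo = 1  # gradient is written to the attached buffer


class Node:
    """A toy computation-graph node carrying only a name and a grad_req."""

    def __init__(self, name, grad_req=GradReq.kNullOp):
        self.name = name
        self.grad_req = grad_req


def collect_grad_inputs(inputs):
    """Return the inputs that have a gradient attached (grad_req != kNullOp).

    Constants are ordinary nodes whose grad_req is kNullOp, so they are
    skipped here and no gradient storage exists for them. If nothing
    requires a gradient, emit the warning instead of failing hard.
    """
    xs = [n for n in inputs if n.grad_req != GradReq.kNullOp]
    if not xs:
        warnings.warn(
            "There are no inputs in computation graph that require gradients.")
    return xs
```

Under this reading, a graph containing only constants would hit the empty-`xs`
branch, which is where the warning-versus-failure behavior matters.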
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services