larroy commented on a change in pull request #14613: [MXNET-978] Higher order gradient support for some unary operators
URL: https://github.com/apache/incubator-mxnet/pull/14613#discussion_r290913262
##########
File path: src/operator/tensor/elemwise_unary_op_trig.cc
##########
@@ -46,7 +46,33 @@ The storage type of ``sin`` output depends upon the input storage type:
)code" ADD_FILELINE)
.set_attr<nnvm::FGradient>("FGradient", ElemwiseGradUseIn{ "_backward_sin" });
-MXNET_OPERATOR_REGISTER_BINARY_WITH_SPARSE_CPU_DR(_backward_sin, unary_bwd<mshadow_op::sin_grad>);
+MXNET_OPERATOR_REGISTER_BINARY_WITH_SPARSE_CPU_DR(_backward_sin, unary_bwd<mshadow_op::sin_grad>)
+.set_attr<nnvm::FGradient>("FGradient",
+ [](const nnvm::NodePtr& n, const std::vector<nnvm::NodeEntry>& ograds) {
+ // ograds[0]: d^2L/dx^2
+ // inputs[0]: dL/dy
+ // inputs[1]: x (ElemwiseUseIn)
+ // f(x) = sin(x)
+ // f'(x) = cos(x)
+ // f''(x) = -sin(x)
+ auto x_grad = MakeNode("cos", n->attrs.name + "_x_grad",
+ {n->inputs[1]}, nullptr, &n);
+ auto x_grad_grad = MakeNode("negative", n->attrs.name + "_x_grad_grad",
Review comment:
As long as the nodes are themselves differentiable, the result supports additional differentiation. For some complex functions, having just the 2nd gradient should be good enough; in this case I would say it's N times differentiable. The `_x_grad_grad` is just the name of the node, as far as I understand it. Why does it concern you?
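
For reference, a minimal Python sketch of the kind of check this enables (assuming an MXNet build that includes this PR, and using the existing `mx.autograd.grad` API): differentiating `sin` twice should reproduce `-sin(x)`, which is what the registered second-order FGradient computes.

```python
import mxnet as mx
from mxnet import autograd, nd

x = nd.array([0.5, 1.0, 2.0])
x.attach_grad()
with autograd.record():
    y = nd.sin(x)
    # First derivative dy/dx = cos(x); create_graph=True keeps the graph
    # so the result can be differentiated again.
    y_grad = autograd.grad(y, x, create_graph=True, retain_graph=True)[0]
# Second derivative d2y/dx2 = -sin(x), accumulated into x.grad.
y_grad.backward()
print(x.grad)        # should be close to -sin(x)
print(-nd.sin(x))
```

Chaining a further `autograd.grad` call on top of this would only work if the nodes emitted by the FGradient above are themselves differentiable, which is the point made in the comment.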