zhuzilin commented on a change in pull request #8056:
URL: https://github.com/apache/tvm/pull/8056#discussion_r634912195
##########
File path: python/tvm/relay/op/nn/nn.py
##########
@@ -2973,6 +2973,40 @@ def cross_entropy_with_logits(predictions, targets):
return _make.cross_entropy_with_logits(predictions, targets)
+def nll_loss(predictions, targets, weights, reduction="mean", ignore_index=-100):
+ """Negative log likelihood loss.
+
+ output{n, i_1, i_2, ..., i_k} = -p * w
+ where t = targets{n, i_1, i_2, ..., i_k}
+ p = predictions{n, t, i_1, i_2, ..., i_k}
+ w = weights{t} if t != ignore_index else 0
+
+ result = reduction(output)
+
+ Parameters
+ ----------
+ predictions : tvm.relay.Expr
+ The predictions.
+
+ targets : tvm.relay.Expr
+ The target value of each prediction.
+
+ weights : tvm.relay.Expr
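
[Editorial aside, not part of the PR: to make the docstring formula above
concrete, here is a minimal NumPy reference for the simple case of
`(N, C)` log-probability predictions and `(N,)` integer targets.
`nll_loss_ref` is a hypothetical name, and `weights`/`ignore_index` are
omitted for brevity, i.e. all-ones weights and no ignored targets.]

```python
import numpy as np

def nll_loss_ref(predictions, targets, reduction="mean"):
    # predictions: (N, C) log-probabilities; targets: (N,) class indices.
    n = np.arange(predictions.shape[0])
    output = -predictions[n, targets]  # output{n} = -predictions{n, t}
    if reduction == "mean":
        return output.mean()
    if reduction == "sum":
        return output.sum()
    return output  # reduction == "none"

preds = np.log(np.array([[0.7, 0.2, 0.1],
                         [0.1, 0.8, 0.1]]))
tgts = np.array([0, 1])
print(nll_loss_ref(preds, tgts))  # ~0.29, the mean of -log(0.7) and -log(0.8)
```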
Review comment:
@altanh We can make `weights` an optional parameter. I wonder if there are
any examples of a Relay op with an optional tensor parameter that I can learn
from. Also, how should we deal with the gradient of an optional parameter?
BTW, is there a better way to mark a parameter as "no need for gradient"
instead of returning a `ones_like` grad?
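
[Editorial sketch of both ideas, not code from this PR. First, one way the
optional parameter could look at the Python level, assuming the C++
`_make.nll_loss` still requires a concrete tensor; the `num_classes`
argument is a hypothetical helper for this sketch only:]

```python
from tvm import relay
from . import _make

def nll_loss(predictions, targets, weights=None, num_classes=None,
             reduction="mean", ignore_index=-100):
    # If no weights are given, substitute an all-ones weight vector so the
    # C++ side always receives a concrete tensor.
    if weights is None:
        weights = relay.ones(shape=(num_classes,), dtype="float32")
    return _make.nll_loss(predictions, targets, weights, reduction, ignore_index)
```

[On the gradient side, Relay's `register_gradient` (as used in
`_tensor_grad.py`) expects one output gradient per input, so
non-differentiable inputs such as integral `targets` are usually given a
placeholder tensor rather than being skipped. The gradient with respect to
`predictions` is stubbed out with zeros here; only the shape of the returned
list is the point:]

```python
from tvm.relay import zeros_like
from tvm.relay.op import register_gradient

@register_gradient("nn.nll_loss")
def nll_loss_grad(orig, grad):
    predictions, targets, weights = orig.args
    return [
        zeros_like(predictions),  # placeholder; the real gradient is elided
        zeros_like(targets),      # integer targets: no meaningful gradient
        zeros_like(weights),      # treated as non-differentiable here
    ]
```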
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org