XJDKC commented on a change in pull request #697:
URL: https://github.com/apache/singa/pull/697#discussion_r433587855
##########
File path: examples/mlp/module.py
##########
@@ -56,10 +56,9 @@ def forward(self, inputs):
         x = autograd.add_bias(x, self.b1)
         return x
-    def loss(self, out, ty):
-        return autograd.softmax_cross_entropy(out, ty)
-
-    def optim(self, loss, dist_option, spars):
+    def train_one_batch(self, x, y, dist_option, spars):
+        out = self.forward(x)
+        loss = autograd.softmax_cross_entropy(out, y)
Review comment:
Are there other operators besides ReLU that need to be encapsulated by
Layer?
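For illustration, wrapping a stateless operator could look roughly like the
sketch below. The `Layer` base class, its `get_params`/`forward` methods and
the `ReLU` wrapper here are assumptions made for the example, not the exact
classes introduced in this PR:

```python
# A minimal sketch, assuming a Layer base class with get_params() and a
# forward() hook; these names are illustrative, not SINGA's actual API.
from singa import autograd


class Layer:
    """Hypothetical base class: layers own their parameters (if any) and
    are called like functions."""

    def get_params(self):
        return []  # stateless layers expose no parameters

    def __call__(self, *args):
        return self.forward(*args)


class ReLU(Layer):
    """Wraps the functional autograd.relu so it composes the same way as
    parameterised layers (e.g. Linear) inside a model definition."""

    def forward(self, x):
        return autograd.relu(x)
```

Usage would then mirror the parameterised layers: create `self.relu = ReLU()`
in the constructor and call `x = self.relu(x)` inside `forward`.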