kevinthesun commented on issue #7946: Defining Residual Convolution Structures in Block from gluon Module
URL: https://github.com/apache/incubator-mxnet/issues/7946#issuecomment-330987090
 
 
   1. self.ramp(self.conv(x)) vs mx.gluon.nn.Conv1D(activation='relu')(x): Yes,
the two are equivalent. The latter applies a relu activation to the output of
Conv1D (see the first sketch after this list).
   
   2. mx.gluon.nn.Sequential is for grouping multiple layers into a single
block. Usually you don't need to define each layer explicitly as a class
attribute. You can store all the layers you want to group in a list and add
them to the mx.gluon.nn.Sequential object with a for loop (see the second
sketch after this list).
   
   3. Yes. Calling forward on mx.gluon.nn.Sequential is equivalent to calling
forward on each child block in turn, following the topological order of the
computation graph (see the third sketch after this list).
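
   For point 1, a minimal sketch (the channel count, kernel size and input
   shape below are made up for illustration) showing that the fused
   activation argument matches wrapping the convolution in an explicit
   Activation block:

      from mxnet import nd
      from mxnet.gluon import nn

      x = nd.random.uniform(shape=(1, 4, 16))  # (batch, channels, width)

      # explicit conv followed by an explicit relu, as in self.ramp(self.conv(x))
      conv = nn.Conv1D(channels=8, kernel_size=3)
      ramp = nn.Activation('relu')
      conv.initialize()
      y1 = ramp(conv(x))

      # fused form: Conv1D applies relu to its own output
      conv_relu = nn.Conv1D(channels=8, kernel_size=3, activation='relu')
      conv_relu.initialize()
      y2 = conv_relu(x)

      # with shared weights the two would match exactly; here they only share
      # a shape because the parameters are initialized independently
      print(y1.shape, y2.shape)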
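
   For point 2, a sketch of building a block from a Python list of layers in
   a loop (a plain stack of relu Conv1D layers here, just as a stand-in for
   whatever layers the residual structure needs):

      from mxnet.gluon import nn

      # collect the layers in an ordinary Python list
      layers = [nn.Conv1D(channels=16, kernel_size=3, padding=1, activation='relu')
                for _ in range(3)]

      # add them to a Sequential in a for loop instead of naming each one
      # as a separate class attribute
      body = nn.Sequential()
      with body.name_scope():
          for layer in layers:
              body.add(layer)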
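
   For point 3, a sketch showing that calling a Sequential is the same as
   calling its children one after another in the order they were added
   (shapes again made up):

      from mxnet import nd
      from mxnet.gluon import nn

      net = nn.Sequential()
      net.add(nn.Conv1D(channels=8, kernel_size=3, activation='relu'),
              nn.Conv1D(channels=8, kernel_size=3, activation='relu'))
      net.initialize()

      x = nd.random.uniform(shape=(1, 4, 16))
      y_seq = net(x)

      # equivalent manual chaining over the same child blocks, in order
      y_manual = x
      for i in range(len(net)):
          y_manual = net[i](y_manual)

      print((y_seq - y_manual).abs().sum())  # 0.0: identical outputs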
 