sxjscience commented on issue #17654: [LayerNorm] Missing the mismatch cues of in_channels
URL: https://github.com/apache/incubator-mxnet/issues/17654#issuecomment-589741619
 
 
   @ZheyuYe The C++ side implementation of the shape inference logic is here:
https://github.com/apache/incubator-mxnet/blob/9dcf71d8fe33f77ed316a95fcffaf1f7f883ff70/src/operator/nn/layer_norm.cc#L39-L66
   
   The Python side is here:
https://github.com/apache/incubator-mxnet/blob/9dcf71d8fe33f77ed316a95fcffaf1f7f883ff70/python/mxnet/gluon/nn/basic_layers.py#L609-L614
   
   The problem should be in how the shapes of gamma and beta are checked here:
   
https://github.com/apache/incubator-mxnet/blob/9dcf71d8fe33f77ed316a95fcffaf1f7f883ff70/src/operator/nn/layer_norm.cc#L56-L57
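   
   My reading of those two lines is that they unconditionally overwrite the
   gamma/beta entries with the shape inferred from the data, roughly like the
   paraphrased sketch below (names follow my reading of the linked file, not
   copied verbatim), so a conflicting user-supplied `in_channels` is silently
   replaced instead of being reported:
   
   ```cpp
   // Paraphrase of the current behaviour around layer_norm.cc L56-L57:
   // the inferred channel count is written straight into the gamma/beta
   // slots, so a mismatching user-provided shape never triggers an error.
   in_shape->at(layernorm::kGamma) = mxnet::TShape(mshadow::Shape1(channelCount));
   in_shape->at(layernorm::kBeta) = mxnet::TShape(mshadow::Shape1(channelCount));
   ```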
   
   Would you like to try investigating the issue? You can append
   `std::cout << in_shape->at(layernorm::kGamma)`, which should not be empty when
   `in_channels` is given.
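   
   For example, a minimal debugging sketch (the names follow the linked
   `LayerNormShape` function; `<iostream>` may need to be included):
   
   ```cpp
   // Temporary print inside LayerNormShape in layer_norm.cc: dump the shapes
   // that reach the C++ shape inference to see what the Python side passed down.
   std::cout << "gamma shape = " << in_shape->at(layernorm::kGamma) << std::endl;
   std::cout << "beta shape  = " << in_shape->at(layernorm::kBeta) << std::endl;
   ```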
   
   I think one way to solve the problem is to use the same `SHAPE_ASSIGN_CHECK`
   as here:
   
https://github.com/apache/incubator-mxnet/blob/9dcf71d8fe33f77ed316a95fcffaf1f7f883ff70/src/operator/numpy/np_where_op.cc#L42
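   
   Concretely, something along these lines could work (just a sketch; it assumes
   the channel count computed earlier in `LayerNormShape` is available, here
   called `channelCount`):
   
   ```cpp
   // Proposed change in LayerNormShape (layer_norm.cc): SHAPE_ASSIGN_CHECK
   // (from operator_common.h) merges the expected shape into the gamma/beta
   // slots and throws an infer-shape error when a user-supplied shape
   // disagrees, instead of silently overwriting it.
   SHAPE_ASSIGN_CHECK(*in_shape, layernorm::kGamma, mxnet::TShape(mshadow::Shape1(channelCount)));
   SHAPE_ASSIGN_CHECK(*in_shape, layernorm::kBeta, mxnet::TShape(mshadow::Shape1(channelCount)));
   ```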
   
