ChaiBapchya commented on issue #17805: fixing batch_norm and layer_norm for large tensor nightly test
URL: https://github.com/apache/incubator-mxnet/pull/17805#issuecomment-597270999
 
 
  1. How does the addition of SHAPE_ASSIGN_CHECK to layer_norm cause this failure?
  Layer norm and batch norm were passing before, and some change caused them to
  start failing, right? What is the root cause?
   
  2. Also, it turns out that batch norm already has a shape check in
  test_large_array.py:
   
https://github.com/apache/incubator-mxnet/blob/afb8742e6e1e987833b39c487dc892b5537196a1/tests/nightly/test_large_array.py#L327
   
  Layer norm doesn't have such a check in test_large_array.py; maybe you could
  add one.
   
  Fundamentally, for both batch norm and layer norm, since the operation just
  performs normalization over the layer/batch, the input shape should equal the
  output shape.
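  To illustrate the point above, here is a minimal NumPy sketch (not MXNet's
  actual implementation; `layer_norm_np` is a hypothetical reference helper,
  assuming normalization over the last axis as in MXNet's default) showing that
  layer normalization preserves the input shape, which is what the suggested
  test_large_array.py check would assert:

  ```python
  import numpy as np

  def layer_norm_np(x, eps=1e-5):
      # Hypothetical reference: normalize over the last axis,
      # analogous to layer norm with its default axis.
      mean = x.mean(axis=-1, keepdims=True)
      var = x.var(axis=-1, keepdims=True)
      return (x - mean) / np.sqrt(var + eps)

  x = np.random.rand(4, 3, 5).astype(np.float32)
  out = layer_norm_np(x)

  # Normalization never changes the shape: output shape equals input shape.
  assert out.shape == x.shape
  ```

  A large-tensor test would do the same with a tensor whose size exceeds 2^32
  elements, checking `out.shape == x.shape` after the forward pass.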

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
