ehsanmok commented on code in PR #13074:
URL: https://github.com/apache/tvm/pull/13074#discussion_r995954213
##########
python/tvm/relay/frontend/onnx.py:
##########
@@ -944,6 +946,36 @@ def _impl_v1(cls, inputs, attr, params):
return Gelu._impl_v1([inp], attr, params)
+class LayerNormalization(OnnxOpConverter):
Review Comment:
The main reason is that `nn.layer_norm` doesn't output the intermediate expressions, such as `mean` and `inv_stdev`, that the ONNX op's outputs require; those account for about half of the computation. Besides, I first tried the existing `layer_norm` utility in onnx.py, used by `EmbedLayerNormalization` and `SkipLayerNormalization`, but it wasn't correct for `LayerNormalization` because of the wider axes setup, so I had to reimplement it based on the ONNX spec. I think there is a refactoring opportunity here; I'll make a note of it for later.
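For context, a minimal NumPy sketch of what the ONNX `LayerNormalization` spec computes (this is illustrative, not the PR's Relay implementation): the op normalizes over all axes from `axis` to the last one and returns `Mean` and `InvStdDev` alongside `Y`, which is why an implementation built only on `nn.layer_norm` can't produce all three outputs.

```python
import numpy as np

def layer_norm_reference(x, scale, bias, axis=-1, epsilon=1e-5):
    """Reference ONNX LayerNormalization in NumPy (illustrative sketch).

    Normalization runs over axes [axis, ..., rank-1], and the op
    returns the intermediates Mean and InvStdDev in addition to Y.
    """
    rank = x.ndim
    if axis < 0:
        axis += rank
    norm_axes = tuple(range(axis, rank))
    # Mean over the normalization axes (kept for broadcasting).
    mean = x.mean(axis=norm_axes, keepdims=True)
    d = x - mean
    # Biased variance, per the ONNX spec.
    var = (d * d).mean(axis=norm_axes, keepdims=True)
    inv_stdev = 1.0 / np.sqrt(var + epsilon)
    # Y = (X - Mean) * InvStdDev * Scale + Bias
    y = d * inv_stdev * scale + bias
    return y, mean, inv_stdev
```

With the default `axis=-1`, `scale` and `bias` broadcast against the last dimension, and each slice of `Y` along the normalized axes comes out zero-mean and unit-variance (up to `epsilon`).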
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]