MoisesHer opened a new pull request #18092: Add gelu fuse ops (#18082)
URL: https://github.com/apache/incubator-mxnet/pull/18092

* Add LeakyReLU:gelu (fwd and bwd) to fused ops
* Add test for LeakyReLU:gelu
* cpplint
* fix lint
* fix bug: SQRT_2 using constant memory
* add comments

## Description ##

This PR adds LeakyReLU:gelu to the elementwise fused operators.

## Checklist ##

### Essentials ###

Please feel free to remove inapplicable items for your PR.

- [x] Changes are complete (i.e. I finished coding on this PR)
- [x] All changes have test coverage:
  - Unit tests are added for small changes to verify correctness (e.g. adding a new operator)
- [x] Code is well-documented
- [x] To the best of my knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change

### Changes ###

- [x] In the pointwise fusion pass, mark both LeakyReLU and _backward_LeakyReLU as fusion-compatible when the activation type is in the supported list (only "gelu" for now)
- [x] In the fused-ops code generator, add functions for LeakyReLU:gelu and _backward_LeakyReLU:gelu, equivalent to their mshadow_op.h versions (see the sketch in Comments below)
- [x] Add a test to tests/python/gpu/test_fusion.py

## Comments ##
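For reference, here is a minimal standalone sketch of what the generated LeakyReLU:gelu device functions could look like. The function names (`gelu`, `backward_gelu`), the kernel, and the host driver are illustrative assumptions, not the PR's actual generated code; only the math is taken from the mshadow_op.h definition, gelu(x) = 0.5 * x * (1 + erf(x / sqrt(2))).

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Plain compile-time constant. The commit list mentions a fix around
// SQRT_2 and constant memory; this sketch sidesteps the issue entirely.
constexpr float SQRT_2 = 1.4142135623730951f;

// Forward: gelu(x) = x * Phi(x), Phi being the standard normal CDF.
__device__ inline float gelu(float x) {
  return 0.5f * x * (1.0f + erff(x / SQRT_2));
}

// Backward: d/dx gelu(x) = Phi(x) + x * phi(x),
// with phi(x) = exp(-x^2 / 2) / sqrt(2 * pi).
__device__ inline float backward_gelu(float grad, float x) {
  const float cdf = 0.5f * (1.0f + erff(x / SQRT_2));
  const float pdf = expf(-0.5f * x * x) * 0.3989422804014327f;  // 1/sqrt(2*pi)
  return grad * (cdf + x * pdf);
}

// Toy pointwise kernel exercising both functions (hypothetical; the real
// fused op stitches these into a generated kernel at runtime).
__global__ void gelu_fwd_bwd(const float* in, const float* ograd,
                             float* out, float* igrad, int n) {
  const int i = blockIdx.x * blockDim.x + threadIdx.x;
  if (i < n) {
    out[i] = gelu(in[i]);
    igrad[i] = backward_gelu(ograd[i], in[i]);
  }
}

int main() {
  const int n = 4;
  float h_in[n] = {-2.0f, -0.5f, 0.5f, 2.0f};
  float h_ograd[n] = {1.0f, 1.0f, 1.0f, 1.0f};
  float h_out[n], h_igrad[n];
  float *d_in, *d_ograd, *d_out, *d_igrad;
  cudaMalloc(&d_in, n * sizeof(float));
  cudaMalloc(&d_ograd, n * sizeof(float));
  cudaMalloc(&d_out, n * sizeof(float));
  cudaMalloc(&d_igrad, n * sizeof(float));
  cudaMemcpy(d_in, h_in, n * sizeof(float), cudaMemcpyHostToDevice);
  cudaMemcpy(d_ograd, h_ograd, n * sizeof(float), cudaMemcpyHostToDevice);
  gelu_fwd_bwd<<<1, 32>>>(d_in, d_ograd, d_out, d_igrad, n);
  cudaMemcpy(h_out, d_out, n * sizeof(float), cudaMemcpyDeviceToHost);
  cudaMemcpy(h_igrad, d_igrad, n * sizeof(float), cudaMemcpyDeviceToHost);
  for (int i = 0; i < n; ++i)
    printf("x=% .2f  gelu=% .6f  dgelu=% .6f\n", h_in[i], h_out[i], h_igrad[i]);
  cudaFree(d_in); cudaFree(d_ograd); cudaFree(d_out); cudaFree(d_igrad);
  return 0;
}
```

Compiled with `nvcc`, this prints forward and backward values for a few sample inputs; the test added in tests/python/gpu/test_fusion.py similarly checks the fused results against the non-fused operators.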
