connorgoggins opened a new pull request #17475: Implement remaining nn_activation ops in opperf
URL: https://github.com/apache/incubator-mxnet/pull/17475
 
 
   ## Description ##
   This PR implements the remaining operators in the nn_activation category in opperf. To achieve this, I refactored the preexisting individual `run_performance_test` calls (for the four operators that had already been implemented) into a single generalized call to `run_op_benchmarks`. I also implemented the Softmax, SoftmaxActivation, softmin, and Activation ops, which are likewise invoked via `run_op_benchmarks`.
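   The refactor described above can be sketched roughly as follows. This is an illustrative stand-in, not the actual opperf API: `bench_one_op` is a hypothetical helper, and the real `run_op_benchmarks` signature may differ.

```python
# Illustrative sketch (hypothetical names, not the real opperf API):
# instead of one run_performance_test call per operator, gather the op
# names and benchmark them in a single generalized pass.

def bench_one_op(op_name, data_shape):
    # Stand-in for benchmarking a single operator; returns dummy stats.
    return {"op": op_name, "shape": data_shape, "avg_time_ms": 0.0}

def run_op_benchmarks(op_names, data_shape=(1024, 1024)):
    # Generalized runner: one loop replaces N near-identical calls.
    return [bench_one_op(name, data_shape) for name in op_names]

# The newly covered nn_activation ops from this PR:
new_ops = ["Softmax", "SoftmaxActivation", "softmin", "Activation"]
results = run_op_benchmarks(new_ops)
```

   The point of the pattern is that adding a fifth operator later means appending one name to the list rather than duplicating another benchmark call.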
   
   ## Checklist ##
   ### Essentials ###
   Please feel free to remove inapplicable items for your PR.
   - [x] Changes are complete (i.e. I finished coding on this PR)
   - [x] All changes have test coverage
   - [x] Code is well-documented
   - [x] To the best of my knowledge, examples are either not affected by this 
change, or have been fixed to be compatible with this change
   
   ### Changes ###
   - M benchmark/opperf/nd_operations/nn_activation_operators.py
   - M benchmark/opperf/rules/default_params.py
   - M benchmark/opperf/utils/op_registry_utils.py
   
   ## Comments ##
   Tested on a c5.18xlarge instance (Ubuntu 16.04) and macOS by:
   1. Checking out the branch and calling `run_activation_operators_benchmarks`, which runs all activation ops on default data
   2. Checking out the branch and running `opperf.py` (a full run of all ops)
