grygielski commented on issue #19218:
URL: https://github.com/apache/incubator-mxnet/issues/19218#issuecomment-700650490
Hi again @buaalsy2003. The problem here is inside the PReLU activation functions.
While training your model, some `gamma` parameters reached very small,
denormalized floating-point values, which slow down computation on the CPU.
I need some time to prepare a proper solution, but if you need a quick fix to run
your trained model on the CPU, just iterate through the parameter dict and set the
near-zero `prelu` parameters to 0. Choose the threshold carefully so you don't
lose accuracy.
Example code:
```Python
import mxnet as mx

def fix_denorm_params():
    global arg_params
    # Zero out denormalized PReLU gamma values so they don't slow down CPU inference.
    for key in arg_params.keys():
        if 'prelu' in key:
            gammas = arg_params[key]
            for index, gamma in enumerate(gammas):
                if abs(gamma) < 1e-20:
                    arg_params[key][index] = 0.

...  # executor creation etc. elided

def run_inference():
    out = executor.forward(is_train=False, data=sample)
    return out

sym, arg_params, aux_params = mx.model.load_checkpoint('model-reduce', 23)
fix_denorm_params()
sample = mx.ndarray.zeros((1, 3, 112, 112))
```
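To pick the threshold, it can help to first count how many gammas would be affected by a candidate cutoff before zeroing anything. Below is a minimal sketch assuming `arg_params` maps parameter names to `mx.nd.NDArray` values; `count_small_gammas` and the `1e-20` default are illustrative helpers, not part of MXNet.
```Python
import numpy as np

def count_small_gammas(arg_params, threshold=1e-20):
    # Hypothetical helper: report how many PReLU gammas fall below a
    # candidate threshold, so a cutoff can be chosen without hurting accuracy.
    for key, value in arg_params.items():
        if 'prelu' in key:
            gammas = value.asnumpy()
            n_small = int(np.sum(np.abs(gammas) < threshold))
            print('%s: %d/%d gammas below %g' % (key, n_small, gammas.size, threshold))
```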