dtarakanov1 commented on issue #14589:
URL: https://github.com/apache/incubator-mxnet/issues/14589#issuecomment-692018052
I get `Unrecognized attribute: spatial for operator BatchNormalization` in
one of my scenarios. First, I quantize a ResNet model:
```Python
import onnx
# Assuming quant is onnxruntime's quantization module (the old quantize() API).
from onnxruntime import quantization as quant

model = onnx.load_model('resnet18v2_opset8.onnx')
quantized_model = quant.quantize(model)
```
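For context, the opset the loaded model actually declares can be read straight off the proto (a minimal sketch, using only the `model` loaded above):
```Python
# Print the opset version declared for the default (ai.onnx) domain.
print([op.version for op in model.opset_import if op.domain in ('', 'ai.onnx')])
```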
The quantization itself completes, but with a warning:
```
Warning: The original model opset version is 7, which does not support quantized operators.
The opset version of quantized model will be set to 10. Use onnx model checker to verify model after quantization.
```
I save the model and try to run inference:
```Python
import onnxruntime as rt

onnx.save_model(quantized_model, 'quantized_resnet.onnx')
sess = rt.InferenceSession('quantized_resnet.onnx')
```
```
onnxruntime.capi.onnxruntime_pybind11_state.InvalidGraph: [ONNXRuntimeError] : 10 : INVALID_GRAPH : This is an invalid model. Error in Node:resnetv22_batchnorm0_fwd : Unrecognized attribute: spatial for operator BatchNormalization
```
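For reference, `spatial` was removed from BatchNormalization in opset 9, so a graph that declares opset 10 but still carries the attribute is invalid. A possible workaround (untested sketch) would be to strip the attribute before saving:
```Python
# Untested: drop the 'spatial' attribute, which opset >= 9 no longer accepts.
for node in quantized_model.graph.node:
    if node.op_type == 'BatchNormalization':
        kept = [a for a in node.attribute if a.name != 'spatial']
        del node.attribute[:]
        node.attribute.extend(kept)
```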
I run the model checker, just in case:
```Python
onnx.checker.check_model(quantized_model)
```
It reports something different:
```
onnx.onnx_cpp2py_export.checker.ValidationError: fixed_quantization_range_uint8 in initializer but not in graph input
```
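As I understand it, before ONNX IR version 4 the checker required every initializer to also be listed among the graph inputs, and the quantizer adds initializers such as `fixed_quantization_range_uint8` without registering them. A possible fix (untested sketch):
```Python
from onnx import helper

# Untested: register every initializer as a graph input, as pre-IR-4
# checkers expect.
existing = {i.name for i in quantized_model.graph.input}
for init in quantized_model.graph.initializer:
    if init.name not in existing:
        quantized_model.graph.input.append(
            helper.make_tensor_value_info(init.name, init.data_type, init.dims))
```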
If I try `onnx_tf.backend.prepare`, I get a different error, and if I convert the
model to a later opset, I hit yet another problem.
I'm just wondering how to run a quantized model at all; a sketch of the pipeline I would expect to work is below.
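For completeness, this is the end-to-end flow I would expect (untested; assumes `onnx.version_converter` can upgrade these ops and that `quant` is onnxruntime's quantization module as above):
```Python
import onnx
from onnx import version_converter
from onnxruntime import quantization as quant
import onnxruntime as rt

# Untested: upgrade the opset first so the quantizer does not have to,
# then quantize, save, and load for inference.
model = onnx.load_model('resnet18v2_opset8.onnx')
model = version_converter.convert_version(model, 10)
quantized_model = quant.quantize(model)
onnx.save_model(quantized_model, 'quantized_resnet.onnx')
sess = rt.InferenceSession('quantized_resnet.onnx')
```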