coffezhou opened a new issue, #17869: URL: https://github.com/apache/tvm/issues/17869
### Expected behavior

The onnx frontend should import the model correctly.

### Actual behavior

When importing the following onnx model, tvm crashes as follows:

```
Traceback (most recent call last):
  File "/home/carla/Documents/test/test.py", line 189, in <module>
    main()
  File "/home/carla/Documents/test/test.py", line 178, in main
    check_correctness(onnx_model)
  File "/home/carla/Documents/test/test.py", line 104, in check_correctness
    tvm_model = from_onnx(model, opset=opset, keep_params_in_input=True)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3690, in from_onnx
    return g.from_onnx(graph, opset)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3321, in from_onnx
    self._construct_nodes(graph)
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3496, in _construct_nodes
    op = self._convert_operator(op_name, inputs, attr, self.opset)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 3596, in _convert_operator
    sym = op_function(self.bb, inputs, attrs, [self._nodes, self._params])
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/Documents/tvm/python/tvm/relax/frontend/onnx/onnx_frontend.py", line 2972, in _impl_v11
    raise NotImplementedError("Position must be a constant.")
NotImplementedError: Position must be a constant.
[11:48:06] /home/carla/Documents/tvm/src/relax/ir/block_builder.cc:64: Warning: BlockBuilder destroyed with remaining blocks!
```

In the attached model, the position input that this check rejects is a constant tensor (a short script for verifying this is included at the end of this report), so this may be a bug in TVM's onnx frontend.

### Environment

OS: Ubuntu 20.04
TVM: 0.21.dev0 (c00f52a70)

### Steps to reproduce

This bug can be reproduced by the following code with the model in the attachment. As shown in the code, the model can be executed by onnxruntime, which indicates that it is a valid model.

```python
import pickle
import sys

import onnx
import onnxruntime

from tvm.relax.frontend.onnx import from_onnx


def main():
    onnx_model = onnx.load("a126.onnx")
    with open("inputs.pkl", "rb") as fp:
        inputs = pickle.load(fp)

    # The model runs under onnxruntime, so it is a valid model.
    try:
        ort_session = onnxruntime.InferenceSession(
            onnx_model.SerializeToString(), providers=["CPUExecutionProvider"]
        )
        ort_output = ort_session.run([], inputs)
    except Exception as e:
        print(e)
        sys.exit(1)

    # Importing the same model into TVM raises
    # NotImplementedError: Position must be a constant.
    tvm_model = from_onnx(onnx_model, keep_params_in_input=True)


if __name__ == "__main__":
    main()
```

[testcase.zip](https://github.com/user-attachments/files/19831236/testcase.zip)

### Triage

* needs-triage
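To double-check that the value flagged by the frontend really is a compile-time constant, an inspection script along the following lines can be run against the attached `a126.onnx`. This is a generic sketch (not tied to any particular node name): it marks every node input as constant if it comes from a graph initializer or a `Constant` node, and dynamic otherwise.

```python
import onnx

model = onnx.load("a126.onnx")
graph = model.graph

# Values known at compile time: graph initializers plus the outputs of Constant nodes.
const_names = {init.name for init in graph.initializer}
const_names.update(
    out for node in graph.node if node.op_type == "Constant" for out in node.output
)

# For every node, report which of its inputs are constant and which are dynamic.
for node in graph.node:
    tagged = [(i, "const" if i in const_names else "dynamic") for i in node.input if i]
    print(node.op_type, node.name, tagged)
```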

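As a possible workaround (untested against this model), constant-folding the graph before importing it may turn the offending value into a literal constant that the frontend accepts; onnx-simplifier is one way to do that:

```python
import onnx
from onnxsim import simplify  # pip install onnx-simplifier

model = onnx.load("a126.onnx")
# Constant-fold and simplify the graph; `check` reports whether the
# simplified model still matches the original on random inputs.
simplified, check = simplify(model)
assert check, "simplified model failed the equivalence check"
onnx.save(simplified, "a126_simplified.onnx")
```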