Thrsu opened a new issue, #15651:
URL: https://github.com/apache/tvm/issues/15651

   The ONNX [model](https://drive.google.com/file/d/1OyCdBhPRFo3MXvZSLsth4DK4cb7Ui5Z9/view?usp=share_link) produces inference results that are inconsistent with ONNX Runtime when Relax is used to load the model and run it.
   
   <img width="561" alt="Gather_onnx" src="https://github.com/apache/tvm/assets/89128704/6e4bc053-79ad-48fd-a6bf-92013b853fb9">
   
   
   ### Actual behavior
   
   ```
    Traceback (most recent call last):
      ...
        np.testing.assert_allclose(onnx_output[0], tvm_output, atol=1e-3, rtol=1e-3)
      File "/workplace/software/miniconda3/envs/tflite/lib/python3.8/site-packages/numpy/testing/_private/utils.py", line 1527, in assert_allclose
        assert_array_compare(compare, actual, desired, err_msg=str(err_msg),
      File "/workplace/software/miniconda3/envs/tflite/lib/python3.8/site-packages/numpy/testing/_private/utils.py", line 844, in assert_array_compare
        raise AssertionError(msg)
   AssertionError: 
   Not equal to tolerance rtol=0.001, atol=0.001
   
   Mismatched elements: 24 / 72 (33.3%)
   Max absolute difference: 0.92875844
   Max relative difference: 31.090857
    x: array([[[[0.228867, 0.563567],
            [0.655737, 0.30893 ],
            [0.737935, 0.313822]],...
    y: array([[[[0.321285, 0.733162],
            [0.199019, 0.381254],
            [0.461184, 0.895627]],...
   ```
   
   ### Environment
   
   - TVM: near-latest unity branch (exact commit not pinned)
   - onnx: 1.14.0
   - onnxruntime: 1.14.1
   
   ### Steps to reproduce
   
   ```python
   import onnx
   import numpy as np
   import onnxruntime
   
   import tvm
   from tvm import relax
   from tvm.relax.frontend.onnx import from_onnx
   
   input_data = {}
   input_data['data'] = np.random.rand(5, 4, 3, 2).astype(np.float32)
   input_data['indices'] = np.random.randint(-2, 3, [3], dtype=np.int64)
   
   onnx_model_path = "Gather.onnx"
   model = onnx.load(onnx_model_path)
   
   sess = onnxruntime.InferenceSession(model.SerializeToString())
   onnx_output = sess.run(None, input_data)
   
   tvm_model = from_onnx(model, opset=18, keep_params_in_input=True)
   tvm_model = relax.transform.DecomposeOpsForInference()(tvm_model)
   tvm_model = relax.transform.LegalizeOps()(tvm_model)
   
   tvm_model, params = relax.frontend.detach_params(tvm_model)
   with tvm.transform.PassContext(opt_level=3):
       ex = relax.build(tvm_model, target="llvm")
       vm = relax.VirtualMachine(ex, tvm.cpu())
   
    inputs = {}
    inputs['data'] = tvm.nd.array(input_data['data'].astype(np.float32))
    inputs['indices'] = tvm.nd.array(input_data['indices'].astype(np.int64))
    input_tvm = [
        inputs[key.name_hint] for key in tvm_model["main"].params if key.name_hint in inputs
    ]
   
   if params:
       input_tvm += params["main"]
   
   vm.set_input("main", *input_tvm)
   vm.invoke_stateful("main")
   tvm_output = vm.get_outputs("main").numpy()
   
   np.testing.assert_allclose(onnx_output[0], tvm_output, atol=1e-3, rtol=1e-3)
   ```
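   
   Note that the `indices` input above is drawn from `[-2, 3)`, so it can contain negative values. Per the ONNX spec, `Gather` accepts negative indices that count back from the end of the gathered axis, so a plausible cause of the mismatch is the importer not wrapping negative indices before lowering. A minimal sketch of the expected semantics in NumPy (`np.take` wraps negative indices the same way ONNX `Gather` does):
   
   ```python
   import numpy as np
   
   # Reference semantics of ONNX Gather (axis=0) with negative indices:
   # an index of -1 selects the last slice along the gathered axis.
   data = np.random.rand(5, 4, 3, 2).astype(np.float32)
   indices = np.array([-2, 0, 2], dtype=np.int64)
   
   # np.take matches ONNX Gather here: negative indices wrap around.
   expected = np.take(data, indices, axis=0)
   
   # Equivalent explicit wrapping, which an importer would need to emit
   # if the underlying take/gather op only accepts non-negative indices.
   wrapped = np.where(indices < 0, indices + data.shape[0], indices)
   assert np.array_equal(expected, data[wrapped])
   ```
   
   If TVM's output matches ONNX Runtime once `indices` is restricted to non-negative values, that would point to negative-index handling in the Relax `Gather` conversion.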
   
   ### Triage
   
   * needs-triage
   

