kevinLu1114 opened a new issue #8495:
URL: https://github.com/apache/tvm/issues/8495


   
   > Description
   
When I ran the model (ssd_mobilenetv1), I found that the output was wrong. I eventually traced the problem to the [Expand](https://github.com/apache/tvm/blob/d3db5d65c9aefacda128756c15c7ec3f0a7b91ea/python/tvm/relay/frontend/onnx.py#L1982) operator, so I wrote a small script to reproduce the problem.
   
   In my case, commenting out the following line resolved the problem:
https://github.com/apache/tvm/blob/d3db5d65c9aefacda128756c15c7ec3f0a7b91ea/python/tvm/relay/frontend/onnx.py#L2038
 
   
   > Environment
   
   - tvm version: d3db5d65c9aefacda128756c15c7ec3f0a7b91ea
   - onnxruntime version: 1.8.1
   
   > Script
   
   ```python
   import tvm
   import onnx
   import numpy as np
   
   
   from tvm import relay
   from tvm.runtime.vm import VirtualMachine
   from tvm.testing import assert_allclose
   from onnx import TensorProto, helper
   from onnxruntime.backend import prepare
   
   
   # Build a minimal ONNX model containing a single Expand node that
   # broadcasts a scalar initializer to a runtime-provided shape
   graph_proto = helper.make_graph(
       [
           helper.make_node("Expand", ["input", "shape"], ["output"]),
       ],
       "graph",
       [
           helper.make_tensor_value_info('shape', TensorProto.INT64, [1]),
       ],
       [
           helper.make_tensor_value_info("output", TensorProto.INT32, ['?']),
       ],
       initializer=[
           helper.make_tensor(
               'input',
               TensorProto.INT32,
               dims=[],
               vals=[0],
           )
       ],
   )
   model = helper.make_model(graph_proto)
   onnx.checker.check_model(model)
   # Target shape [0]: Expand should produce an empty tensor
   i_shape = np.array([0], dtype=np.int64)
   
   shape_list = {'shape' : i_shape.shape}
   
   mod, params = relay.frontend.from_onnx(model, shape_list, freeze_params=True)
   
   target = "llvm"
   with tvm.transform.PassContext(opt_level=3):
       vm_exec = relay.vm.compile(mod, target=target, params=params)
       
   dev = tvm.cpu()
   vm = VirtualMachine(vm_exec, dev)
   
   
   tvm_out = vm.run(tvm.nd.array(i_shape))
   
   
   model_rep = prepare(model) 
   ort_out = model_rep.run(i_shape)[0]
   
   assert_allclose(ort_out, tvm_out)
   ```
   
   Expected output (onnxruntime):
   > `[]`
   
   Actual output (TVM):
   > `[0]`
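   
   For reference, ONNX `Expand` follows numpy-style broadcasting, so broadcasting a scalar to shape `(0,)` should yield an empty tensor, as onnxruntime does. A minimal numpy sketch of the expected semantics:
   
   ```python
   import numpy as np
   
   # A scalar broadcasts to any target shape, including a zero-sized one.
   x = np.array(0, dtype=np.int32)
   out = np.broadcast_to(x, (0,))
   
   print(out.shape)  # (0,)
   print(out)        # []
   ```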
   
   
   

