echuraev commented on issue #14615:
URL: https://github.com/apache/tvm/issues/14615#issuecomment-1508036638
Hi @jikechao! You are trying to compile your model for the graph executor, but
it doesn't support dynamic shapes. You need to use the VirtualMachine instead,
since it does support dynamic shapes:
```python
import onnx
import tvm
from tvm import relay
from tvm.runtime.vm import VirtualMachine
onnx_model_path = "model_unique_op.onnx"
model = onnx.load(onnx_model_path)
irmod, params = relay.frontend.from_onnx(model, {'X': (6,)},
freeze_params=True)
print(irmod)
with tvm.transform.PassContext(opt_level=3):
    vmc = relay.vm.compile(irmod, target="llvm", params=params)
```
Note that `relay.vm.compile` expects the imported Relay `IRModule` (`irmod`
here), not the ONNX model itself. Passing the ONNX `model` object instead
fails with the following error:
```
Traceback (most recent call last):
  File "script.py", line 12, in <module>
    vmc = relay.vm.compile(model, target="llvm", params=params)
  File "/home/echuraev/Workspace/OctoML/tvm/python/tvm/relay/backend/vm.py", line 67, in compile
    compiler.lower(mod, target, target_host)
  File "/home/echuraev/Workspace/OctoML/tvm/python/tvm/relay/backend/vm.py", line 127, in lower
    self._lower(mod, raw_targets)
  File "/home/echuraev/Workspace/OctoML/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 223, in __call__
    values, tcodes, num_args = _make_tvm_args(args, temp_args)
  File "/home/echuraev/Workspace/OctoML/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 188, in _make_tvm_args
    raise TypeError("Don't know how to handle type %s" % type(arg))
TypeError: Don't know how to handle type <class 'onnx.onnx_ml_pb2.ModelProto'>
```
The `TypeError` itself comes from the FFI layer, which doesn't know how to
marshal an `onnx.ModelProto`: the ONNX model was passed to `relay.vm.compile`
instead of the imported `IRModule`. If compilation still fails after passing
`irmod`, the problem may be in the ONNX importer. Maybe @AndrewZhaoLuo knows
more about that.
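For background on why the graph executor rejects this model in the first place: the output shape of a Unique op depends on the input *values*, not just the input shape, so it cannot be fixed at compile time. A quick NumPy illustration (assuming the model wraps a Unique op, as the filename suggests):

```python
import numpy as np

# Two inputs with the same static shape (6,) ...
a = np.unique(np.array([1, 2, 2, 3, 3, 3]))
b = np.unique(np.array([7, 7, 7, 7, 7, 7]))

# ... produce outputs of different shapes, so the output shape
# is only known at runtime.
print(a.shape)  # (3,)
print(b.shape)  # (1,)
```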