jikechao opened a new issue, #14660:
URL: https://github.com/apache/tvm/issues/14660
Loading an ONNX model that contains a `Split` operator crashes TVM's Relay frontend. Importing the model with the script below throws "indices_or_sections need to be able to divide input.shape[axis]":
```python
import onnx
from tvm import relay
onnx_model_path = "split_2d_model.onnx"
model = onnx.load(onnx_model_path)
irmod, params = relay.frontend.from_onnx(model, {'input': (2, 8)},
freeze_params=True)
```
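For context, the model exercises an *uneven* split: ONNX's `Split` (opset ≥ 18, via `num_outputs`) allows the section count to not divide the axis length, in which case the last chunk is simply smaller, whereas the failing TVM check requires exact divisibility. A minimal pure-Python sketch of the two semantics (the function names here are illustrative, not TVM or ONNX APIs):

```python
import math

def onnx_uneven_split_sizes(axis_len: int, num_outputs: int) -> list:
    """ONNX Split (opset >= 18) semantics: chunks of size
    ceil(axis_len / num_outputs), with the final chunk holding whatever
    remains. For axis_len=8, num_outputs=3 this yields [3, 3, 2]."""
    chunk = math.ceil(axis_len / num_outputs)
    sizes = []
    remaining = axis_len
    for _ in range(num_outputs):
        sizes.append(min(chunk, remaining))
        remaining -= sizes[-1]
    return sizes

def strict_divisibility_check(axis_len: int, sections: int) -> None:
    """Analogue of the divisibility assertion that fires in SplitRel
    (src/relay/op/tensor/transform.cc): a section count that does not
    divide the axis length is rejected outright."""
    if axis_len % sections != 0:
        raise ValueError(
            "indices_or_sections need to be able to divide input.shape[axis]")
```

With the reported input shape `(2, 8)` split into 3 sections, the ONNX semantics produce output widths `[3, 3, 2]`, while the strict check raises — which matches the crash message.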
### Steps to reproduce
1. Download
[split_2d_model.onnx](https://github.com/jikechao/onnx_models/blob/main/split_2d_model.onnx).
2. Run the script above.
### Crash Message/Traceback:
```
/workplace/software/tvm/tvm/python/tvm/relay/frontend/onnx.py:7315:
UserWarning: Your model ir_version is higher than the checker's.
warnings.warn(str(e))
Traceback (most recent call last):
File "24_split_axis.py", line 7, in <module>
irmod, params = relay.frontend.from_onnx(model, {'input': (2, 8)},
freeze_params=True)
File "/workplace/software/tvm/tvm/python/tvm/relay/frontend/onnx.py", line
7352, in from_onnx
mod = relay.transform.DynamicToStatic()(mod)
File "/workplace/software/tvm/tvm/python/tvm/ir/transform.py", line 160,
in __call__
return _ffi_transform_api.RunPass(self, mod)
File "/workplace/software/tvm/tvm/python/tvm/_ffi/_ctypes/packed_func.py",
line 237, in __call__
raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
17: TVMFuncCall
16:
tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::IRModule
(tvm::transform::Pass,
tvm::IRModule)>::AssignTypedLambda<tvm::transform::$_6>(tvm::transform::$_6,
std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char>
>)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}>
>::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs,
tvm::runtime::TVMRetValue*)
15: tvm::transform::Pass::operator()(tvm::IRModule) const
14: tvm::transform::Pass::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
13: tvm::relay::transform::FunctionPassNode::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
12:
tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::relay::Function
(tvm::relay::Function, tvm::IRModule,
tvm::transform::PassContext)>::AssignTypedLambda<tvm::relay::transform::DynamicToStatic()::$_0>(tvm::relay::transform::DynamicToStatic()::$_0)::{lambda(tvm::runtime::TVMArgs
const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj
const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
11: tvm::relay::DynamicToStatic(tvm::relay::Function, tvm::IRModule)
10: tvm::relay::DynamicToStaticMutator::PrepareInput(tvm::RelayExpr const&)
9: tvm::transform::Pass::operator()(tvm::IRModule) const
8: tvm::transform::Pass::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
7: tvm::relay::transform::FunctionPassNode::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
6: tvm::transform::Pass::operator()(tvm::IRModule) const
5: tvm::transform::Pass::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
4: tvm::transform::ModulePassNode::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
3:
tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::IRModule
(tvm::IRModule,
tvm::transform::PassContext)>::AssignTypedLambda<tvm::relay::transform::InferType()::$_2>(tvm::relay::transform::InferType()::$_2)::{lambda(tvm::runtime::TVMArgs
const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj
const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
2: tvm::relay::TypeInferencer::Infer(tvm::GlobalVar, tvm::relay::Function)
1: tvm::relay::TypeSolver::Solve()
0: _ZN3tvm7runtime6detail
21: TVMFuncCall
20:
tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::IRModule
(tvm::transform::Pass,
tvm::IRModule)>::AssignTypedLambda<tvm::transform::$_6>(tvm::transform::$_6,
std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char>
>)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}>
>::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs,
tvm::runtime::TVMRetValue*)
19: tvm::transform::Pass::operator()(tvm::IRModule) const
18: tvm::transform::Pass::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
17: tvm::relay::transform::FunctionPassNode::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
16:
tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::relay::Function
(tvm::relay::Function, tvm::IRModule,
tvm::transform::PassContext)>::AssignTypedLambda<tvm::relay::transform::DynamicToStatic()::$_0>(tvm::relay::transform::DynamicToStatic()::$_0)::{lambda(tvm::runtime::TVMArgs
const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj
const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
15: tvm::relay::DynamicToStatic(tvm::relay::Function, tvm::IRModule)
14: tvm::relay::DynamicToStaticMutator::PrepareInput(tvm::RelayExpr const&)
13: tvm::transform::Pass::operator()(tvm::IRModule) const
12: tvm::transform::Pass::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
11: tvm::relay::transform::FunctionPassNode::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
10: tvm::transform::Pass::operator()(tvm::IRModule) const
9: tvm::transform::Pass::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
8: tvm::transform::ModulePassNode::operator()(tvm::IRModule,
tvm::transform::PassContext const&) const
7:
tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::IRModule
(tvm::IRModule,
tvm::transform::PassContext)>::AssignTypedLambda<tvm::relay::transform::InferType()::$_2>(tvm::relay::transform::InferType()::$_2)::{lambda(tvm::runtime::TVMArgs
const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj
const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
6: tvm::relay::TypeInferencer::Infer(tvm::GlobalVar, tvm::relay::Function)
5: tvm::relay::TypeSolver::Solve()
4: tvm::TypedEnvFunc<bool (tvm::runtime::Array<tvm::Type, void> const&,
int, tvm::Attrs const&, tvm::TypeReporter
const&)>::operator()(tvm::runtime::Array<tvm::Type, void> const&, int,
tvm::Attrs const&, tvm::TypeReporter const&) const
3: _ZN3tvm7runtime13Pac
2: tvm::runtime::TypedPackedFunc<bool (tvm::runtime::Array<tvm::Type,
void> const&, int, tvm::Attrs const&, tvm::TypeReporter
const&)>::AssignTypedLambda<bool (*)(tvm::runtime::Array<tvm::Type, void>
const&, int, tvm::Attrs const&, tvm::TypeReporter const&)>(bool
(*)(tvm::runtime::Array<tvm::Type, void> const&, int, tvm::Attrs const&,
tvm::TypeReporter const&))::{lambda(tvm::runtime::TVMArgs const&,
tvm::runtime::TVMRetValue*)#1}::operator()(tvm::runtime::TVMArgs const&,
tvm::runtime::TVMRetValue*) const
1: tvm::relay::SplitRel(tvm::runtime::Array<tvm::Type, void> const&, int,
tvm::Attrs const&, tvm::TypeReporter const&)
0: _ZN3tvm7runtime6detail
File "/workplace/software/tvm/tvm/src/relay/analysis/type_solver.cc", line
643
TVMError:
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
Check failed: (false) is false: [15:08:19]
/workplace/software/tvm/tvm/src/relay/op/tensor/transform.cc:3046:
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
Check failed: (reporter->Assert(indexmod(data->shape[axis],
sections->value) == tir::make_zero(DataType::Int(64)))) is false:
indices_or_sections need to be able to divide input.shape[axis]
```
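The assertion that fires is the divisibility check in `SplitRel`. One conceivable frontend-side workaround (a hypothetical sketch, not an existing TVM helper) is for the ONNX importer to lower an uneven split into explicit split *indices* (cumulative boundaries) rather than a section count, since `relay.split` also accepts a list of indices that need not divide the axis evenly:

```python
import math

def uneven_split_indices(axis_len: int, num_outputs: int) -> list:
    """Cumulative boundary positions for an uneven split, suitable for the
    indices form of `indices_or_sections` (list of split points rather
    than a section count). For axis_len=8, num_outputs=3 this yields
    [3, 6], i.e. chunks of widths 3, 3, and 2."""
    chunk = math.ceil(axis_len / num_outputs)
    return [min(chunk * i, axis_len) for i in range(1, num_outputs)]
```

Whether this is the right fix for the importer is for the maintainers to judge; it only illustrates that the uneven-split semantics are expressible without relaxing the divisibility check itself.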
### Triage
* frontend:onnx
* relay:op
* relay:analysis
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]