lee-bin opened a new issue, #13226:
URL: https://github.com/apache/tvm/issues/13226

   ### Expected behavior
   
   `tvmc compile` succeeds.
   
   ### Actual behavior
   
   `tvmc compile` fails with an LLVM module verification error:
   
   ```
   Traceback (most recent call last):
     File "/Users/someone/anaconda3/envs/tvm/lib/python3.8/runpy.py", line 194, in _run_module_as_main
       return _run_code(code, main_globals, None,
     File "/Users/someone/anaconda3/envs/tvm/lib/python3.8/runpy.py", line 87, in _run_code
       exec(code, run_globals)
     File "/Users/someone/workspace/code/tvm/python/tvm/driver/tvmc/__main__.py", line 24, in <module>
       tvmc.main.main()
     File "/Users/someone/workspace/code/tvm/python/tvm/driver/tvmc/main.py", line 115, in main
       sys.exit(_main(sys.argv[1:]))
     File "/Users/someone/workspace/code/tvm/python/tvm/driver/tvmc/main.py", line 103, in _main
       return args.func(args)
     File "/Users/someone/workspace/code/tvm/python/tvm/driver/tvmc/compiler.py", line 180, in drive_compile
       compile_model(
     File "/Users/someone/workspace/code/tvm/python/tvm/driver/tvmc/compiler.py", line 353, in compile_model
       graph_module = build(
     File "/Users/someone/workspace/code/tvm/python/tvm/driver/tvmc/compiler.py", line 428, in build
       return relay.build(
     File "/Users/someone/workspace/code/tvm/python/tvm/relay/build_module.py", line 364, in build
       graph_json, runtime_mod, params = bld_mod.build(
     File "/Users/someone/workspace/code/tvm/python/tvm/relay/build_module.py", line 161, in build
       self._build(
     File "/Users/someone/workspace/code/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
       raise get_last_ffi_error()
   tvm._ffi.base.TVMError: Traceback (most recent call last):
     [bt] (8) 9   libtvm.dylib                        0x000000013d17caf4 tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::relay::backend::RelayBuildModule::GetFunction(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, tvm::runtime::ObjectPtr<tvm::runtime::Object> const&)::'lambda1'(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) + 40
     [bt] (7) 8   libtvm.dylib                        0x000000013d17cd90 tvm::relay::backend::RelayBuildModule::GetFunction(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&, tvm::runtime::ObjectPtr<tvm::runtime::Object> const&)::'lambda1'(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)::operator()(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const + 656
     [bt] (6) 7   libtvm.dylib                        0x000000013d17d568 tvm::relay::backend::RelayBuildModule::Build(tvm::IRModule, tvm::runtime::Array<tvm::Target, void> const&, tvm::Target const&, tvm::relay::Executor const&, tvm::relay::Runtime const&, tvm::WorkspaceMemoryPools const&, tvm::ConstantMemoryPools const&, tvm::runtime::String) + 1028
     [bt] (5) 6   libtvm.dylib                        0x000000013d17e18c tvm::relay::backend::RelayBuildModule::BuildRelay(tvm::IRModule, tvm::runtime::String const&) + 2512
     [bt] (4) 5   libtvm.dylib                        0x000000013bac85ec tvm::TIRToRuntime(tvm::runtime::Map<tvm::Target, tvm::IRModule, void, void> const&, tvm::Target const&) + 2884
     [bt] (3) 4   libtvm.dylib                        0x000000013c7ea53c tvm::codegen::Build(tvm::IRModule, tvm::Target) + 1392
     [bt] (2) 3   libtvm.dylib                        0x000000013d424a70 tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<void tvm::runtime::TypedPackedFunc<tvm::runtime::Module (tvm::IRModule, tvm::Target)>::AssignTypedLambda<tvm::codegen::$_4>(tvm::codegen::$_4, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >)::'lambda'(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) + 872
     [bt] (1) 2   libtvm.dylib                        0x000000013d421334 tvm::codegen::LLVMModuleNode::Init(tvm::IRModule const&, tvm::Target const&) + 3604
     [bt] (0) 1   libtvm.dylib                        0x000000013b803a7c tvm::runtime::detail::LogFatal::Entry::Finalize() + 84
     File "/Users/someone/workspace/code/tvm/src/target/llvm/llvm_module.cc", line 331
   TVMError: LLVM module verification failed with the following errors:
   Incorrect alignment of return type to called function!
     %30 = tail call <4320 x float> @llvm.fmuladd.v4320f32(<4320 x float> %24, <4320 x float> %29, <4320 x float> %21)
   Incorrect alignment of argument passed to called function!
     %30 = tail call <4320 x float> @llvm.fmuladd.v4320f32(<4320 x float> %24, <4320 x float> %29, <4320 x float> %21)
   Incorrect alignment of argument passed to called function!
     %30 = tail call <4320 x float> @llvm.fmuladd.v4320f32(<4320 x float> %24, <4320 x float> %29, <4320 x float> %21)
   Incorrect alignment of argument passed to called function!
     %30 = tail call <4320 x float> @llvm.fmuladd.v4320f32(<4320 x float> %24, <4320 x float> %29, <4320 x float> %21)
   ```
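   For context, one hedged reading of the verifier error (my own analysis, not part of the report): the `<4320 x float>` operand passed to `llvm.fmuladd` is a single flat vector spanning the whole 4320-wide input axis, far larger than any AArch64 NEON register, and LLVM's verifier rejects the alignment such a type implies. A quick size check:
   
   ```python
   # Size of the <4320 x float> vector from the failing llvm.fmuladd call.
   # Assumptions (mine, not from the log): float32 lanes, 128-bit NEON
   # registers on the Apple M1 host.
   lanes = 4320
   vector_bytes = lanes * 4          # 4 bytes per float32 lane
   neon_register_bytes = 128 // 8    # one AArch64 NEON register is 16 bytes
   
   print(vector_bytes)                         # 17280
   print(vector_bytes // neon_register_bytes)  # 1080 registers' worth of data
   ```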
   
   ### Environment
   
   macOS Monterey, Apple M1 pro
   TVM version: v0.10.0, 7b50b2d0ddf45d9114715aad16b867e8be6b2230
   
   ### Steps to reproduce
   
   The onnx model is [here](https://github.com/lee-bin/test/blob/master/onnx-example.onnx).
   ```
   python -m tvm.driver.tvmc compile --target "llvm" --input-shapes "X:[1,32,4320]" --output onnx-example-tvm.tar onnx-example.onnx
   ```
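   The same compile can also be driven from the Python API; a minimal sketch, assuming the `onnx` and `tvm` packages are importable and `onnx-example.onnx` is in the working directory (the function below is illustrative, not verified here):
   
   ```python
   # Hypothetical Python-API equivalent of the tvmc command above: import the
   # ONNX model into Relay with the reported input shape, then build for the
   # default "llvm" target.
   INPUT_SHAPES = {"X": (1, 32, 4320)}  # from --input-shapes "X:[1,32,4320]"
   
   def compile_onnx_example():
       import onnx
       import tvm
       from tvm import relay
   
       model = onnx.load("onnx-example.onnx")
       mod, params = relay.frontend.from_onnx(model, shape=INPUT_SHAPES)
       with tvm.transform.PassContext(opt_level=3):
           # On the reporter's M1 host this is where the TVMError
           # "LLVM module verification failed" surfaces.
           return relay.build(mod, target="llvm", params=params)
   ```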
   
   ### Triage
   
   Please refer to the list of label tags [here](https://github.com/apache/tvm/wiki/Issue-Triage-Labels) to find the relevant tags and add them below in a bullet format (example below).
   
   * backend: llvm
   * core: ffi
   * frontend: onnx
   


-- 
This is an automated message from the Apache Git Service.