chayliu-ecarx commented on issue #17193:
URL: https://github.com/apache/tvm/issues/17193#issuecomment-2247309555

   Ok, thanks very much for your reply.
   
   I am trying to import and build an ONNX model:
   ```
   import numpy as np
   import tvm
   from tvm import relay
   import onnx

   # Load the ONNX model and declare its input shape.
   onnx_model = onnx.load("mobilenetv2-7.onnx")
   input_name = "data"
   data_shape = (1, 3, 224, 224)
   shape_dict = {input_name: data_shape}

   relay_mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)
   print(relay_mod)

   # Build for Hexagon v66, using Hexagon as the host target as well.
   target = tvm.target.hexagon("v66")
   with tvm.transform.PassContext(opt_level=3):
       lib = relay.build(relay_mod, tvm.target.Target(target, host=target), params=params, mod_name="default")

   # Save the graph JSON and the compiled library.
   with open("mobilenetv2-7.json", "w") as f:
       f.write(lib.get_graph_json())
   lib.get_lib().save("mobilenetv2-7.so")
   ```
   
   But it still failed:
   ```
   Traceback (most recent call last):
     File "build_hexagon.py", line 17, in <module>
       lib = relay.build(relay_mod, tvm.target.Target(target, host=target), params=params, mod_name="default")
     File "tvm/python/tvm/relay/build_module.py", line 364, in build
       graph_json, runtime_mod, params = bld_mod.build(
     File "tvm/python/tvm/relay/build_module.py", line 161, in build
       self._build(
     File "tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 240, in __call__
       raise_last_ffi_error()
     File "tvm/python/tvm/_ffi/base.py", line 481, in raise_last_ffi_error
       raise py_err
   tvm.error.InternalError: Traceback (most recent call last):
     5: tvm::relay::backend::RelayBuildModule::GetFunction(tvm::runtime::String const&, tvm::runtime::ObjectPtr<tvm::runtime::Object> const&)::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#3}::operator()(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const
     4: tvm::relay::backend::RelayBuildModule::BuildRelay(tvm::IRModule, tvm::runtime::String const&)
     3: tvm::TIRToRuntime(tvm::runtime::Map<tvm::Target, tvm::IRModule, void, void> const&, tvm::Target const&)
     2: tvm::codegen::Build(tvm::IRModule, tvm::Target)
     1: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::runtime::Module (tvm::IRModule, tvm::Target)>::AssignTypedLambda<tvm::runtime::Module (*)(tvm::IRModule, tvm::Target)>(tvm::runtime::Module (*)(tvm::IRModule, tvm::Target), std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
     0: tvm::codegen::BuildHexagon(tvm::IRModule, tvm::Target)
     File "tvm/src/target/llvm/codegen_hexagon.cc", line 642
   InternalError: Check failed: (f != nullptr) is false: tvm.contrib.hexagon.link_shared does not to exist, do import tvm.contrib.hexagon
   ```
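   For context, the failing check at `codegen_hexagon.cc:642` looks up a packed function by its registered name, and the lookup comes back empty because nothing has registered `tvm.contrib.hexagon.link_shared` yet; in TVM that registration happens as a side effect of running `import tvm.contrib.hexagon`, which is why the error message tells you to add that import before calling `relay.build`. A minimal pure-Python sketch of this register-on-import pattern (toy names, not TVM's actual registry API):

   ```python
   # Toy global function registry, analogous in spirit to TVM's
   # packed-function registry (all names here are hypothetical).
   _REGISTRY = {}

   def register_func(name):
       """Decorator that records a function under a global string name."""
       def wrap(fn):
           _REGISTRY[name] = fn
           return fn
       return wrap

   def get_func(name):
       """Look up a registered function; None mimics the failed (f != nullptr) check."""
       return _REGISTRY.get(name)

   # Before the registering module is imported, the lookup fails.
   assert get_func("toy.hexagon.link_shared") is None

   # Simulate what `import tvm.contrib.hexagon` does: the registration
   # runs as a side effect of module import time.
   @register_func("toy.hexagon.link_shared")
   def link_shared(objs):
       return "linked:" + ",".join(objs)

   # After registration, the same lookup succeeds.
   assert get_func("toy.hexagon.link_shared") is not None
   print(get_func("toy.hexagon.link_shared")(["a.o", "b.o"]))  # linked:a.o,b.o
   ```

   So the likely fix, per the message itself, is simply adding `import tvm.contrib.hexagon` at the top of the build script so the registration runs before codegen needs the function.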

