qq1243196045 opened a new issue, #15123: URL: https://github.com/apache/tvm/issues/15123
When I use tvmc, the following error occurred:

```
wget https://github.com/onnx/models/raw/main/vision/classification/resnet/model/resnet50-v2-7.onnx
tvmc compile --target "llvm" --output resnet50-v2-7-tvm.tar resnet50-v2-7.onnx
```

```
WARNING:autotvm:One or more operators have not been tuned. Please tune your model for better performance. Use DEBUG logging level to see more details.
Traceback (most recent call last):
  File "/root/anaconda3/envs/tvm-build/bin/tvmc", line 33, in <module>
    sys.exit(load_entry_point('tvm==0.13.dev217+g2d2b72733', 'console_scripts', 'tvmc')())
  File "/projects/tvm/python/tvm/driver/tvmc/main.py", line 118, in main
    sys.exit(_main(sys.argv[1:]))
  File "/projects/tvm/python/tvm/driver/tvmc/main.py", line 106, in _main
    return args.func(args)
  File "/projects/tvm/python/tvm/driver/tvmc/compiler.py", line 217, in drive_compile
    **transform_args,
  File "/projects/tvm/python/tvm/driver/tvmc/compiler.py", line 421, in compile_model
    workspace_pools=workspace_pools,
  File "/projects/tvm/python/tvm/driver/tvmc/compiler.py", line 491, in build
    workspace_memory_pools=workspace_pools,
  File "/projects/tvm/python/tvm/relay/build_module.py", line 372, in build
    mod_name=mod_name,
  File "/projects/tvm/python/tvm/relay/build_module.py", line 169, in build
    mod_name,
  File "/projects/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 238, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  15: TVMFuncCall
  14: tvm::relay::backend::RelayBuildModule::GetFunction(tvm::runtime::String const&, tvm::runtime::ObjectPtr<tvm::runtime::Object> const&)::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#3}::operator()(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const
  13: tvm::relay::backend::RelayBuildModule::Build(tvm::IRModule, tvm::runtime::Array<tvm::Target, void> const&, tvm::Target const&, tvm::relay::Executor const&, tvm::relay::Runtime const&, tvm::WorkspaceMemoryPools const&, tvm::ConstantMemoryPools const&, tvm::runtime::String)
  12: tvm::relay::backend::RelayBuildModule::BuildRelay(tvm::IRModule, tvm::runtime::String const&)
  11: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::relay::backend::GraphExecutorCodegenModule::GetFunction(tvm::runtime::String const&, tvm::runtime::ObjectPtr<tvm::runtime::Object> const&)::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#2}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  10: tvm::relay::backend::GraphExecutorCodegen::Codegen(tvm::IRModule, tvm::relay::Function, tvm::runtime::String)
  9: tvm::relay::GraphPlanMemory(tvm::relay::Function const&)
  8: tvm::relay::StorageAllocator::Plan(tvm::relay::Function const&)
  7: tvm::relay::ExprVisitor::VisitExpr(tvm::RelayExpr const&)
  6: tvm::relay::ExprFunctor<void (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&)
  5: tvm::relay::transform::DeviceAwareExprVisitor::VisitExpr_(tvm::relay::FunctionNode const*)
  4: tvm::relay::StorageAllocaBaseVisitor::DeviceAwareVisitExpr_(tvm::relay::FunctionNode const*)
  3: tvm::relay::StorageAllocaBaseVisitor::CreateToken(tvm::RelayExprNode const*, bool)
  2: tvm::relay::StorageAllocator::CreateTokenOnDevice(tvm::RelayExprNode const*, tvm::VirtualDevice const&, bool)
  1: tvm::relay::TokenAllocator1D::Alloc(tvm::relay::StorageToken*, long)
  0: tvm::relay::TokenAllocator1D::GetMemorySize(tvm::relay::StorageToken*)
  File "/projects/tvm/src/relay/backend/token_allocator.cc", line 41
TVMError:
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (pval != nullptr) is false: Cannot allocate memory symbolic tensor shape [T.Any(), 3, 224, 224]
```
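The final check failure shows that the model's input has a symbolic batch dimension (`[T.Any(), 3, 224, 224]`), which Relay's graph memory planner cannot size statically. A possible workaround (a sketch, not confirmed by the issue author) is to pin the shapes at compile time with tvmc's `--input-shapes` flag; the input name `data` below is an assumption based on this ONNX model's published metadata, so verify it against your copy of the model:

```shell
# Pin the dynamic batch dimension to a concrete value (1 here) so the
# memory planner can compute static buffer sizes.
# ASSUMPTION: "data" is the model's input tensor name; confirm with
#   python -c "import onnx; m = onnx.load('resnet50-v2-7.onnx'); print(m.graph.input)"
tvmc compile --target "llvm" \
    --input-shapes "data:[1,3,224,224]" \
    --output resnet50-v2-7-tvm.tar \
    resnet50-v2-7.onnx
```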
