NataliaTabirca opened a new issue #9423:
URL: https://github.com/apache/tvm/issues/9423
I am using the TVM version from the main branch to test simple models on my mimxrt1060_evk board. I used the script provided in the documentation for sine_model.tflite, and it runs perfectly on the board with the expected output.

When I try to run mobilenet_v1_0.5_128.tflite (or any other model), I get this error:
```
Traceback (most recent call last):
  File "test_mobilenet.py", line 118, in <module>
    module.get_graph_json(), session.get_system_lib(), session.device
  File "/home/vagrant/uTVM/python/tvm/micro/session.py", line 214, in create_local_graph_executor
    fcreate(graph_json_str, mod, lookup_remote_linked_param, *device_type_id)
  File "/home/vagrant/uTVM/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm.error.RPCError: Traceback (most recent call last):
  13: TVMFuncCall
  12: _ZNSt17_Function_handlerI
  11: tvm::runtime::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#1}::operator()(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const [clone .isra.765]
  10: tvm::runtime::GraphExecutorCreate(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, tvm::runtime::Module const&, std::vector<DLDevice, std::allocator<DLDevice> > const&, tvm::runtime::PackedFunc)
  9: tvm::runtime::GraphExecutor::Init(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, tvm::runtime::Module, std::vector<DLDevice, std::allocator<DLDevice> > const&, tvm::runtime::PackedFunc)
  8: tvm::runtime::GraphExecutor::SetupStorage()
  7: tvm::runtime::NDArray::Empty(tvm::runtime::ShapeTuple, DLDataType, DLDevice, tvm::runtime::Optional<tvm::runtime::String>)
  6: tvm::runtime::RPCDeviceAPI::AllocDataSpace(DLDevice, int, long const*, DLDataType, tvm::runtime::Optional<tvm::runtime::String>)
  5: tvm::runtime::RPCClientSession::AllocDataSpace(DLDevice, int, long const*, DLDataType, tvm::runtime::Optional<tvm::runtime::String>)
  4: std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), tvm::runtime::RPCEndpoint::Init()::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#2}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)
  3: tvm::runtime::RPCEndpoint::HandleUntilReturnEvent(bool, std::function<void (tvm::runtime::TVMArgs)>)
  2: tvm::runtime::RPCEndpoint::EventHandler::HandleNextEvent(bool, bool, std::function<void (tvm::runtime::TVMArgs)>)
  1: tvm::runtime::RPCEndpoint::EventHandler::HandleProcessPacket(std::function<void (tvm::runtime::TVMArgs)>)
  0: tvm::runtime::RPCEndpoint::EventHandler::HandleReturn(tvm::runtime::RPCCode, std::function<void (tvm::runtime::TVMArgs)>)
  File "/home/vagrant/uTVM/src/runtime/rpc/rpc_endpoint.cc", line 376
RPCError: Error caught from RPC call:
```
This is the entire output (nothing is printed after "RPCError: Error caught from RPC call:"). The error is raised from `tvm.micro.create_local_graph_executor`.
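A note on the traceback: the deepest frames (`GraphExecutor::SetupStorage` → `NDArray::Empty` → `RPCDeviceAPI::AllocDataSpace`) show the failure happens while allocating tensor storage on the device over RPC, so on-board memory is a plausible suspect. Below is a hedged back-of-envelope sketch comparing input-tensor footprints; the shapes are assumptions inferred from the model names (a scalar input for the sine tutorial model, a 1x128x128x3 float32 NHWC input for mobilenet_v1_0.5_128), not values taken from the logs.

```python
def tensor_bytes(shape, dtype_bytes=4):
    """Bytes needed for one dense tensor of the given shape (float32 by default)."""
    n = 1
    for d in shape:
        n *= d
    return n * dtype_bytes

# Assumed shapes (illustrative, not measured from this issue):
sine_input = tensor_bytes((1, 1))               # sine model: a single float input
mobilenet_input = tensor_bytes((1, 128, 128, 3))  # mobilenet_v1_0.5_128 NHWC input

print(sine_input)       # 4 bytes
print(mobilenet_input)  # 196608 bytes, before counting weights and activations
```

Even just the input tensor is five orders of magnitude larger than the sine model's, and the graph executor also allocates buffers for parameters and intermediate activations, which can easily exceed the RAM available on a microcontroller-class board.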
[micro_tvm_issue.zip](https://github.com/apache/tvm/files/7459101/micro_tvm_issue.zip)