amai-gsu commented on PR #4698:
URL: https://github.com/apache/tvm/pull/4698#issuecomment-1800699964

   > @tmoreau89 I have tried the setup with the same versions of tvm/tensorflow on the host and the board, and the "cpu" part of the inference works fine. But when I set the target to edge_tpu, I get this error on the RPC server
   > 
   > ```
   > ERROR: Internal: Unsupported data type: 0
   > ERROR: Node number 0 (edgetpu-custom-op) failed to prepare
   > ```
   > 
   > And on the host machine, it says
   > 
   > ```
   >  File "tvm_inference.py", line 21, in <module>
   >     runtime = tflite_runtime.create(f.read(), ctx, runtime_target=target)
   > 
   >   File "/home/sabih/Documents/phd_work/MAP_WORK/tvm_env/tvm/python/tvm/contrib/tflite_runtime.py", line 49, in create
   >     return TFLiteModule(fcreate(bytearray(tflite_model_bytes), ctx))
   > 
   >   File "/home/sabih/Documents/phd_work/MAP_WORK/tvm_env/tvm/python/tvm/_ffi/_ctypes/function.py", line 207, in __call__
   >     raise get_last_ffi_error()
   > 
   > tvm._ffi.base.TVMError: Traceback (most recent call last):
   >   [bt] (3) /tvm_env/tvm/build/libtvm.so(TVMFuncCall+0x69) [0x7f2fb63f8489]
   >   [bt] (2) /tvm_env/tvm/build/libtvm.so(std::_Function_handler<void (tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*), tvm::runtime::RPCModuleNode::WrapRemote(void*)::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#1}>::_M_invoke(std::_Any_data const&, tvm::runtime::TVMArgs&&, tvm::runtime::TVMRetValue*&&)+0x46) [0x7f2fb644ad36]
   >   [bt] (1) /tvm_env/tvm/build/libtvm.so(tvm::runtime::RPCSession::CallFunc(void*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*, void* (*)(int, tvm::runtime::TVMArgValue const&), tvm::runtime::PackedFunc const*)+0x2c8) [0x7f2fb6454168]
   >   [bt] (0) /tvm_env/tvm/build/libtvm.so(+0xc21d6b) [0x7f2fb6450d6b]
   >   File "/tvm_env/tvm/src/runtime/rpc/rpc_session.cc", line 993
   > TVMError: Check failed: code == RPCCode::kReturn: code=4
   > ```
   > 
   > The inference directly on the edge TPU works fine.
   Have you solved this issue? I'm running into the same one.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
