milad1378yz opened a new issue, #11008:
URL: https://github.com/apache/tvm/issues/11008

   I have tried two ways to run TVM on a BERT model.
   The first was the TVM execution provider in onnxruntime; the second was the tutorial at:
   https://tvm.apache.org/docs/how_to/deploy_models/deploy_sparse.html
   With the first method, I encountered the following error:
   Incompatible broadcast type TensorType([-1, -1, 768], float32) and 
TensorType([1, 511, 768], float32)
   which suggests that dynamic input shapes cannot be given to the model.
   With the second method, the Configure Settings section requires fixed batch_size and 
len_size values, which again rules out dynamic input.
   My question is: is there a way to use TVM with dynamic-shape models such as BERT?
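
   For context, the error comes from shape broadcasting during type checking: a dynamic dimension (reported as -1) cannot be proven compatible at compile time with a concrete dimension other than 1. The sketch below is a simplified, assumed model of that rule, not TVM's actual implementation:

   ```python
   def broadcast_compatible(a, b):
       # Compare shapes right-to-left, NumPy-style.
       # Two dims broadcast if they are equal or one of them is 1.
       # A dynamic dim (-1) cannot be proven equal to a fixed dim != 1,
       # so the check conservatively fails in that case.
       for x, y in zip(reversed(a), reversed(b)):
           if x == y or x == 1 or y == 1:
               continue
           return False  # covers the dynamic (-1) vs fixed (511) case
       return True

   # The shapes from the reported error:
   print(broadcast_compatible([-1, -1, 768], [1, 511, 768]))  # False
   # Fully static shapes broadcast fine:
   print(broadcast_compatible([1, 511, 768], [4, 511, 768]))  # True
   ```

   Under this reading, fixing batch_size and the sequence length (as the tutorial's Configure Settings section asks) is what lets type inference succeed.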


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
