masahi edited a comment on issue #6300:
URL: https://github.com/apache/incubator-tvm/issues/6300#issuecomment-676837709


   Ok, found the problem. `node_offset` is indeed triggering a warning at
   
   
https://github.com/apache/incubator-tvm/blob/939a42b4e976a41e8513b720421d3c3678493715/python/tvm/relay/frontend/pytorch.py#L2146-L2150
   
   Since `default_type` is float32, from that point on TVM thinks `node_offset` is 
a float32 tensor. That results in an unnecessary cast.
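   The fallback behavior described above can be sketched roughly like this (a minimal illustrative sketch, not the actual TVM frontend code; `infer_input_dtype`, the dtype mapping, and the warning text are all stand-ins I made up for illustration):

   ```python
   # Sketch of the dtype-fallback behavior: when TorchScript reports no
   # scalar type for a graph input, warn and assume a default dtype.
   # NOT the actual TVM code; names here are illustrative.
   import logging

   DEFAULT_DTYPE = "float32"

   def infer_input_dtype(scalar_type):
       """Return a TVM-style dtype string for a TorchScript scalar type.

       If scalar_type is None (i.e. inputs[1].type().scalarType()
       returned None), fall back to DEFAULT_DTYPE with a warning.
       """
       if scalar_type is None:
           logging.warning("Untyped tensor input, assuming %s", DEFAULT_DTYPE)
           return DEFAULT_DTYPE
       # Map TorchScript scalar type names to TVM dtype strings.
       mapping = {"Float": "float32", "Double": "float64",
                  "Long": "int64", "Int": "int32"}
       return mapping[scalar_type]
   ```

   With this logic, an int64 parameter whose `scalarType()` comes back as None is treated as float32, which is exactly what forces the spurious cast.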
   
   This is the same problem I mentioned earlier about conv weights, biases, etc. 
But since those are float32 anyway, the default type we pick for them does no 
harm. Your use case is the first one we have met where one of the parameters of 
the model is indeed an integer tensor.
   
   Even though the dtype of the `nodes_offset` tensor is int64, when we traverse a 
TorchScript graph and get the input type of the rhs of the `add` node via 
`inputs[1].type().scalarType()`, it returns None. Surprisingly, doing an explicit 
`self.nodes_offset.long()` in the Torch module does make `scalarType()` of 
`node_offset` come out as int64. Note that this is a PyTorch problem, so I don't 
consider it a TVM bug.
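   For reference, the workaround on the PyTorch side looks like this (a hedged sketch with a made-up module; the model structure is invented, and whether the integer tensor is a parameter or a buffer will differ in the real model, but the explicit `.long()` call is the point):

   ```python
   import torch

   class Example(torch.nn.Module):
       # Made-up module standing in for the user's model: it carries an
       # integer tensor (registered as a buffer here) analogous to
       # nodes_offset in the issue.
       def __init__(self):
           super().__init__()
           self.register_buffer("nodes_offset",
                                torch.arange(4, dtype=torch.int64))

       def forward(self, x):
           # The explicit .long() is numerically a no-op (the buffer is
           # already int64), but per the observation above it makes
           # scalarType() on this input report Long in the TorchScript
           # graph instead of None.
           return x + self.nodes_offset.long()
   ```

   Scripting this module with `torch.jit.script(Example())` then yields a graph in which the rhs of the `add` carries an int64 scalar type, so the frontend never reaches the float32 fallback.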
   
   I'll look a bit more into what's going on with the typing of parameter 
tensors at the TorchScript level. We might end up solving the "Untyped warning" 
problem for parameters.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

