alimagic opened a new issue, #13062:
URL: https://github.com/apache/tvm/issues/13062

   In my Python code I use a `List[torch.Tensor] = []`; when the model is 
converted from torch to ONNX, this shows up in the graph as a `SequenceEmpty` 
op. To handle it, I added the following converter in 
xmir/python/tvm/relay/frontend/onnx.py:
   
   class SequenceEmpty(OnnxOpConverter):
       """Operator converter for the SequenceEmpty op."""

       @classmethod
       def _impl_v11(cls, inputs, attr, params):
           # Represent an empty tensor sequence as an empty Relay tuple.
           return _expr.Tuple([])
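
   For context, ONNX's `SequenceEmpty` produces an empty *sequence* of 
tensors rather than a tensor, so the resulting value carries no shape of its 
own. A framework-free sketch of that distinction (plain Python/NumPy stand-ins, 
not real ONNX or TVM objects):

```python
import numpy as np

# An ONNX sequence value is analogous to a Python list of tensors:
# an empty sequence has no shape of its own.
empty_sequence = []            # sequence of tensors (empty)
empty_tensor = np.empty((0,))  # a tensor whose shape is (0,)

print(hasattr(empty_sequence, "shape"))  # False: sequences carry no shape
print(empty_tensor.shape)                # (0,): tensors always have a shape
```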
   
    However, an error then occurs at `get_const_tuple`, in the following code:
   
    def get_var(name, val, scan=False):
        checked_type = infer_type(val)
        if hasattr(checked_type, "type_annotation"):
            checked_type = checked_type.type_annotation
        if hasattr(checked_type, "checked_type"):
            checked_type = checked_type.checked_type
        shape = get_const_tuple(checked_type.shape)
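
   The failure is consistent with that: the type inferred for 
`_expr.Tuple([])` is a tuple type, which has no `shape` field, so 
`get_const_tuple(checked_type.shape)` cannot succeed. A minimal mock 
(hypothetical classes, not the real TVM ones) reproducing the attribute-access 
pattern:

```python
# Hypothetical stand-ins for the inferred types (not real tvm.relay classes).
class TensorType:
    def __init__(self, shape):
        self.shape = shape

class TupleType:
    def __init__(self, fields):
        self.fields = fields  # note: no `shape` attribute at all

def get_shape(checked_type):
    # Mirrors get_var's assumption that every checked type is tensor-like.
    return tuple(checked_type.shape)

print(get_shape(TensorType((2, 3))))  # (2, 3): fine for tensor types
try:
    get_shape(TupleType([]))          # an empty tuple/sequence value
except AttributeError as exc:
    print("fails:", exc)              # TupleType has no `shape`
```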
   
    I printed `val` and `checked_type`, and both show `()`. Has anyone 
encountered a similar situation?
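
   One possible workaround direction (a sketch only, using the same 
hypothetical mock types rather than the real TVM classes) would be to 
special-case shapeless values before asking for a shape, and let the caller 
handle sequences separately:

```python
# Hypothetical mock types again (the real ones live in tvm.relay).
class TensorType:
    def __init__(self, shape):
        self.shape = shape

class TupleType:
    def __init__(self, fields):
        self.fields = fields  # sequence/tuple values carry no shape

def get_shape_or_none(checked_type):
    # Guard: only tensor-like types expose a shape.
    if not hasattr(checked_type, "shape"):
        return None  # caller must handle sequence/tuple values separately
    return tuple(checked_type.shape)

print(get_shape_or_none(TensorType((4,))))  # (4,)
print(get_shape_or_none(TupleType([])))     # None
```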
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
