taomiao opened a new issue, #16399:
URL: https://github.com/apache/tvm/issues/16399

   
   ### Expected behavior
   
   `torch.logical_or` should be converted successfully by the TVM Relay PyTorch frontend.
   
   ### Actual behavior
   
   ```
   self = <tvm.relay.frontend.pytorch.PyTorchOpConverter object at 
0x7f090d88aa40>
   op_names = {'aten::logical_or'}
   
       def report_missing_conversion(self, op_names):
           """Check if all ops in an input graph are supported by TVM"""
           known_ops = [
               "prim::Constant",
               "prim::GetAttr",
               "prim::ListConstruct",
               "prim::ListUnpack",
               "prim::TupleConstruct",
               "prim::TupleUnpack",
               "prim::RaiseException",
               "prim::If",
               "prim::Loop",
           ]
           known_ops += list(self.convert_map.keys())
           known_ops += list(qnn_torch.convert_map.keys())
       
           missing = []
       
           for op_name in op_names:
               # Also take care of in-place variant ops like aten::relu_
               if op_name not in known_ops and not (
                   op_name.endswith("_") and op_name[:-1] in known_ops
               ):
                   missing.append(op_name)
       
           if missing:
               msg = f"The following operators are not implemented: {missing}"
   >           raise NotImplementedError(msg)
   E           NotImplementedError: The following operators are not 
implemented: ['aten::logical_or']
   
   ../../../../python/tvm/relay/frontend/pytorch.py:4330: NotImplementedError
   ```
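   
   For context, the check that raises here can be reduced to a small standalone function. This is a simplified sketch of the logic in `report_missing_conversion`, not the actual TVM code: an op passes only if it is in the known-op list directly, or is an in-place variant (trailing underscore) of a known op.
   
   ```python
   def find_missing_ops(op_names, known_ops):
       """Simplified sketch of PyTorchOpConverter.report_missing_conversion."""
       missing = []
       for op_name in op_names:
           # An in-place variant like aten::relu_ is accepted if its
           # base form aten::relu is known.
           if op_name not in known_ops and not (
               op_name.endswith("_") and op_name[:-1] in known_ops
           ):
               missing.append(op_name)
       return missing
   
   # aten::logical_or is reported because neither it nor a base form of it
   # appears in the converter's known-op list.
   print(find_missing_ops({"aten::logical_or"}, {"aten::logical_and"}))
   # → ['aten::logical_or']
   ```
   
   So the fix would be registering an `aten::logical_or` entry in the converter's map; the in-place fallback cannot help here since the op has no trailing underscore.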
   
   ### Environment
   
   os: ubuntu
   python: 3.9
   pytorch: 2.0
   tvm: main branch
   
   ### Steps to reproduce
   
   ```
   import torch
   
   # verify_model is the helper from TVM's PyTorch frontend tests
   # (tests/python/frontend/pytorch/test_forward.py)
   
   def test_logical_or():
       """test_logical_or"""
   
       def test_fn(x, y):
           return torch.logical_or(x, y)
   
       a = torch.tensor([0, 1, 10, 0], dtype=torch.int8)
       b = torch.tensor([4, 0, 1, 0], dtype=torch.int8)
       verify_model(test_fn, [a, b])
   
       a = torch.tensor([True, False, True])
       b = torch.tensor([True, False, False])
       verify_model(test_fn, [a, b])
   ```
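   
   For reference, the expected element-wise results of the two calls above can be reproduced with NumPy, assuming `torch.logical_or`'s documented semantics (nonzero values are treated as `True`):
   
   ```python
   import numpy as np
   
   # int8 inputs: any nonzero element counts as True
   a = np.array([0, 1, 10, 0], dtype=np.int8)
   b = np.array([4, 0, 1, 0], dtype=np.int8)
   print(np.logical_or(a, b))  # → [ True  True  True False]
   
   # boolean inputs
   a = np.array([True, False, True])
   b = np.array([True, False, False])
   print(np.logical_or(a, b))  # → [ True False  True]
   ```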
   
   ### Triage
   
   
   * needs-triage
   * frontend:pytorch
   

