ganler opened a new issue #10528:
URL: https://github.com/apache/tvm/issues/10528


   
   ### Expected behavior
   
   ```python
   """
   def @main(%x: Tensor[(2, 2, 1, 1), float32]) -> Tensor[(2, 3, 4), float32] {
     %0 = nn.conv2d(%x, meta[relay.Constant][0] /* ty=Tensor[(1, 2, 3, 1), float32] */, strides=[2, 2], padding=[3, 3, 3, 3]) /* ty=Tensor[(2, 1, 3, 4), float32] */;
     squeeze(%0, axis=[1]) /* ty=Tensor[(2, 3, 4), float32] */
   }
   """
   ```
   
   This simple conv2d + squeeze model should compile, but it fails.
   
   ### Actual behavior
   
   ```
  5: tvm::relay::MixedModeMutator::VisitLeaf(tvm::RelayExpr const&)
        at /home/ganler/Documents/tvm/src/relay/ir/expr_functor.cc:81
  4: tvm::relay::TempRealizer::DispatchVisitExpr(tvm::RelayExpr const&)
        at /home/ganler/Documents/tvm/src/relay/transforms/forward_rewrite.cc:46
  3: tvm::relay::LayoutAlternatedExprNode<tvm::relay::alter_op_layout::AlterTransformMemorizer>::Realize() const
        at /home/ganler/Documents/tvm/src/relay/transforms/transform_layout.h:183
  2: tvm::relay::TransformMemorizer::Transform(tvm::RelayExpr, tvm::tir::Layout const&, tvm::tir::Layout const&)
        at /home/ganler/Documents/tvm/src/relay/transforms/transform_layout.h:115
  1: tvm::relay::TransformMemorizer::TransformHelper(tvm::RelayExpr, tvm::tir::Layout, tvm::tir::Layout)
        at /home/ganler/Documents/tvm/src/relay/transforms/transform_layout.h:157
  0: tvm::tir::BijectiveLayout::BijectiveLayout(tvm::tir::Layout, tvm::tir::Layout)
        at /home/ganler/Documents/tvm/src/tir/ir/data_layout.cc:421
  File "/home/ganler/Documents/tvm/src/tir/ir/data_layout.cc", line 422
TVMError: 
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (GetStoreRule(&n->index_backward_rule, &n->shape_backward_rule, n->dst_layout, n->src_layout)) is false: NHW1c  NHW
   ```
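   For what it's worth, the failing check asks for a store rule from `NHW1c` back to `NHW`: the subordinate `1c` axis in `NHW1c` implies a split of a primal `C` axis, but `NHW` has no `C` at all, so no bijective mapping can exist. A pure-Python sketch of that invariant (illustrative only, not TVM's actual implementation; the helper names are mine):

```python
def primal_axes(layout):
    """Upper-case (primal) axes of a layout string: 'NHW1c' -> {'N','H','W'}."""
    return {ch for ch in layout if ch.isupper()}

def subordinate_axes(layout):
    """Lower-case (subordinate) axes: 'NHW1c' -> {'c'}; each implies a primal 'C'."""
    return {ch for ch in layout if ch.islower()}

def store_rule_exists(src, dst):
    """A store rule from src to dst needs every axis src references
    (primal, or implied by a subordinate split) to be present in dst."""
    needed = primal_axes(src) | {ch.upper() for ch in subordinate_axes(src)}
    return needed <= primal_axes(dst)

print(store_rule_exists("NHW1c", "NHW"))    # False -- the failing pair from the error
print(store_rule_exists("NCHW4c", "NCHW"))  # True  -- a normal pack/unpack pair
```

   My guess is that the squeeze of the size-1 channel axis leaves the layout rewriter trying to map the packed `NHW1c` layout back onto a channel-less `NHW`, which trips this check.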
   
   ### Environment
   
   OS: Linux; compiler: Clang 14; TVM at commit 8f6fa8f2c41406cb54d01647ba8731e4ceb8f4ab.
   
   ### Steps to reproduce
   
   A minimal script that triggers the issue:
   
   ```python
   import numpy as np

   import tvm
   from tvm import relay

   xshape = (2, 2, 1, 1)
   x = relay.var("x", shape=xshape, dtype="float32")

   # Weight in OIHW layout: (1, 2, 3, 1); cast to float32 to match the input dtype.
   weight = relay.const(np.random.random((1, 2, 3, 1)).astype("float32"))
   v1 = relay.nn.conv2d(x, weight=weight, strides=[2, 2], padding=[3, 3, 3, 3], kernel_size=[3, 1], channels=1)
   out = relay.squeeze(v1, axis=[1])

   func = relay.Function([x], out)
   mod = tvm.IRModule.from_expr(func)

   with tvm.transform.PassContext(opt_level=4):
       relay.build_module.create_executor("graph", mod, tvm.cpu(), target="llvm").evaluate()
   ```
   
   This seems to be related to https://github.com/apache/tvm/pull/9996
   
   cc: @masahi @yangulei
   

