ANSHUMAN87 commented on pull request #7656:
URL: https://github.com/apache/tvm/pull/7656#issuecomment-798933062
> It seems to me that the current pattern already covers this case? You can
> refer to the test cases I added, which have a ReLU followed by transposes. Is
> this what you want?
Thanks @comaniac for the clarification. It is indeed supported.
Could we add one more test case like the one below? It would help reviewers confirm that this flow is supported:
```python
def before4():
    x = relay.var("x", shape=(1, 3, 224, 224), dtype="float32")  # NCHW
    y = relay.nn.relu(x)
    # y = relay.transpose(y, axes=[0, 2, 3, 1])
    y = relay.transpose(y)  # Reverse
    y = relay.transpose(y)  # Reverse
    y = relay.transpose(y, axes=[0, 2, 3, 1])
    y = relay.transpose(y)  # Reverse
    y = relay.transpose(y)  # Reverse
    return relay.Function([x], y)

def expected4():
    x = relay.var("x", shape=(1, 3, 224, 224), dtype="float32")  # NCHW
    y = relay.nn.relu(x)
    y = relay.transpose(y, axes=[0, 2, 3, 1])
    return relay.Function([x], y)
```
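For intuition, the fold works because consecutive transposes compose as permutations, and `relay.transpose` with no `axes` reverses all dimensions. The pure-Python sketch below (not the TVM pass itself, just the permutation algebra it relies on) shows the five transposes in `before4` collapsing to the single `[0, 2, 3, 1]` in `expected4`:

```python
def compose(a, b):
    # Applying transpose(axes=a) and then transpose(axes=b) is equivalent
    # to one transpose whose output axis i reads original axis a[b[i]].
    return [a[i] for i in b]

ndim = 4
identity = list(range(ndim))      # [0, 1, 2, 3]
reverse = identity[::-1]          # transpose() with no axes: [3, 2, 1, 0]

# The five transposes from before4(), in order:
net = identity
for axes in [reverse, reverse, [0, 2, 3, 1], reverse, reverse]:
    net = compose(net, axes)

print(net)  # [0, 2, 3, 1] -- the single transpose in expected4()
```

The two back-to-back reversals on either side cancel to the identity, leaving only the middle NCHW-to-NHWC permutation.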
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]