icemelon9 commented on a change in pull request #5826:
URL: https://github.com/apache/incubator-tvm/pull/5826#discussion_r448021317



##########
File path: src/relay/analysis/util.cc
##########
@@ -448,13 +448,8 @@ bool IsDataDependant(const CallNode* call) {
     return false;
   }
 
-  if (op->name == "reshape") {
-    if (const auto* attrs = call->attrs.as<ReshapeAttrs>()) {
-      if (attrs->newshape) {
-        // If newshape attribute exists, it isn't data dependant.
-        return false;
-      }
-    }
+  if (op->name == "dyn.reshape") {
+    return true;

Review comment:
      We can remove lines 451-452, since the shape function of `dyn.reshape` is 
registered as data dependent.
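      To illustrate why `dyn.reshape` is always data dependent (a hedged sketch in plain NumPy, not the TVM API): the static `reshape` carries `newshape` as a compile-time attribute, while `dyn.reshape` takes it as a runtime tensor input, so the output shape depends on the input *data*, not just the input *types*.

```python
import numpy as np

# Static reshape: the target shape is an attribute, fixed at compile time,
# so the output shape is known without looking at any runtime values.
def static_reshape(data):
    return data.reshape(2, 3)

# Dynamic reshape: the target shape arrives as a runtime tensor, so the
# output shape cannot be inferred until the shape tensor's values exist.
def dyn_reshape(data, newshape):
    return data.reshape(tuple(int(d) for d in newshape))

x = np.arange(6.0)
static_reshape(x)                     # output shape (2, 3), known statically
dyn_reshape(x, np.array([3, 2]))      # output shape depends on newshape's values
```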

##########
File path: python/tvm/relay/op/_tensor_grad.py
##########
@@ -511,7 +511,7 @@ def batch_matmul_grad(orig, grad):
 @register_gradient("reshape")
 def reshape_grad(orig, grad):
     """Gradient of reshape"""
-    return [reshape_like(grad, orig.args[0]), orig.args[1]]
+    return [reshape_like(grad, orig.args[0])]

Review comment:
      Do we need to register a gradient for `dyn.reshape` as well?
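      For context on why `reshape_like(grad, orig.args[0])` is the whole gradient (a NumPy sketch, not the Relay API): reshape is a bijective relabeling of elements, so the gradient with respect to the input is just the upstream gradient reshaped back to the input's shape. A gradient for `dyn.reshape` would presumably follow the same pattern.

```python
import numpy as np

x = np.arange(6.0).reshape(2, 3)     # forward input
y = x.reshape(3, 2)                  # forward reshape
g_y = np.arange(6.0).reshape(3, 2)   # upstream gradient w.r.t. y

# Element k of x maps to element k of y (row-major order is preserved),
# so the gradient w.r.t. x is g_y reshaped back to x's shape,
# i.e. the role played by reshape_like(grad, orig.args[0]).
g_x = g_y.reshape(x.shape)

assert g_x.shape == x.shape
```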

##########
File path: src/relay/op/tensor/transform.h
##########
@@ -38,7 +38,7 @@
 namespace tvm {
 namespace relay {
 
-extern Expr MakeReshape(Expr data, Expr newshape);
+extern Expr MakeReshape(Expr data, Array<Integer> newshape);

Review comment:
       ```suggestion
   Expr MakeReshape(Expr data, Array<Integer> newshape);
   ```




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

