altanh commented on a change in pull request #7323:
URL: https://github.com/apache/tvm/pull/7323#discussion_r564000478



##########
File path: python/tvm/relay/op/_tensor_grad.py
##########
@@ -357,16 +357,24 @@ def global_avg_pool2d_grad(orig, grad):
     return [pool_grad]
 
 
-# not implemented, this is only for testing.
 @register_gradient("concatenate")
 def concatenate_grad(orig, grad):
+    """
+    Returns the gradient of concatenate, which is just the downstream gradient
+    split across the inputs.
+    """
     assert len(orig.args) == 1
     t = orig.args[0]
-    x = TupleGetItem(t, 0)
-    y = TupleGetItem(t, 1)
-    # Assume only two element in tuple rn.
-    # In the real implementation, concatenate_grad probably need to be implemented by an operator.
-    return [Tuple([zeros_like(x), zeros_like(y)])]
+
+    # calculate split indices. TODO(@altanh): support Any?

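For illustration, the split-based gradient the patch introduces can be sketched with NumPy (a hypothetical standalone helper, not the TVM code): the upstream gradient is cut at the cumulative extents of the inputs along the concatenation axis.

```python
import numpy as np

def concat_grad_sketch(input_shapes, upstream_grad, axis=0):
    """Sketch of the concatenate gradient: split the upstream gradient
    back across the inputs at their original boundaries.
    (Hypothetical helper for illustration; assumes concrete shapes.)"""
    # Split indices are the running sums of input extents along the
    # concat axis, excluding the final total.
    sections = np.cumsum([s[axis] for s in input_shapes])[:-1]
    return np.split(upstream_grad, sections, axis=axis)

# Concatenating (2, 3) and (4, 3) along axis 0 gives a (6, 3) output,
# so the (6, 3) upstream gradient splits back into (2, 3) and (4, 3).
pieces = concat_grad_sketch([(2, 3), (4, 3)], np.ones((6, 3)), axis=0)
```

Note that computing `sections` requires concrete shapes, which is exactly where dynamic (`Any`) dimensions break down.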
Review comment:
       I can open an issue for this, but more generally many of the gradients 
(and even inference compute functions) assume concrete shapes. It may be worth 
creating a per-operator `Any` support tracking page if we want something 
centralized.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

