lixiaoquan commented on a change in pull request #6008:
URL: https://github.com/apache/incubator-tvm/pull/6008#discussion_r451997081



##########
File path: python/tvm/relay/op/algorithm.py
##########
@@ -82,9 +84,12 @@ def topk(data, k=1, axis=-1, ret_type="both",
     out : relay.Expr or List[relay.Expr]
         The computed result.
     """
-    if isinstance(k, int):
-        k = const(k, "int64")
-    out = _make.topk(data, k, axis, ret_type, is_ascend, dtype)
+    if isinstance(k, Constant):
+        k = np.asscalar(k.data.asnumpy())
+    if isinstance(k, Expr):

Review comment:
   > Current dyn namespace is for attrs to be Relay Expr which makes shape func data dependent. For data independent shape func we can still rely on "static" version op.
   
   I understand that. But it's possible that an op is input-shape-dependent while the input shape itself is dynamic. In that case, we still need a shape function.
   
   Here is a similar case that doesn't involve topk but has the same issue.
   ```python
   v = var("v", int32)
   t0 = arange(0, v, 1)   # outputs a dynamic-shape tensor
   t1 = strided_slice(t0, [0], [-1], [1])  # output shape depends on a dynamic input shape, even though the attrs are static
   ```
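   To make the point concrete, here is a pure-Python sketch (not TVM code; the helper name `strided_slice_out_len` is made up for illustration) of how a strided_slice shape function would compute its output extent. Even with fully static attrs `begin=0, end=-1, strides=1`, the result depends on the input extent, so it cannot be folded to a constant when the input shape is dynamic:
   ```python
   def strided_slice_out_len(in_len, begin, end, stride):
       """Output extent of a 1-D strided slice (Python slice semantics)."""
       # Normalize negative indices against the (possibly dynamic) input length.
       if begin < 0:
           begin += in_len
       if end < 0:
           end += in_len
       return max(0, (end - begin + stride - 1) // stride)

   # Static attrs (begin=0, end=-1, stride=1), yet the answer still
   # varies with in_len -- the shape of the dynamic input tensor.
   print(strided_slice_out_len(5, 0, -1, 1))   # 4
   print(strided_slice_out_len(10, 0, -1, 1))  # 9
   ```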




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
