zhiics commented on a change in pull request #4241: [Dynamic Shape] Add Graph Dispatcher
URL: https://github.com/apache/incubator-tvm/pull/4241#discussion_r362157429
##########
File path: python/tvm/relay/transform.py
##########
@@ -1032,3 +1032,40 @@ def visit_var(self, var):
else:
return var
return ChangeBatchMutator().visit(func)
+
+def add_dispatch_func(mod, func_name, input_shape, dispatch_func):
+ """Dispatch a global function in module with symbolic input shape.
+
+ Parameters
+ ----------
+ mod: tvm.relay.Module
+ Module object that contains the global function.
+
+ func_name: str
+ Name of the global function to dispatch.
+
+ input_shape: dict from str to tuple of relay.Expr or int
+ Dictionary mapping each input name to its shape tuple.
+
+ dispatch_func: Function
+ Function implementing the dispatching logic.
+
+ It takes the input shape dictionary as its argument and returns
+ a dict from input name to a dict from symbolic axis index to a
+ list of (low, high) intervals.
+
+ For example, for input shape {"data": (1, 3, h, w)}, the return
+ value should look like:
+ {
+ "data": {
+ 2: [(1, 9), (9, 17), (17, 25), (25, 33), ...],
+ 3: [(1, 17), (17, 33), (33, 49), ...],
+ }
+ }
+
+ Returns
+ -------
+ result: tvm.relay.Module
+ Module with updated global function.
+ """
+ return _transform.add_dispatch_func(mod, func_name, input_shape, dispatch_func)
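For reference, a minimal sketch of what a user-supplied `dispatch_func` could look like under the contract documented above. The `uniform_dispatch` name and its `step`/`num_buckets` parameters are hypothetical, and symbolic axes are represented here as strings rather than `relay.Expr` purely for illustration:

```python
def uniform_dispatch(input_shape, step=8, num_buckets=4):
    """Bucket every symbolic (non-int) axis into uniform intervals.

    Takes the input-shape dict and returns, per input name, a dict
    mapping each symbolic axis index to a list of (low, high) intervals,
    matching the structure described in the docstring.
    """
    result = {}
    for name, shape in input_shape.items():
        axes = {}
        for idx, dim in enumerate(shape):
            if not isinstance(dim, int):  # treat non-int dims as symbolic
                axes[idx] = [(1 + i * step, 1 + (i + 1) * step)
                             for i in range(num_buckets)]
        if axes:
            result[name] = axes
    return result
```

With the defaults, `uniform_dispatch({"data": (1, 3, "h", "w")})` buckets axes 2 and 3 into `[(1, 9), (9, 17), (17, 25), (25, 33)]` while leaving the static axes 0 and 1 untouched.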
Review comment:
It seems that we could directly pass a string or enum and default it to a more
commonly used one, e.g. `dispatch_strategy = "uniform"` or `"exp"`, instead of
passing a PackedFunc, right? By doing this we let C++ handle the dispatching
strategy, and the logic stays transparent to users.
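A hedged sketch of the bucket generation the two suggested strategy names could map to. The `make_intervals` helper and its `step`/`num_buckets` parameters are illustrative assumptions, not part of the actual TVM API:

```python
def make_intervals(strategy, step=8, num_buckets=4):
    """Generate (low, high) bucket intervals for one symbolic axis."""
    if strategy == "uniform":
        # fixed-width buckets: [1, 9), [9, 17), [17, 25), ...
        return [(1 + i * step, 1 + (i + 1) * step)
                for i in range(num_buckets)]
    if strategy == "exp":
        # exponentially growing buckets: [1, 2), [2, 4), [4, 8), ...
        return [(2 ** i, 2 ** (i + 1)) for i in range(num_buckets)]
    raise ValueError("unknown dispatch strategy: %s" % strategy)
```

Under this sketch a string argument fully determines the intervals, so the strategy could indeed live in C++ with only the name exposed to users.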
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services