kevinthesun commented on a change in pull request #5699:
URL: https://github.com/apache/incubator-tvm/pull/5699#discussion_r435748804
##########
File path: python/tvm/relay/frontend/tensorflow.py
##########
@@ -3194,6 +3191,55 @@ def _convert_operator(self, op_name, inputs, attrs,
raise NotImplementedError("Operator {} not implemented.".format(op_name))
return sym
+ def _licm_construct(self, loop_name, node_name):
+ """Construct a node by considering whether it is
+ loop invariant with the given while loop. If yes, we
+ generate a loop Variable. Otherwise, return regular
+ converted relay expression.
+
+ Parameters
+ ----------
+ loop_name : str
+ TensorFlow while loop name to be checked.
+
+ node_name : str
+ TensorFlow node name.
+
+ Returns
+ -------
+ out : relay.Expr or relay.Var
+ Converted relay expression or loop var.
+ """
+ actual_expr = self._backtrack_construct(node_name)
+        tn = node_name.split(':')
+        node_name = tn[0].split("^")[-1]
+        cloop_name = find_parent_loop_name(node_name, self._while_loop_name_set)
+
+        if loop_name in self._while_loop_name_set and not cloop_name.startswith(loop_name):
Review comment:
Indeed, when the user sets a tf op name such that the graph-def node name ends up in the format ```loop_name/xxx```, we can't tell whether the node actually belongs to that while loop. The problem is that the tf op name is embedded in the node name in the graph def, and there is no ```name``` attribute in the node attrs to disambiguate. For now I haven't found a better way to do LICM node construction. In practice, this case should be rare, since a while loop name is a complicated hierarchical combination of op and sub-graph names.
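To illustrate the ambiguity described above, here is a minimal standalone sketch of the name-prefix heuristic (a simplified stand-in for TVM's `find_parent_loop_name`, not the actual implementation): a user-chosen node name that merely shares the loop-name prefix is indistinguishable from a node genuinely inside the loop.

```python
def find_parent_loop_name(node_name, while_loop_name_set):
    """Return the longest known while-loop name that prefixes the node's
    scope, or '' if the node is not inside any known loop (simplified sketch)."""
    # Drop the final path component to get the node's enclosing scope.
    name_prefix = node_name.rsplit('/', 1)[0]
    # Control-dependency inputs are written as "^name" in graph defs.
    if name_prefix.startswith("^"):
        name_prefix = name_prefix[1:]
    ploop_name = ""
    # Pick the longest loop name that is a string prefix of the scope.
    for lname in while_loop_name_set:
        if name_prefix.startswith(lname) and len(lname) > len(ploop_name):
            ploop_name = lname
    return ploop_name

loops = {"while/while"}
# A node genuinely inside the loop scope:
print(find_parent_loop_name("while/while/add", loops))       # -> "while/while"
# A user-chosen name that merely *looks* loop-scoped -- the prefix
# check cannot distinguish this false positive:
print(find_parent_loop_name("while/while_fake/op", loops))   # -> "while/while"
# A node clearly outside any known loop:
print(find_parent_loop_name("other/add", loops))             # -> ""
```

The last-but-one call shows exactly the rare collision discussed: a pure string-prefix test has no way to know that `while/while_fake` is a user-picked name rather than a sub-scope of the loop.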
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]