comaniac commented on a change in pull request #7845:
URL: https://github.com/apache/tvm/pull/7845#discussion_r612991959
##########
File path: tests/python/frontend/pytorch/test_forward.py
##########
@@ -219,6 +219,21 @@ def verify_model(model_name, input_data=[],
custom_convert_map={}, rtol=1e-5, at
assert_shapes_match(baseline_output, compiled_output)
tvm.testing.assert_allclose(baseline_output, compiled_output,
rtol=rtol, atol=atol)
+
+ if len(expected_ops) != 0:
Review comment:
```suggestion
if expected_ops:
```
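For context, the suggestion relies on Python truthiness: empty containers are falsy, so `if expected_ops:` is equivalent to the explicit length check and is the idiomatic form. A minimal illustration (the values are made up for demonstration):

```python
# Empty list/dict/set -> falsy; non-empty -> truthy.
expected_ops = []
assert not expected_ops                      # empty container is falsy

expected_ops = ["nn.conv2d"]
assert expected_ops                          # non-empty container is truthy

# The two conditions agree in both cases:
assert (len(expected_ops) != 0) == bool(expected_ops)
```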
##########
File path: tests/python/frontend/pytorch/test_forward.py
##########
@@ -219,6 +219,21 @@ def verify_model(model_name, input_data=[],
custom_convert_map={}, rtol=1e-5, at
assert_shapes_match(baseline_output, compiled_output)
tvm.testing.assert_allclose(baseline_output, compiled_output,
rtol=rtol, atol=atol)
+
+ if len(expected_ops) != 0:
+ found_op = dict.fromkeys(expected_ops, False)
+ def visit(op):
+ if isinstance(op, tvm.ir.op.Op):
+ if op.name in expected_ops:
+ found_op[op.name] = True
+
+ tvm.relay.analysis.post_order_visit(mod['main'].body, visit)
+
+ for op_name, is_found in enumerate(found_op):
+ if not is_found:
+ msg = "TVM Relay do not contain expected op [{}]"
+ raise AssertionError(msg.format(op_name))
Review comment:
It would be better to collect all the ops that were not found and raise a single error listing them, so a failing test reports every missing op at once instead of stopping at the first one.
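A sketch of what that could look like (the helper name `assert_ops_found` is hypothetical, not part of the PR; it takes the `found_op` dict built in the diff above). Note that iterating the dict needs `.items()` rather than `enumerate` to get `(name, flag)` pairs:

```python
def assert_ops_found(found_op):
    """Raise one AssertionError naming every expected op that was not found.

    `found_op` maps op name -> bool, as built by the PR's visitor.
    """
    # Collect every op whose flag is still False, then fail once.
    missing = [name for name, is_found in found_op.items() if not is_found]
    if missing:
        raise AssertionError(
            "TVM Relay graph does not contain expected op(s): {}".format(
                ", ".join(missing)
            )
        )

# Example: passes silently when everything was found,
# otherwise names all missing ops in a single message.
assert_ops_found({"add": True, "nn.conv2d": True})
```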
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]