tkonolige commented on a change in pull request #7308:
URL: https://github.com/apache/tvm/pull/7308#discussion_r563847488
##########
File path: python/tvm/topi/cuda/sparse.py
##########
@@ -295,7 +295,14 @@ def is_valid_for_sparse_dense_padded(data, weight_data):
"""
# pylint:disable=invalid-name
warp_size = int(tvm.target.Target.current(allow_none=False).thread_warp_size)
- m = get_const_tuple(data.checked_type.shape)[1]
+ # If there are multiple alter_ops in a model, the first alteration does not
+ # run type inference for the subsequent ones. In this case, we don't have
+ # the shape information, so we run the inferencer manually.
+ try:
Review comment:
@ANSHUMAN87, yes, as I said, it throws an exception:
```
  File "tvm/_ffi/_cython/./packed_func.pxi", line 56, in tvm._ffi._cy3.core.tvm_callback
  File "/home/tristan/octoml/tvm/python/tvm/relay/op/nn/_nn.py", line 89, in alter_op_layout_sparse_dense
    return topi.nn.sparse_dense_alter_layout(attrs, inputs, tinfos, out_type)
  File "<decorator-gen-73>", line 2, in sparse_dense_alter_layout
  File "/home/tristan/octoml/tvm/python/tvm/target/generic_func.py", line 275, in dispatch_func
    return dispatch_dict[k](*args, **kwargs)
  File "/home/tristan/octoml/tvm/python/tvm/topi/cuda/sparse.py", line 408, in _alter_sparse_dense_layout
    and is_valid_for_sparse_dense_padded(inputs[0], inputs[1].data.asnumpy())
  File "/home/tristan/octoml/tvm/python/tvm/topi/cuda/sparse.py", line 301, in is_valid_for_sparse_dense_padded
    if hasattr(data, "checked_type"):
  File "/home/tristan/octoml/tvm/python/tvm/ir/expr.py", line 50, in checked_type
    raise ValueError("The type checker has not populated" " the checked_type for this node")
ValueError: The type checker has not populated the checked_type for this node
```
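For illustration only, here is a minimal, self-contained sketch of the guard pattern the diff comment describes: first try to read the already-populated `checked_type` shape, and fall back to running type inference manually when accessing it raises `ValueError`. The class `FakeExpr`, the helper `infer_type_stub`, and the shapes used are all hypothetical stand-ins, not TVM's actual API.

```python
# Hedged sketch (not the actual TVM code): simulate an expression whose
# checked_type may not have been populated by the type checker yet.

class FakeExpr:
    """Hypothetical stand-in for a Relay expression."""

    def __init__(self, shape=None):
        # shape is None when the type checker has not run on this node
        self._shape = shape

    @property
    def checked_type_shape(self):
        if self._shape is None:
            # Mirrors the error raised in tvm/ir/expr.py
            raise ValueError(
                "The type checker has not populated the checked_type for this node"
            )
        return self._shape


def infer_type_stub(expr):
    """Hypothetical stand-in for running the type inferencer manually."""
    # Pretend inference discovered a (batch, features) shape.
    return FakeExpr(shape=(8, 16))


def get_m(data):
    """Read the second shape dimension, inferring types on demand."""
    try:
        shape = data.checked_type_shape
    except ValueError:
        # Type info missing (e.g. a later alter_op in the same model):
        # run the inferencer manually, then retry.
        data = infer_type_stub(data)
        shape = data.checked_type_shape
    return shape[1]
```

The point of the `try`/`except` is that the fast path (type info already present) costs nothing extra, while the slow path only runs inference when the exception shown in the traceback above would otherwise propagate.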
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]