comaniac opened a new issue #7090:
URL: https://github.com/apache/tvm/issues/7090
## Description
When compiling a Conv2D with a non-square input image, the weight transform is buggy and produces a transformed weight with the wrong shape. For example, the input shape (1, 23, 40, 128) results in alpha=4 for the transformed weight. The kernel size inferred from this value is `alpha - tile_size + 1 = 4 - 4 + 1 = 1`, which is no longer 3, so the assertion in `_conv2d_winograd_nhwc_impl` fails.
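In the Winograd NHWC implementation, when the weight is pre-computed the kernel size is inferred back from the transformed weight's leading dimension, so a wrong alpha directly trips the kernel-size check. A minimal sketch of that arithmetic (plain Python, not the actual TVM code):

```python
# Winograd F(m, r): the transform size alpha = tile_size + kernel - 1.
def expected_alpha(tile_size, kernel):
    return tile_size + kernel - 1

# Inverse relation used when the weight is pre-computed: the kernel
# size is inferred from the transformed weight's leading dimension.
def inferred_kernel(alpha, tile_size):
    return alpha - tile_size + 1

tile_size = 4

# Correct case: a 3x3 kernel with tile_size=4 needs alpha = 6.
assert expected_alpha(tile_size, 3) == 6
assert inferred_kernel(6, tile_size) == 3

# Buggy case reported here: the transform yields alpha = 4, so the
# inferred kernel size is 1 instead of 3, which violates the
# `KH == 3 and KW == 3` assertion in _conv2d_winograd_nhwc_impl.
assert inferred_kernel(4, tile_size) == 1
```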
## Error Log
```
Exception in thread Thread-1:
Traceback (most recent call last):
File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/usr/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File
"/home/ubuntu/cody-tvm/python/tvm/auto_scheduler/relay_integration.py", line
61, in call_all_topi_funcs
grc.codegen(opt_mod["main"])
File
"/home/ubuntu/cody-tvm/python/tvm/relay/backend/graph_runtime_codegen.py", line
83, in codegen
self._codegen(func)
File "tvm/_ffi/_cython/./packed_func.pxi", line 321, in
tvm._ffi._cy3.core.PackedFuncBase.__call__
File "tvm/_ffi/_cython/./packed_func.pxi", line 256, in
tvm._ffi._cy3.core.FuncCall
File "tvm/_ffi/_cython/./packed_func.pxi", line 245, in
tvm._ffi._cy3.core.FuncCall3
File "tvm/_ffi/_cython/./base.pxi", line 160, in tvm._ffi._cy3.core.CALL
tvm._ffi.base.TVMError: Traceback (most recent call last):
[bt] (8)
/home/ubuntu/cody-tvm/build/libtvm.so(tvm::relay::backend::MemoizedExprTranslator<tvm::runtime::Array<tvm::te::Tensor,
void> >::VisitExpr(tvm::RelayExpr const&)+0xa9) [0x7fce58f91e69]
[bt] (7)
/home/ubuntu/cody-tvm/build/libtvm.so(tvm::relay::ExprFunctor<tvm::runtime::Array<tvm::te::Tensor,
void> (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&)+0x82)
[0x7fce58f91be2]
[bt] (6)
/home/ubuntu/cody-tvm/build/libtvm.so(tvm::relay::ExprFunctor<tvm::runtime::Array<tvm::te::Tensor,
void> (tvm::RelayExpr const&)>::InitVTable()::{lambda(tvm::runtime::ObjectRef
const&, tvm::relay::ExprFunctor<tvm::runtime::Array<tvm::te::Tensor, void>
(tvm::RelayExpr const&)>*)#6}::_FUN(tvm::runtime::ObjectRef const&,
tvm::relay::ExprFunctor<tvm::runtime::Array<tvm::te::Tensor, void>
(tvm::RelayExpr const&)>*)+0x27) [0x7fce58f832e7]
[bt] (5)
/home/ubuntu/cody-tvm/build/libtvm.so(tvm::relay::ScheduleGetter::VisitExpr_(tvm::relay::CallNode
const*)+0x14f) [0x7fce58f89a8f]
[bt] (4)
/home/ubuntu/cody-tvm/build/libtvm.so(tvm::relay::backend::MemoizedExprTranslator<tvm::runtime::Array<tvm::te::Tensor,
void> >::VisitExpr(tvm::RelayExpr const&)+0xa9) [0x7fce58f91e69]
[bt] (3)
/home/ubuntu/cody-tvm/build/libtvm.so(tvm::relay::ExprFunctor<tvm::runtime::Array<tvm::te::Tensor,
void> (tvm::RelayExpr const&)>::VisitExpr(tvm::RelayExpr const&)+0x82)
[0x7fce58f91be2]
[bt] (2)
/home/ubuntu/cody-tvm/build/libtvm.so(tvm::relay::ExprFunctor<tvm::runtime::Array<tvm::te::Tensor,
void> (tvm::RelayExpr const&)>::InitVTable()::{lambda(tvm::runtime::ObjectRef
const&, tvm::relay::ExprFunctor<tvm::runtime::Array<tvm::te::Tensor, void>
(tvm::RelayExpr const&)>*)#6}::_FUN(tvm::runtime::ObjectRef const&,
tvm::relay::ExprFunctor<tvm::runtime::Array<tvm::te::Tensor, void>
(tvm::RelayExpr const&)>*)+0x27) [0x7fce58f832e7]
[bt] (1)
/home/ubuntu/cody-tvm/build/libtvm.so(tvm::relay::ScheduleGetter::VisitExpr_(tvm::relay::CallNode
const*)+0x714) [0x7fce58f8a054]
[bt] (0) /home/ubuntu/cody-tvm/build/libtvm.so(+0x1230f0b) [0x7fce5913cf0b]
File "tvm/_ffi/_cython/./packed_func.pxi", line 55, in
tvm._ffi._cy3.core.tvm_callback
File "/home/ubuntu/cody-tvm/python/tvm/relay/backend/compile_engine.py",
line 300, in lower_call
best_impl, outputs = select_implementation(op, call.attrs, inputs,
ret_type, target)
File "/home/ubuntu/cody-tvm/python/tvm/relay/backend/compile_engine.py",
line 205, in select_implementation
outs = best_plevel_impl.compute(attrs, inputs, out_type)
File "/home/ubuntu/cody-tvm/python/tvm/relay/op/op.py", line 90, in compute
return _OpImplementationCompute(self, attrs, inputs, out_type)
File "tvm/_ffi/_cython/./packed_func.pxi", line 321, in
tvm._ffi._cy3.core.PackedFuncBase.__call__
File "tvm/_ffi/_cython/./packed_func.pxi", line 266, in
tvm._ffi._cy3.core.FuncCall
File "tvm/_ffi/_cython/./base.pxi", line 160, in tvm._ffi._cy3.core.CALL
[bt] (3) /home/ubuntu/cody-tvm/build/libtvm.so(TVMFuncCall+0x65)
[0x7fce59140485]
[bt] (2) /home/ubuntu/cody-tvm/build/libtvm.so(+0x1140ca8) [0x7fce5904cca8]
[bt] (1)
/home/ubuntu/cody-tvm/build/libtvm.so(tvm::relay::OpImplementation::Compute(tvm::Attrs
const&, tvm::runtime::Array<tvm::te::Tensor, void> const&, tvm::Type
const&)+0xb1) [0x7fce5904ca71]
[bt] (0) /home/ubuntu/cody-tvm/build/libtvm.so(+0x1230f0b) [0x7fce5913cf0b]
File "tvm/_ffi/_cython/./packed_func.pxi", line 55, in
tvm._ffi._cy3.core.tvm_callback
File "/home/ubuntu/cody-tvm/python/tvm/relay/op/strategy/generic.py", line
214, in _compute_conv2d
return [topi_compute(*args)]
File "/home/ubuntu/cody-tvm/python/tvm/topi/nn/conv2d.py", line 1191, in
conv2d_winograd_nhwc_without_weight_transform
data, weight, strides, padding, dilation, out_dtype, pre_computed=True
File "<decorator-gen-57>", line 2, in conv2d_winograd_nhwc
File "/home/ubuntu/cody-tvm/python/tvm/target/generic_func.py", line 275,
in dispatch_func
return dispatch_dict[k](*args, **kwargs)
File "/home/ubuntu/cody-tvm/python/tvm/topi/cuda/conv2d_winograd.py", line
373, in conv2d_winograd_nhwc_cuda
data, weight, strides, padding, dilation, out_dtype, tile_size,
pre_computed
File "/home/ubuntu/cody-tvm/python/tvm/topi/nn/conv2d.py", line 1036, in
_conv2d_winograd_nhwc_impl
assert HSTR == 1 and WSTR == 1 and KH == 3 and KW == 3
TVMError: AssertionError
Get devices for measurement successfully!
Traceback (most recent call last):
File "tests/python/relay/test_auto_scheduler_tuning.py", line 68, in
<module>
test_tuning_cuda()
File "tests/python/relay/test_auto_scheduler_tuning.py", line 64, in
test_tuning_cuda
tune_network("winograd-test", "cuda")
File "tests/python/relay/test_auto_scheduler_tuning.py", line 37, in
tune_network
tuner = auto_scheduler.TaskScheduler(tasks, task_weights)
File "/home/ubuntu/cody-tvm/python/tvm/auto_scheduler/task_scheduler.py",
line 234, in __init__
assert len(self.tasks) != 0, "No tasks"
AssertionError: No tasks
```
## How to reproduce
1. Add the following branch to `tests/python/relay/test_auto_scheduler_task_extraction.py:get_network`:
```python
elif name == "winograd-test":
input_shape = [1, 23, 40, 128] # <-- modify
data = relay.var("data", shape=input_shape, dtype="float32")
net = relay.testing.layers.conv2d(
data=data,
channels=128, # <-- modify
kernel_size=3,
strides=1,
padding=1,
data_layout="NHWC",
kernel_layout="HWIO",
name="",
)
```
2. Run `python tests/python/relay/test_auto_scheduler_tuning.py`.
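For reference, the shapes this repro should produce — a sketch of the arithmetic assuming tile_size=4 and the 3x3 HWIO kernel above, not output from TVM:

```python
# Conv2D from the repro: NHWC input, HWIO 3x3 kernel, stride 1, pad 1.
N, H, W, CI = 1, 23, 40, 128
KH = KW = 3
CO = 128
pad, stride = 1, 1

# "Same" padding: the output spatial dims stay 23 x 40.
OH = (H + 2 * pad - KH) // stride + 1
OW = (W + 2 * pad - KW) // stride + 1
assert (OH, OW) == (23, 40)

# With tile_size = 4 the transformed weight should have shape
# (alpha, alpha, CI, CO) = (6, 6, 128, 128); the bug yields alpha = 4.
tile_size = 4
alpha = tile_size + KH - 1
assert alpha == 6
```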
cc @merrymercy @FrozenGene