mseth10 opened a new issue #18823:
URL: https://github.com/apache/incubator-mxnet/issues/18823
## Description
When a computation graph is partitioned, ops are grouped into subgraphs
based on the subgraph property. For CachedOp subgraphs containing reshape
and/or transpose op, the backward pass fails.
### Error Message
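To capture the native stack trace, the depth variable mentioned in the template can be set before importing mxnet, e.g.:

```
import os
# must be set before `import mxnet` so the native library picks it up
os.environ['DMLC_LOG_STACK_TRACE_DEPTH'] = '10'
```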
## To Reproduce
```
import mxnet as mx
from mxnet.gluon import HybridBlock
from mxnet.base import check_call, _LIB, c_str, mx_uint, c_str_array

class _TestBlock(HybridBlock):
    def __init__(self):
        super(_TestBlock, self).__init__()

    def hybrid_forward(self, F, data):
        return F.reshape(data + data, (-1,)).argmax(0)

if __name__ == '__main__':
    block = _TestBlock()
    subgraph_backend = 'default'
    op_names = ['elemwise_add', 'Reshape', 'argmax']
    check_call(_LIB.MXSetSubgraphPropertyOpNamesV2(c_str(subgraph_backend),
                                                   mx_uint(len(op_names)),
                                                   c_str_array(op_names)))
    block.hybridize(backend=subgraph_backend)
    data = mx.nd.ones((2, 3))  # example input; `data` was undefined in the original snippet
    data.attach_grad()
    with mx.autograd.record():
        result = block(data)
    result.backward()
```
## Environment
We recommend using our script for collecting the diagnostic information. Run
the following command and paste the outputs below:
```
curl --retry 10 -s https://raw.githubusercontent.com/dmlc/gluon-nlp/master/tools/diagnose.py | python
# paste outputs here
```