[GitHub] [incubator-tvm] lixiaoquan commented on a change in pull request #5429: [RELAY][TF] Support symbolic newshape for Reshape

2020-05-11 Thread GitBox


lixiaoquan commented on a change in pull request #5429:
URL: https://github.com/apache/incubator-tvm/pull/5429#discussion_r422950273



##
File path: python/tvm/relay/op/_transform.py
##
@@ -189,10 +192,9 @@ def _reshape_shape_func(data_shape, newshape, ndim):
         out[infer_idx] = old_size // new_size
     return out
 
-@_reg.register_shape_func("reshape", False)
+@_reg.register_shape_func("reshape", True)

Review comment:
   @icemelon9 I've addressed your concern. Could you please take another look? Thanks
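Registering the shape function with `True` marks reshape as data dependent: the runtime *values* of the newshape tensor, not just its static shape, are needed to compute the output shape. The special-value handling such a shape function performs can be sketched in plain Python (a hypothetical helper assuming at most one `-1`, not TVM's actual `_reshape_shape_func`):

```python
def reshape_shape(data_shape, newshape):
    """Pure-Python sketch of reshape output-shape inference.

    Handles the special values 0 (copy the input dimension) and
    -1 (infer the dimension from the remaining elements).
    """
    out = []
    infer_idx = -1
    for i, dim in enumerate(newshape):
        if dim == 0:           # copy the corresponding input dimension
            out.append(data_shape[i])
        elif dim == -1:        # placeholder, resolved below
            infer_idx = i
            out.append(1)
        else:
            out.append(dim)
    if infer_idx >= 0:
        old_size = 1
        for d in data_shape:
            old_size *= d
        new_size = 1
        for d in out:
            new_size *= d
        out[infer_idx] = old_size // new_size
    return out
```

For example, `reshape_shape((2, 3, 4), (0, -1))` copies the first input dimension and infers the second, yielding `[2, 12]`.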





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [incubator-tvm] lixiaoquan commented on a change in pull request #5429: [RELAY][TF] Support symbolic newshape for Reshape

2020-05-05 Thread GitBox


lixiaoquan commented on a change in pull request #5429:
URL: https://github.com/apache/incubator-tvm/pull/5429#discussion_r420541474



##
File path: tests/python/relay/test_any.py
##
@@ -138,23 +138,35 @@ def test_any_concat():
     result = ex.evaluate()(x_np, y_np)
     tvm.testing.assert_allclose(result.asnumpy(), ref)
 
-def verify_any_reshape(x_shape, newshape, x_np_shape, out_shape):
+def verify_any_reshape(x_shape, newshape, x_np_shape, out_shape, variable_newshape=False):
     x = relay.var('x', shape=x_shape, dtype="float32")
+    data = np.random.uniform(size=x_np_shape).astype('float32')
+    params = [x]
+    args = [data]
+
+    if variable_newshape:
+        newshape_var = relay.var('newshape', shape=(len(newshape),), dtype='int64')
+        params.append(newshape_var)
+        args.append(np.array(newshape, dtype='int64'))
+        newshape = newshape_var
+
     y = relay.reshape(x, newshape=newshape)
     mod = tvm.IRModule()
-    mod["main"] = relay.Function([x], y)
-    data = np.random.uniform(size=x_np_shape).astype('float32')
+    mod["main"] = relay.Function(params, y)
+
     for kind in ["debug", "vm"]:
         ex = relay.create_executor(kind, mod=mod, ctx=tvm.cpu(), target="llvm")
-        result = ex.evaluate()(data).asnumpy()
+        result = ex.evaluate()(*args).asnumpy()
         assert result.shape == out_shape
         tvm.testing.assert_allclose(result.flatten(), data.flatten())
 
 def test_any_reshape():
-    verify_any_reshape(any_dims(3), (1, -1), (2, 3, 4), (1, 24))
-    verify_any_reshape(any_dims(3), (0, -1), (2, 3, 4), (2, 12))
+    for variable_newshape in [False, True]:
+        # Variable newshape only supports that output rank is the same as newshape

Review comment:
   Do you mean adding some dynamic shape cases to test_op_level3? Cases in that file use the 'graph' executor now, and tests like test_arange() only support constant inputs too.
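The `variable_newshape=True` path above feeds the target shape to the executor as a runtime int64 tensor rather than a compile-time constant. The same calling pattern can be mirrored in plain NumPy (illustrative only; this bypasses TVM entirely):

```python
import numpy as np

def run_reshape(data, newshape_arr):
    # The target shape arrives as an int64 array at "runtime", mirroring
    # the variable_newshape=True branch of verify_any_reshape above.
    return np.reshape(data, tuple(int(d) for d in newshape_arr))

data = np.random.uniform(size=(2, 3, 4)).astype("float32")
out = run_reshape(data, np.array([1, -1], dtype="int64"))
assert out.shape == (1, 24)
np.testing.assert_allclose(out.flatten(), data.flatten())
```

Note that plain NumPy resolves `-1` but treats `0` as a literal zero-length dimension, whereas Relay's reshape gives `0` the "copy input dimension" meaning.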









[GitHub] [incubator-tvm] lixiaoquan commented on a change in pull request #5429: [RELAY][TF] Support symbolic newshape for Reshape

2020-05-05 Thread GitBox


lixiaoquan commented on a change in pull request #5429:
URL: https://github.com/apache/incubator-tvm/pull/5429#discussion_r420541295



##
File path: src/relay/transforms/pattern_util.h
##
@@ -270,7 +270,7 @@ inline Constant MakeConstantScalar(DataType dtype, T value) {
  */
 template <typename T>
 static inline Constant MakeConstantTensor(DataType dtype, std::vector<int64_t> shape,
-                                          std::vector<T> value) {
+                                          T value) {

Review comment:
   Sorry for the late reply. Updated.









[GitHub] [incubator-tvm] lixiaoquan commented on a change in pull request #5429: [RELAY][TF] Support symbolic newshape for Reshape

2020-05-01 Thread GitBox


lixiaoquan commented on a change in pull request #5429:
URL: https://github.com/apache/incubator-tvm/pull/5429#discussion_r418782618



##
File path: src/relay/op/tensor/transform.cc
##
@@ -504,17 +548,38 @@ bool ReshapeRel(const Array<Type>& types,
     return false;
   }
 
-  const auto* param = attrs.as<ReshapeAttrs>();
+  Array<IndexExpr> oshape;
   Array<IndexExpr> data_shape;
   Array<IndexExpr> newshape;
-  if (param->reverse) {
-    data_shape.assign(data->shape.rbegin(), data->shape.rend());
-    newshape.assign(param->newshape.rbegin(), param->newshape.rend());
+  const ConstantNode* attr_newshape;
+
+  if ((attr_newshape = param->newshape.as<ConstantNode>())) {
+    Array<Integer> temp;
+
+    CHECK_EQ(attr_newshape->data->ndim, 1);
+    for (int i = 0; i < attr_newshape->data->shape[0]; i++) {
+      temp.push_back(Integer(static_cast<int>(ToScalar(attr_newshape->data, i))));
+    }
+
+    if (param->reverse) {
+      data_shape.assign(data->shape.rbegin(), data->shape.rend());
+      newshape.assign(temp.rbegin(), temp.rend());
+    } else {
+      data_shape = data->shape;
+      newshape = temp;
+    }
   } else {
-    data_shape = data->shape;
-    newshape = param->newshape;
+    const auto* newshape = types[1].as<TensorTypeNode>();
+
+    // Doesn't support dynamic output rank
+    for (int i = 0; i < newshape->shape[0].as<IntImmNode>()->value; i++) {

Review comment:
   If the special values come from runtime, we can't know them in ReshapeRel, so I think there is an inherent limitation for a non-constant newshape. As long as the output's rank is the same as newshape's length, some special values still work.
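The split described here can be sketched in plain Python (a toy model, not the actual ReshapeRel implementation): at compile time only the newshape tensor's static length is visible, so type inference fixes the output rank and leaves every dimension symbolic; the special values are resolved later, at runtime, once the values arrive.

```python
ANY = "Any"  # stand-in for Relay's symbolic dimension

def infer_output_type(newshape_tensor_len):
    """Compile time: only the newshape tensor's length (= output rank)
    is known, so every output dimension stays symbolic."""
    return [ANY] * newshape_tensor_len

def resolve_at_runtime(data_shape, newshape_values):
    """Runtime: the values arrive, and special values (0, -1) can be
    resolved, as long as the rank matches what was inferred."""
    out = list(newshape_values)
    for i, d in enumerate(out):
        if d == 0:                     # copy the input dimension
            out[i] = data_shape[i]
    if -1 in out:                      # infer the remaining dimension
        known = 1
        for d in out:
            if d != -1:
                known *= d
        total = 1
        for d in data_shape:
            total *= d
        out[out.index(-1)] = total // known
    return out

static = infer_output_type(2)              # rank fixed at compile time
runtime = resolve_at_runtime((2, 3, 4), [0, -1])
assert len(runtime) == len(static)         # output rank must agree
assert runtime == [2, 12]
```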









[GitHub] [incubator-tvm] lixiaoquan commented on a change in pull request #5429: [RELAY][TF] Support symbolic newshape for Reshape

2020-05-01 Thread GitBox


lixiaoquan commented on a change in pull request #5429:
URL: https://github.com/apache/incubator-tvm/pull/5429#discussion_r418780844



##
File path: src/relay/transforms/fuse_ops.cc
##
@@ -234,6 +235,8 @@ class IndexedForwardGraph::Creator : private ExprVisitor {
     Node* node = graph_.node_map.at(call);
     static auto fpattern =
         Op::GetAttr<TOpPattern>("TOpPattern");
+    static auto tshape_dependant = Op::GetAttr<TShapeDataDependant>(
+        "TShapeDataDependant");

Review comment:
   Yes, this is used to prevent the result of a shape function from being fed to a data-dependent shape function.
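The reason is that a data-dependent shape function reads tensor *values*, while a shape function's output only describes shapes, so fusing the two would feed a shape where data is expected. The guard can be pictured as a toy predicate (hypothetical names, not fuse_ops.cc's actual API):

```python
def can_fuse(producer_is_shape_func_output, consumer_shape_dependant):
    """Toy version of the check guarded by TShapeDataDependant in
    IndexedForwardGraph::Creator: an op whose shape function is data
    dependent must not consume the output of another shape function."""
    if producer_is_shape_func_output and consumer_shape_dependant == "data":
        return False
    return True

assert can_fuse(True, "shape") is True    # shape-only consumer: fine
assert can_fuse(True, "data") is False    # data-dependent consumer: blocked
assert can_fuse(False, "data") is True    # real data producer: fine
```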









[GitHub] [incubator-tvm] lixiaoquan commented on a change in pull request #5429: [RELAY][TF] Support symbolic newshape for Reshape

2020-04-30 Thread GitBox


lixiaoquan commented on a change in pull request #5429:
URL: https://github.com/apache/incubator-tvm/pull/5429#discussion_r418341379



##
File path: include/tvm/relay/op_attr_types.h
##
@@ -81,10 +81,16 @@ using TOpIsStateful = bool;
  */
 using TNonComputational = bool;
 
+enum ShapeDependantKind {
+  kShapeDependantShape = 0,
+  kShapeDependantData = 1,
+  kShapeDependantBoth = 2,

Review comment:
   @icemelon9 This is solved; could you please take another look?









[GitHub] [incubator-tvm] lixiaoquan commented on a change in pull request #5429: [RELAY][TF] Support symbolic newshape for Reshape

2020-04-27 Thread GitBox


lixiaoquan commented on a change in pull request #5429:
URL: https://github.com/apache/incubator-tvm/pull/5429#discussion_r415554589



##
File path: include/tvm/relay/op_attr_types.h
##
@@ -81,10 +81,16 @@ using TOpIsStateful = bool;
  */
 using TNonComputational = bool;
 
+enum ShapeDependantKind {
+  kShapeDependantShape = 0,
+  kShapeDependantData = 1,
+  kShapeDependantBoth = 2,

Review comment:
   Thanks, it's solved now.









[GitHub] [incubator-tvm] lixiaoquan commented on a change in pull request #5429: [RELAY][TF] Support symbolic newshape for Reshape

2020-04-26 Thread GitBox


lixiaoquan commented on a change in pull request #5429:
URL: https://github.com/apache/incubator-tvm/pull/5429#discussion_r415261490



##
File path: include/tvm/relay/op_attr_types.h
##
@@ -81,10 +81,16 @@ using TOpIsStateful = bool;
  */
 using TNonComputational = bool;
 
+enum ShapeDependantKind {
+  kShapeDependantShape = 0,
+  kShapeDependantData = 1,
+  kShapeDependantBoth = 2,

Review comment:
   
https://github.com/apache/incubator-tvm/blob/6e1cd8256e461deca51e45e7aae33088e8f5b966/python/tvm/relay/op/_transform.py#L179
   This line uses a range() and I think it causes variable index.
   
   I tried to allocate a data shape tensor by tvm.te.placeholder(), but it 
can't be passed to tvm.lower() when this shape func is lowered.
   But data shape tensor allocated in this line will be stored in CachedNode 
object and sent to tvm.lower(), that's why I have to pass data shape tensor 
(https://github.com/apache/incubator-tvm/blob/2c1ca60ea38587401a20f11c5e64f452b79fa777/src/relay/backend/compile_engine.cc#L348)
   Please correct me if I missunderstood your suggestion about allocating a 
buffer.
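The "variable index" failure can be pictured with a toy tracer (pure-Python stand-ins, not TVM's hybrid parser): an unrolled `const_range`-style loop hands the body plain Python ints, which fold into constant indices, while a compiled `range`-style loop hands it one symbolic variable, which cannot be folded.

```python
class SymVar:
    """Toy stand-in for the symbolic loop variable of a compiled loop."""
    def __init__(self, name):
        self.name = name

def fold_index(shape, idx):
    """Fold shape[idx] to a constant if the index is a Python int;
    otherwise fail, which is what the hybrid parser reports as
    'All indices are supposed to be constants'."""
    if isinstance(idx, int):
        return shape[idx]  # constant-foldable access
    raise ValueError("All indices are supposed to be constants")

shape = (2, 3, 4)
# const_range-style: the loop is unrolled, so idx is a Python int each time
assert [fold_index(shape, i) for i in range(3)] == [2, 3, 4]
# range-style: a single symbolic index stands for the whole loop body
try:
    fold_index(shape, SymVar("i"))
except ValueError as e:
    assert "constants" in str(e)
```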









[GitHub] [incubator-tvm] lixiaoquan commented on a change in pull request #5429: [RELAY][TF] Support symbolic newshape for Reshape

2020-04-25 Thread GitBox


lixiaoquan commented on a change in pull request #5429:
URL: https://github.com/apache/incubator-tvm/pull/5429#discussion_r415197766



##
File path: include/tvm/relay/op_attr_types.h
##
@@ -81,10 +81,16 @@ using TOpIsStateful = bool;
  */
 using TNonComputational = bool;
 
+enum ShapeDependantKind {
+  kShapeDependantShape = 0,
+  kShapeDependantData = 1,
+  kShapeDependantBoth = 2,

Review comment:
   Thanks for your comment.
   
   I tried two ways to use the data tensor's shape directly.
   First, I passed the data tensor's shape to _reshape_shape_func, but I got this error:
   ```
   ValueError: All indices are supposed to be constants
   ```
   Second, I tried to construct a tensor with tvm.te.placeholder based on the data tensor's shape, but that doesn't work because it can't be passed to tvm.lower, which is called later.









[GitHub] [incubator-tvm] lixiaoquan commented on a change in pull request #5429: [RELAY][TF] Support symbolic newshape for Reshape

2020-04-25 Thread GitBox


lixiaoquan commented on a change in pull request #5429:
URL: https://github.com/apache/incubator-tvm/pull/5429#discussion_r415196802



##
File path: include/tvm/relay/op_attr_types.h
##
@@ -81,10 +81,16 @@ using TOpIsStateful = bool;
  */
 using TNonComputational = bool;
 
+enum ShapeDependantKind {
+  kShapeDependantShape = 0,
+  kShapeDependantData = 1,
+  kShapeDependantBoth = 2,

Review comment:
   Thanks for your comment
   
   I checked #4312, and tried to use data.shape the same ways as 
_strided_slice_shape_func(), but I got this error from hybrid script parser
   ```
   ValueError: All indices are supposed to be constants
   ```
   it seems _strided_slice_shape_func only uses const index when accessing 
data.shape, but _reshape_shape_func uses variables as indices to access 
data_shape
   
   I'd appreciate it if you can give me some hint about how to solve this.








