sunggg opened a new pull request, #14447:
URL: https://github.com/apache/tvm/pull/14447

   In Unity, we have a clear distinction between `tensor` and `shape`: the AST has `ShapeExpr` and `ShapeStructInfo`, and the runtime has the `ShapeTuple` container. Meanwhile, most operators and their TOPI implementations are defined over tensors. For example, `relax.take` is defined as follows:
   
   ```
   // Operator definition
   TVM_REGISTER_OP("relax.take")
       .set_attrs_type<TakeAttrs>()
       .set_num_inputs(2)
       .add_argument("x", "Tensor", "The source tensor.")
       .add_argument("indices", "Tensor", "The indices of the values to extract.")
       .set_attr<FInferStructInfo>("FInferStructInfo", InferStructInfoTake);

   // StructInfo inference: assumes both inputs are `Tensor`
   StructInfo InferStructInfoTake(const Call& call, const BlockBuilder& ctx) {
     Array<TensorStructInfo> input_sinfo = GetInputTensorStructInfo(call, ctx);
     TensorStructInfo data_sinfo = input_sinfo[0];
     TensorStructInfo indices_sinfo = input_sinfo[1];
     ...
   }

   // TOPI implementation
   inline Tensor take(const Tensor& a, const Tensor& indices, int batch_dims, int axis, ...)
   ```
   
   To allow shape computation with such operators, this PR introduces a `shape_to_tensor` op that converts a `ShapeTuple` to an `NDArray` at runtime. This enables common shape-computation patterns like the following:
   ```
   shape_var: R.Shape(ndim=4)
   lv: R.Tensor((4,), dtype="int64") = R.shape_to_tensor(shape_var)
   lv1: R.Tensor((1,), dtype="int64") = R.take(lv, indices, axis=0)
   lv2: R.Tensor((1, 1), dtype="int64") = R.expand_dims(lv1, axis=[0])
   gv: R.Tensor((1, 1), dtype="int64") = R.concat((lv2,), axis=0)
   ```
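
   Semantically, `shape_to_tensor` just materializes the shape values as a 1-D int64 tensor, after which ordinary tensor operators apply. The chain above can be mimicked in plain NumPy (a sketch of the intended runtime semantics only, not TVM code; the concrete shape `(1, 3, 224, 224)` is an illustrative assumption):
   ```
   import numpy as np

   # A ShapeTuple such as (1, 3, 224, 224) becomes a 1-D int64 array.
   shape_var = (1, 3, 224, 224)
   lv = np.array(shape_var, dtype="int64")    # ~ R.shape_to_tensor(shape_var)

   indices = np.array([0], dtype="int64")
   lv1 = np.take(lv, indices, axis=0)         # ~ R.take(lv, indices, axis=0)
   lv2 = np.expand_dims(lv1, axis=0)          # ~ R.expand_dims(lv1, axis=[0])
   gv = np.concatenate((lv2,), axis=0)        # ~ R.concat((lv2,), axis=0)

   print(gv)  # [[1]] -- the batch dimension extracted as a (1, 1) tensor
   ```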
   
   Currently, this op requires special handling in the `FoldConstant` pass, since that pass can only evaluate TIR PrimFuncs, not PackedFuncs. Once we extend `FoldConstant` to evaluate PackedFuncs as well, we should be able to remove this special handling.
   
   cc. @jwfromm @yongwww @psrivas2 @slyubomirsky
   

