sunggg opened a new pull request, #14493:
URL: https://github.com/apache/tvm/pull/14493

   This PR brings the `ScatterElement` op, which can be useful for preprocessing 
argument tensors in the frontend. 
   For example, `topi::dynamic_strided_slice` expects its argument tensors 
(e.g., `begin`, `end`) to have the same rank as the data tensor. (see 
[link](https://github.com/apache/tvm/blob/unity/include/tvm/topi/transform.h#L657-L661))
   ```c++
   inline Tensor dynamic_strided_slice(const Tensor& x, const Array<PrimExpr>& 
begin,
                                       const Array<PrimExpr>& end, const 
Array<PrimExpr>& strides,
                                       std::string name = 
"T_dynamic_strided_slice",
                                       std::string tag = kInjective) {
     const size_t src_tensor_dim = x->shape.size();
     ICHECK_LE(begin.size(), src_tensor_dim);
     ICHECK_LE(end.size(), src_tensor_dim);
     ICHECK_LE(strides.size(), src_tensor_dim);
     ICHECK_EQ(begin.size(), end.size());
     ICHECK_EQ(begin.size(), strides.size());
     ...
   ```
   Thus, the frontend often needs to preprocess these argument tensors to ensure 
that the constraints above are satisfied. 
   
   For example, suppose we have the following situation: 
   ```python
   data_rank = 4
   axes = [2,0]
   begin = [1,2]
   # We want to preprocess `begin` to 
   new_begin = [2, 0, 1, 0]  
   ```
   With `ScatterElement`, we can conveniently do so: 
   ```Python
   new_begin = R.scatter_elements(
       relax.const([0] * data_rank),  # data
       axes,    # indices
       begin,   # updates
       axis=0,
       reduction="update",
   )
   ```
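   To illustrate the semantics this relies on, here is a minimal NumPy sketch of a 1-D scatter along `axis=0` with `reduction="update"` (`scatter_elements_1d` is a hypothetical reference helper, not a TVM API), reproducing the `new_begin` computation above:
   ```python
   import numpy as np

   def scatter_elements_1d(data, indices, updates):
       # Reference semantics for a 1-D scatter with reduction="update":
       # out[indices[i]] = updates[i] for each position i.
       out = data.copy()
       for i, idx in enumerate(indices):
           out[idx] = updates[i]
       return out

   data_rank = 4
   axes = [2, 0]
   begin = [1, 2]

   new_begin = scatter_elements_1d(
       np.zeros(data_rank, dtype=np.int64),  # data: zeros of the data tensor's rank
       np.array(axes),                       # indices
       np.array(begin),                      # updates
   )
   print(new_begin.tolist())  # [2, 0, 1, 0]
   ```
   Positions not named in `axes` keep the default value `0` from the data argument, which is exactly why seeding with `relax.const([0] * data_rank)` produces a full-rank `begin`.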
   
   cc. @jwfromm @yongwww @psrivas2 
   
