roastduck commented on issue #848:
URL: https://github.com/apache/tvm/issues/848#issuecomment-745041181


   Sorry for bringing up the issue again. I think the top-level-reduction-only 
restriction still exists in TVM today.
   
   I noticed that `relay.transform.transform.FuseOps` fuses a `conv2d` and a 
following `relu` into a single function. If I understand correctly, this means 
that in the underlying TOPI implementation there will be a `relu` operation 
outside the `conv2d` reduction (scheduled via `inline`, `compute_at`, or 
something similar). For example:
   
   ```
   for (...) {           // spatial output loops
     x = 0;
     for (...) {         // reduction loop, now nested
       x += ...          // conv2d accumulation
     }
     x = relu(x)         // fused elementwise relu
   }
   ```
   
   So the reduction loop is no longer at the top level.
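   To make the loop structure concrete, here is a plain-Python sketch (this is 
not TVM/TOPI code; the 1-D convolution and the function names are purely 
illustrative) comparing the unfused form, where the reduction loop nest is 
followed by a separate elementwise pass, with the fused form described above:

   ```python
   def conv_then_relu(data, weight):
       """Unfused: reduction loop nest first, then a separate relu pass."""
       n, k = len(data), len(weight)
       out = [0.0] * (n - k + 1)
       for i in range(n - k + 1):      # output loop
           for j in range(k):          # reduction loop
               out[i] += data[i + j] * weight[j]
       for i in range(len(out)):       # separate elementwise relu pass
           out[i] = max(out[i], 0.0)
       return out

   def conv_fused_relu(data, weight):
       """Fused: relu applied inside the output loop, after the reduction."""
       n, k = len(data), len(weight)
       out = [0.0] * (n - k + 1)
       for i in range(n - k + 1):
           acc = 0.0
           for j in range(k):          # reduction is nested under the output loop
               acc += data[i + j] * weight[j]
           out[i] = max(acc, 0.0)      # relu fused at the same loop level
       return out
   ```

   Both versions compute the same result; the fusion only changes where the 
elementwise operation sits relative to the reduction loop.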
   
   Is the aforementioned fusion really meaningful? Or is the fusion only used 
for third-party backends such as cuDNN?

