kimm240 commented on PR #18508:
URL: https://github.com/apache/tvm/pull/18508#issuecomment-3579792073

   @wrongtest-intellif 
   This Pull Request (PR) extends activation-function support, following on from 
https://github.com/apache/tvm/pull/18418, which originally added ReLU support to 
the `FuseReductionEpilogue` primitive in the Apache TVM repository.
   
   Furthermore, this enhancement allows TVM to recognize standardized epilogue 
patterns such as `max(temp + bias, 0)` at the IR (Intermediate Representation) 
level, and to reconstruct them using an optimized approach: per-iteration ReLU 
semantics.
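   To make the pattern concrete, here is a minimal NumPy sketch (not the actual TVM API or TIR) of the kind of computation the primitive is described as recognizing: a reduction (a matmul in this hypothetical example) followed by a `max(temp + bias, 0)` epilogue, and the fused form in which the epilogue is applied inside the reduction's loop nest so the intermediate buffer is never materialized.

```python
# Illustrative sketch only; names and shapes are assumptions, not TVM code.
import numpy as np

def two_stage(A, B, bias):
    # Stage 1: reduction into a temporary buffer.
    temp = A @ B
    # Stage 2: separate epilogue block, max(temp + bias, 0).
    return np.maximum(temp + bias, 0.0)

def fused(A, B, bias):
    # After fusion, the epilogue runs inside the reduction's loop nest,
    # right after each output element finishes accumulating, so `temp`
    # is never materialized as a full buffer.
    M, K = A.shape
    _, N = B.shape
    out = np.empty((M, N), dtype=np.float64)
    for i in range(M):
        for j in range(N):
            acc = 0.0
            for k in range(K):
                acc += A[i, k] * B[k, j]
            out[i, j] = max(acc + bias[j], 0.0)  # fused ReLU epilogue
    return out
```

   Note the epilogue is still applied once per completed output element, after its accumulation finishes, which is why the two forms compute identical results.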


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
