AndrewZhaoLuo commented on PR #11700:
URL: https://github.com/apache/tvm/pull/11700#issuecomment-1171542749

   Hmm, yeah, this makes sense. I would expect the LUT to be slower than ReLU, 
since it requires more memory accesses.
   
   I suspect the activation functions just don't take much time to begin with; 
ReLU is probably about as fast as an activation can get. You could estimate the 
upper bound on the possible speedup by removing all activations.
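   As a minimal sketch of the comparison (not from the PR; the 256-entry uint8 table and the zero point of 128 are assumptions purely for illustration), a microbenchmark like this shows why a gather-based LUT activation tends to lose to a plain ReLU:

```python
# Hypothetical microbenchmark: plain ReLU vs. a 256-entry lookup-table
# activation on quantized uint8 data. The LUT does one memory gather per
# element, which is why it can be slower than a simple max.
import time
import numpy as np

x = np.random.randint(0, 256, size=1_000_000, dtype=np.uint8)

def relu(v):
    # Quantized ReLU: clamp at the (assumed) zero point of 128.
    return np.maximum(v, np.uint8(128))

# Precompute the table once; applying it is a single gather per element.
lut = np.maximum(np.arange(256, dtype=np.uint8), np.uint8(128))

def lut_act(v):
    return lut[v]

def bench(fn, reps=50):
    start = time.perf_counter()
    for _ in range(reps):
        fn(x)
    return (time.perf_counter() - start) / reps

print(f"relu: {bench(relu) * 1e3:.3f} ms  lut: {bench(lut_act) * 1e3:.3f} ms")
```

   Both functions produce identical outputs here, so any timing gap is purely the cost of the table lookup; dropping the activation entirely would give the upper bound the comment describes.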
   
   Still, technically a bit of an improvement!

