YuhengHuang42 opened a new pull request #8192:
URL: https://github.com/apache/tvm/pull/8192


   This PR fixes the PReLU bug reported in 
https://github.com/apache/tvm/issues/8184
   Reference: 
https://pytorch.org/docs/stable/generated/torch.nn.PReLU.html#torch.nn.PReLU
   
   In short, there are two situations we need to handle:
   1. The input has fewer than 2 dims. PyTorch's PReLU handles this case, but 
the default axis of `tvm.relay.nn.prelu` is 1, so calling it directly on such 
an input raises an error.
   2. The input has 2 or more dims, the channel count is greater than 1, and 
num_parameters = 1. In this case we need to broadcast the alpha parameter 
across the channels.
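   To illustrate the two cases, here is a minimal NumPy sketch of the intended 
PReLU semantics (this is illustrative reference code, not the actual TVM 
frontend implementation; `prelu_ref` is a hypothetical helper):

   ```python
   import numpy as np

   def prelu_ref(x, alpha, axis=1):
       """Reference PReLU: out = x if x >= 0 else alpha * x,
       with alpha broadcast along the channel axis."""
       if x.ndim < 2:
           # Case 1: no channel axis, so apply the single alpha elementwise
           # (num_parameters must be 1 here).
           a = alpha.reshape(())
       else:
           # Case 2: a single alpha is broadcast across all channels on `axis`.
           if alpha.size == 1:
               alpha = np.broadcast_to(alpha, (x.shape[axis],))
           shape = [1] * x.ndim
           shape[axis] = x.shape[axis]
           a = alpha.reshape(shape)
       return np.where(x >= 0, x, a * x)

   alpha = np.array([0.25])
   x1 = np.array([-1.0, 2.0])                      # 1-D input (case 1)
   x2 = np.arange(-4.0, 4.0).reshape(1, 2, 2, 2)   # NCHW input (case 2)
   print(prelu_ref(x1, alpha))        # [-0.25  2.  ]
   print(prelu_ref(x2, alpha).shape)  # (1, 2, 2, 2)
   ```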
   
   @masahi  Please take a look at this PR, thanks!
   
   

