anijain2305 commented on issue #4508: Broadcasting is broken with LayoutTransform
URL: https://github.com/apache/incubator-tvm/issues/4508#issuecomment-568597152
 
 
   https://github.com/apache/incubator-tvm/pull/4577
   
   This is not just an expand_dims problem. The input shape is (1,); if we 
transform it from layout C to NCHW8c, we first need to `repeat` the input to 
make its shape at least (8,). 
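   To illustrate why (a NumPy sketch, not the actual TVM pass): a C -> C8c 
layout transform splits the channel axis into (C//8, 8), so the channel 
dimension must be a multiple of 8, and a (1,) input would first have to be 
repeated:

```python
import numpy as np

x = np.array([3.0])        # input of shape (1,), layout C

# A C -> C8c transform splits C into (C//8, 8). That requires the channel
# dimension to be a multiple of 8, so the (1,) input must first be
# repeated up to at least (8,):
x8 = np.repeat(x, 8)       # shape (8,)
c8c = x8.reshape(1, 8)     # shape (1, 8), i.e. layout C8c

assert c8c.shape == (1, 8)
```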
   
   Instead, a simpler approach is to skip inserting expand_dims and layout 
transforms altogether for scalars. A scalar can easily be broadcast even when 
the second tensor's layout has been transformed. The above PR does just that.
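   A NumPy sketch of that observation (the NCHW -> NCHW8c transform below is 
an illustrative reshape/transpose, not TVM's layout_transform): a (1,)-shaped 
scalar broadcasts against the tensor in either layout with no extra ops:

```python
import numpy as np

scalar = np.array([2.0])   # scalar-like input, shape (1,)
nchw = np.arange(256, dtype="float32").reshape(1, 16, 4, 4)

# Illustrative NCHW -> NCHW8c transform: split C into (C//8, 8) and move
# the 8-wide sub-channel axis to the end.
nchw8c = nchw.reshape(1, 2, 8, 4, 4).transpose(0, 1, 3, 4, 2)

# The scalar broadcasts against either layout without any expand_dims
# or layout transform applied to it:
out_nchw = nchw * scalar
out_nchw8c = nchw8c * scalar

assert out_nchw.shape == (1, 16, 4, 4)
assert out_nchw8c.shape == (1, 2, 4, 4, 8)
```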

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services