sxjscience commented on issue #18043: [Performance][Numpy] np.einsum can be 500 - 1000 times slower than torch.einsum
URL: https://github.com/apache/incubator-mxnet/issues/18043#issuecomment-613053572

@ptrendx Would you have time to take a look? I'm planning to use einsum in the NumPy version of GluonNLP and have found that our einsum operator's performance is quite poor.
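For context, a minimal benchmark sketch of the kind of comparison behind such a report. The shapes and the contraction pattern (`bij,bjk->bik`, a batched matrix multiply) are hypothetical, not taken from the issue, and this uses plain NumPy rather than `mxnet.numpy` so it runs anywhere; the same harness could be pointed at `mxnet.np.einsum` or `torch.einsum`:

```python
import time
import numpy as np

def bench(fn, *args, repeat=5):
    """Return the best wall-clock time (seconds) of `repeat` runs of fn(*args)."""
    fn(*args)  # warm-up run, excluded from timing
    times = []
    for _ in range(repeat):
        t0 = time.perf_counter()
        fn(*args)
        times.append(time.perf_counter() - t0)
    return min(times)

# Hypothetical batched matmul expressed as an einsum contraction.
rng = np.random.default_rng(0)
a = rng.random((32, 64, 64))
b = rng.random((32, 64, 64))

t_einsum = bench(lambda x, y: np.einsum("bij,bjk->bik", x, y), a, b)
t_matmul = bench(np.matmul, a, b)

# The two paths must agree numerically; only their speed differs.
assert np.allclose(np.einsum("bij,bjk->bik", a, b), a @ b)
print(f"einsum: {t_einsum:.6f}s  matmul: {t_matmul:.6f}s")
```

Timing the best of several runs after a warm-up avoids counting one-time allocation or JIT/cache effects, which matters when the gap being measured is in the 500-1000x range.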
