arcadiaphy commented on issue #15007: Add matrix determinant operator in linalg
URL: https://github.com/apache/incubator-mxnet/pull/15007#issuecomment-496617653
 
 
   @mseeger I think many ordinary users are not experts in linear algebra, so providing both a set of basic BLAS/LAPACK routines and some higher-level operators is the best way to serve users at all levels. My motivation for improving the linalg package comes from my daily usage and from users' feature requests. I agree that composing basic operators is more efficient, but sometimes ease of use matters more: it doesn't matter if the computation is not the fastest, as long as the implementation is quick to write.
   
   As for using Python to create derived operators, that is actually what I wanted to do in the first place, but it is restricted by MXNet's weak support for registering backward functions. Using fine-grained operators to mimic a high-level operation is fine for the forward pass, but the backward pass is terrible because the combined, simple gradient computation is split across the backward passes of the basic operators. Currently, the only way to override a backward function in MXNet is a Custom operator, which is not elegant. PyTorch is more user- and developer-friendly here; the following is the backward of matrix inverse from its derivatives.yaml:
   ```yaml
   - name: inverse(Tensor self)
     self: -at::matmul(result.transpose(-2, -1), at::matmul(grad, result.transpose(-2, -1)))
   ```
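   To make the entry above concrete, here is a minimal NumPy sketch (an illustration, not MXNet or PyTorch code) of the same rule, dL/dA = -A^{-T} (dL/dY) A^{-T} for Y = A^{-1}, checked against finite differences:

   ```python
   import numpy as np

   def inverse_backward(grad, result):
       # Gradient of matrix inverse: dL/dA = -A^{-T} @ (dL/dY) @ A^{-T},
       # where Y = A^{-1} = result. Same formula as the derivatives.yaml entry.
       rt = np.swapaxes(result, -2, -1)
       return -rt @ grad @ rt

   # Finite-difference check on a random, well-conditioned matrix.
   rng = np.random.default_rng(0)
   a = rng.standard_normal((3, 3)) + 3 * np.eye(3)
   y = np.linalg.inv(a)
   grad_y = rng.standard_normal((3, 3))      # upstream gradient dL/dY
   grad_a = inverse_backward(grad_y, y)

   eps = 1e-6
   num = np.zeros_like(a)
   for i in range(3):
       for j in range(3):
           ap = a.copy(); ap[i, j] += eps
           am = a.copy(); am[i, j] -= eps
           # L(A) = sum(grad_y * inv(A)); central difference in A[i, j].
           num[i, j] = np.sum(grad_y * (np.linalg.inv(ap) - np.linalg.inv(am))) / (2 * eps)

   print(np.allclose(grad_a, num, atol=1e-4))
   ```

   One declarative line per operator keeps the whole gradient in a single fused expression, instead of being scattered across the backward passes of transpose/matmul primitives.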
   
   I think a good linalg package looks like scipy.linalg, in which all the BLAS and LAPACK routines are exposed and there are also many high-level functions.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]
