## Description
In MXNet 2.0, we would like to provide a distributions module, analogous to 
PyTorch's `torch.distributions`. The main differences from theirs are that we use 
numpy ops and support hybridization. The current project code can be seen at 
https://github.com/xidulu/incubator-mxnet/tree/distribution_dev/python/mxnet/gluon/probability.
 

The basic skeleton divides into the following parts:

1. Stochastic `HybridBlock` and `HybridSequential`: they build upon Gluon 
`HybridBlock` and `HybridSequential` and allow adding an extra loss to each layer.
2. Distribution class: it implements a variety of functionalities including 
`prob`, `log_prob`, `sample`, `broadcast_to`, `mean`, `variance`, etc.
3. KL divergence: the `kl_divergence(p, q)` function searches over registered KL 
divergence implementations and dispatches to the one matching the types of `p` and `q`.
4. Transform: derives a new distribution by applying an invertible transformation 
to a base distribution.
5. Independent: reinterprets some of the batch dims of a distribution as event 
dims.
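As a rough illustration of item 2, here is a minimal pure-Python sketch of the intended `Distribution` interface. The class and method names follow the list above; the bodies are stand-ins built on the standard library, not the actual MXNet numpy-op implementation:

```python
import math
import random


class Distribution:
    """Minimal base class: concrete subclasses supply sample/log_prob."""

    def sample(self):
        raise NotImplementedError

    def log_prob(self, value):
        raise NotImplementedError

    def prob(self, value):
        # Default: exponentiate the log density.
        return math.exp(self.log_prob(value))


class Normal(Distribution):
    """Scalar Gaussian, parameterized by location and scale."""

    def __init__(self, loc, scale):
        self.loc, self.scale = loc, scale

    @property
    def mean(self):
        return self.loc

    @property
    def variance(self):
        return self.scale ** 2

    def sample(self):
        return random.gauss(self.loc, self.scale)

    def log_prob(self, value):
        # log N(value | loc, scale^2)
        z = (value - self.loc) / self.scale
        return -0.5 * z * z - math.log(self.scale) - 0.5 * math.log(2 * math.pi)
```

In the real module these methods would operate on `mx.np` ndarrays (with batch shapes and broadcasting) so that they remain hybridizable.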

Two features that are currently either unsupported or partially broken in MXNet 
would be very useful for this project: symbolic shape and control flow.

For now, we will implement most distributions in the frontend. We will move 
the computation to the backend once new numpy random ops such as `chisquare`, 
`dirichlet`, and `multivariate_normal` are introduced into MXNet.
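To show that this kind of logic fits naturally in the frontend, the type-based dispatch of item 3 could be sketched in plain Python. The `register_kl` decorator and the closed-form Gaussian KL below are assumptions modeled on PyTorch's design, not the final MXNet interface:

```python
import math

# Registry mapping (type(p), type(q)) pairs to KL implementations.
_KL_REGISTRY = {}


def register_kl(p_cls, q_cls):
    """Decorator registering a KL implementation for a pair of types."""
    def decorator(fn):
        _KL_REGISTRY[(p_cls, q_cls)] = fn
        return fn
    return decorator


def kl_divergence(p, q):
    """Look up a registered implementation for KL(p || q) and apply it."""
    fn = _KL_REGISTRY.get((type(p), type(q)))
    if fn is None:
        raise NotImplementedError(
            "No KL registered for (%s, %s)" % (type(p).__name__, type(q).__name__))
    return fn(p, q)


class Normal:
    """Stand-in for the Normal distribution: just holds loc and scale."""

    def __init__(self, loc, scale):
        self.loc, self.scale = loc, scale


@register_kl(Normal, Normal)
def _kl_normal_normal(p, q):
    # Closed-form KL between two univariate Gaussians.
    var_ratio = (p.scale / q.scale) ** 2
    t1 = ((p.loc - q.loc) / q.scale) ** 2
    return 0.5 * (var_ratio + t1 - 1.0 - math.log(var_ratio))
```

Because dispatch only inspects Python types, this machinery is independent of whether the underlying math runs on frontend numpy ops or backend kernels.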

## References
- https://pytorch.org/docs/stable/distributions.html
- https://docs.scipy.org/doc/numpy-1.14.1/reference/routines.random.html

@xidulu @szha @leezu @haojin2 


https://github.com/apache/incubator-mxnet/issues/17240