leandrolcampos opened a new issue #18140:
URL: https://github.com/apache/incubator-mxnet/issues/18140


   ## Description
   I'd like to suggest implementing implicit reparameterization gradients, as 
described in [1], for the Gamma distribution samplers ndarray.sample_gamma and 
symbol.sample_gamma.
   
   This would allow the Gamma distribution, and others built on it such as the 
Beta, Dirichlet, and Student's t distributions, to be used as easily as the 
Normal distribution in stochastic computation graphs.
   
   Stochastic computation graphs are needed for variational autoencoders 
(VAEs), automatic variational inference, Bayesian learning in neural networks, 
and principled regularization in deep networks.
   
   The approach proposed in [1] is the same one used by TensorFlow's 
tf.random.gamma, as we can see in [2].
   
   Thanks for the opportunity to request this feature.
   
   ## References
   - [1] [Michael Figurnov, Shakir Mohamed, Andriy Mnih. Implicit 
Reparameterization Gradients, 2018](https://arxiv.org/pdf/1805.08498.pdf)
   - [2] 
[tf.random.gamma](https://www.tensorflow.org/api_docs/python/tf/random/gamma)
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

