Thanks, I was working on something similar, and this is very interesting and 
much more nicely written than mine. I haven't read all of it yet (it is 
quite dense), but comparing briefly to the Scikit-Learn implementation 
(see: 
https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/mixture/gaussian_mixture.py
 )
they support several covariance types: 
spherical, full, diagonal, and tied, 
I think based on this paper: https://www.cs.ubc.ca/~murphyk/Papers/learncg.pdf
It seems your implementation uses only the spherical type. 
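In case it helps to make the distinction concrete, here is a rough pure-Python sketch (mine, not taken from either implementation) of how the spherical and diagonal parameterisations constrain a component's Gaussian density; the names and numbers are purely illustrative:

```python
import math

def log_gauss(x, mean, var):
    """Log-density of a Gaussian with diagonal covariance; `var` is a
    per-dimension variance vector. A spherical Gaussian is the special
    case where every entry of `var` is the same scalar."""
    return -0.5 * sum(
        math.log(2 * math.pi * v) + (xi - m) ** 2 / v
        for xi, m, v in zip(x, mean, var)
    )

x, mean = [1.0, 2.0], [0.0, 0.0]
spherical = log_gauss(x, mean, [1.5, 1.5])  # one shared variance
diagonal  = log_gauss(x, mean, [1.0, 2.0])  # one variance per axis
# "full" would additionally allow off-diagonal covariance terms,
# and "tied" shares one full covariance matrix across all components.
```

So supporting the other types mostly means generalising the per-component covariance update in the M-step.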

Also, I am wondering why you tested only on simulated data drawn from normal 
distributions. Have you tried testing on real-world datasets?

Thanks,
Jon
--------------------------------------------
On Fri, 1/12/18, Pierre-Edouard PORTIER <[email protected]> 
wrote:

 Subject: [Jprogramming] JGMM, Mixture Model in J
 To: [email protected]
 Date: Friday, January 12, 2018, 10:10 PM
 
 Mixture models in J:
 
 http://peportier.me/blog/201801_JGMM/
 
 Enjoy your weekend,
 
 Pierre-Edouard
 ----------------------------------------------------------------------
 For information about J forums see http://www.jsoftware.com/forums.htm
