All,

Does this mean that all GMM-related implementations are incorrect?

I checked and analyzed the GaussianHMM; it looks good so far.

Regarding the GMMHMM, do we have an issue related to Aron's message?

Didier 

--- Original Message ---

From: Andreas Mueller <amuel...@ais.uni-bonn.de>
Sent: October 18, 2012
To: scikit-learn-general@lists.sourceforge.net
Subject: Re: [Scikit-learn-general] sklearn.mixture.DPGMM: Unexpected results

   Hi Aron.
 I think this might be an instance of this bug:
 https://github.com/scikit-learn/scikit-learn/issues/393
 Unfortunately this part of the scikit is in a very bad state.
 Sorry for making you wonder.
 
 I have been thinking about putting in a user warning earlier today.
 What do others think?
 This seems to be a serious issue that has been around for way too long!
 Best,
 Andy
 
 On 10/18/2012 06:57 PM, Aron Culotta wrote:
 
  
 The results I get from DPGMM are not what I expect. E.g.:

 >>> import sklearn.mixture
 >>> sklearn.__version__
 '0.12-git'
 >>> data = [[1.1], [0.9], [1.0], [1.2], [1.0], [6.0], [6.1], [6.1]]
 >>> m = sklearn.mixture.DPGMM(n_components=5, n_iter=1000, alpha=1)
 >>> m.fit(data)
 DPGMM(alpha=1, covariance_type='diag', init_params='wmc', min_covar=None,
     n_components=5, n_iter=1000, params='wmc',
     random_state=<mtrand.RandomState object at 0x108a3f168>, thresh=0.01,
     verbose=False)
 >>> m.converged_
 True
 >>> m.weights_
 array([ 0.2,  0.2,  0.2,  0.2,  0.2])
 >>> m.means_
 array([[ 0.62019109],
        [ 1.16867356],
        [ 0.55713292],
        [ 0.36860511],
        [ 0.17886128]])

 I expected the result to be more similar to the vanilla GMM; that is, two
 Gaussians (around values 1 and 6) with non-uniform weights (like
 [0.625, 0.375]). I expected the "unused" Gaussians to have weights near zero.
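 For reference, here is a minimal sketch of the vanilla-GMM comparison Aron
 describes. It is written against the modern sklearn.mixture.GaussianMixture
 API (which replaced the old GMM/DPGMM classes in later releases, so it will
 not match the 0.12 interface exactly); n_components=2 and random_state=0 are
 assumptions added for reproducibility, not part of the original message.

```python
# Sketch: fit an ordinary 2-component Gaussian mixture to Aron's data.
# Uses the modern GaussianMixture API; parameter choices are assumptions.
from sklearn.mixture import GaussianMixture

data = [[1.1], [0.9], [1.0], [1.2], [1.0], [6.0], [6.1], [6.1]]

# Two components: one cluster around 1 (5 points), one around 6 (3 points).
gmm = GaussianMixture(n_components=2, random_state=0).fit(data)

print(gmm.weights_)  # roughly 0.625 and 0.375, matching the 5/8 and 3/8 split
print(gmm.means_)    # roughly 1.04 and 6.07
```

 With well-separated clusters like these, the plain GMM recovers the expected
 non-uniform weights, which is what makes the uniform DPGMM output suspicious.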

 Am I using the model incorrectly?

 I've also tried changing alpha without any luck.

 I've also tried different data in a smaller range with no luck:
 [[0.1], [0.2], [0.15], [0.112], [0.13], [0.8], [0.85], [0.79]]

 Thanks,

 Aron

 
 
  

------------------------------------------------------------------------------
Everyone hates slow websites. So do we.
Make your web apps faster with AppDynamics
Download AppDynamics Lite for free today:
http://p.sf.net/sfu/appdyn_sfd2d_oct
_______________________________________________
Scikit-learn-general mailing list
Scikit-learn-general@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
