Wow, fast responses, thanks!

On 02/02/2015, Andy <t3k...@gmail.com> wrote:
> I don't see how this would fit into the standard sklearn interface...

Just what Alexander said. I don't know sklearn well enough to know
what the standard pattern for interfacing might be, or why this
presents a problem. I was expecting to be able to do something like

gmm = sklearn.mixture.GMM(n_components=comp, covariance_type='full')
gmm.fit(trainingSet)
gmm_conditional = gmm.getConditional(axes=np.array([0, 1]),
                                     values=np.array([1.0, 5.3]))

producing a whole new GMM object with a reduced number of features,
and its own mu, sigma, etc.
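For what it's worth, a method like getConditional (which doesn't exist in sklearn; the name above is just my wish) could be sketched in plain NumPy with the standard conditional-Gaussian formulas: condition each component on the observed axes, and reweight the components by the marginal density of the observed values. Something like:

```python
import numpy as np

def conditional_gmm(weights, means, covs, axes, values):
    """Condition a Gaussian mixture on observed `values` along `axes`.

    weights: (K,), means: (K, D), covs: (K, D, D) full covariances.
    Returns (new_weights, new_means, new_covs) over the remaining dims.
    """
    d = means.shape[1]
    obs = np.asarray(axes)
    free = np.array([i for i in range(d) if i not in obs])
    new_w, new_mu, new_cov = [], [], []
    for w, mu, S in zip(weights, means, covs):
        Soo = S[np.ix_(obs, obs)]   # observed block
        Sfo = S[np.ix_(free, obs)]  # cross block
        Sff = S[np.ix_(free, free)] # free block
        Soo_inv = np.linalg.inv(Soo)
        diff = values - mu[obs]
        # conditional mean and covariance of this component
        new_mu.append(mu[free] + Sfo @ Soo_inv @ diff)
        new_cov.append(Sff - Sfo @ Soo_inv @ Sfo.T)
        # reweight by the marginal Gaussian density of the observed values
        norm = np.sqrt((2 * np.pi) ** len(obs) * np.linalg.det(Soo))
        new_w.append(w * np.exp(-0.5 * diff @ Soo_inv @ diff) / norm)
    new_w = np.array(new_w)
    return new_w / new_w.sum(), np.array(new_mu), np.array(new_cov)
```

The arrays would come from a fitted model's weights_, means_, and covars_ attributes; wrapping the result back into a new GMM object is the part I'm not sure how to do cleanly against sklearn's interface.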

>  From a probabilistic modelling perspective that is indeed quite basic,
> but sklearn is not a probabilistic modelling framework ;)

GMMs are probabilistic models, and sklearn implements them...

If sklearn isn't likely to integrate this feature in future, could you
recommend a nice mature toolkit with good documentation which would? I
was really hoping to keep using sklearn for this, since I've found it
very nice to work with so far.

_______________________________________________
Scikit-learn-general mailing list
Scikit-learn-general@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general