Re: [scikit-learn] Combine already fitted models

2017-10-07 Thread Andreas Mueller
For some reason I thought we had a "prefit" parameter. I think we should.

On 10/01/2017 07:39 PM, Sebastian Raschka wrote:
> Hi, Rares,
> vc = VotingClassifier(...)
> vc.estimators_ = [e1, e2, ...]
> vc.le_ = ...
> vc.predict(...)
> But I am not sure it is recommended to modify the "private" estimators
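The workaround quoted above can be sketched end to end. This is a hack, not a supported API: it assigns the fitted attributes `estimators_` and `le_` by hand so that `predict` works without calling `fit`, and it may break across scikit-learn versions. The estimator choices and dataset here are illustrative, not from the thread.

```python
# Sketch of the thread's workaround: inject already-fitted estimators
# into a VotingClassifier without calling vc.fit().
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import LabelEncoder
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Fit the base estimators independently (e.g. on different machines).
e1 = LogisticRegression(max_iter=1000).fit(X, y)
e2 = DecisionTreeClassifier(random_state=0).fit(X, y)

vc = VotingClassifier(estimators=[("lr", e1), ("dt", e2)], voting="hard")
vc.estimators_ = [e1, e2]        # the fitted estimators predict() iterates over
vc.le_ = LabelEncoder().fit(y)   # label encoder predict() uses to decode votes
vc.classes_ = vc.le_.classes_

preds = vc.predict(X)            # works without vc.fit() having been called
```

A `prefit` parameter, as Andreas suggests, would make this possible without touching trailing-underscore attributes.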

Re: [scikit-learn] Combine already fitted models

2017-10-07 Thread Sebastian Raschka
I agree. I had added something like that to the original version in mlxtend (not sure if it was before or after we ported it to sklearn). In any case though, I'd be happy to open a PR about that later today :) Best, Sebastian

> On Oct 7, 2017, at 10:53 AM, Andreas Mueller wrote:
>
> For some reason

Re: [scikit-learn] question for using GridSearchCV on LocalOutlierFactor

2017-10-07 Thread Joel Nothman
I don't think LOF is designed to apply to unseen data.

___
scikit-learn mailing list
scikit-learn@python.org
https://mail.python.org/mailman/listinfo/scikit-learn
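A minimal sketch of the point being made: `LocalOutlierFactor` is transductive by default, so you score the data you fit on via `fit_predict` rather than calling `predict` on new samples. (Later scikit-learn releases added a `novelty=True` mode for unseen data; at the time of this 2017 thread that did not exist.) The toy data here is invented for illustration.

```python
# LOF labels the data it was fitted on; -1 marks outliers, 1 inliers.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.RandomState(0)
# 100 points from a tight cluster plus one obvious outlier at (8, 8).
X = np.vstack([rng.normal(size=(100, 2)), [[8.0, 8.0]]])

lof = LocalOutlierFactor(n_neighbors=20)
labels = lof.fit_predict(X)  # no separate predict(X_new) in the default mode
```

This is also why plugging LOF into `GridSearchCV` is awkward: cross-validation assumes a model that can score held-out data.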

Re: [scikit-learn] Using perplexity from LatentDirichletAllocation for cross validation of Topic Models

2017-10-07 Thread Joel Nothman
Just a note that if you're using this for topic modelling, perplexity might not be a good choice of objective function. Others have been proposed; see the diagnostic functions for MALLET topic modelling, for instance.
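For reference, the perplexity under discussion is exposed by scikit-learn's LDA as `perplexity()` on a held-out document-term matrix (lower is a better statistical fit, though, as noted, it may not track human-judged topic quality). The random count matrix below is a stand-in for real document-term counts.

```python
# Held-out perplexity with scikit-learn's LatentDirichletAllocation.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.RandomState(0)
X = rng.poisson(1.0, size=(100, 50))  # toy document-term count matrix

train, test = X[:80], X[80:]
lda = LatentDirichletAllocation(n_components=5, random_state=0).fit(train)
score = lda.perplexity(test)  # lower = better fit on held-out documents
```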

Re: [scikit-learn] question for using GridSearchCV on LocalOutlierFactor

2017-10-07 Thread Joel Nothman
Actually, I'm probably wrong there, but you may not be able to use accuracy

[scikit-learn] Validating L2 - Least Squares - sum of squares, During a Normalization Function

2017-10-07 Thread Christopher Pfeifer
I am attempting to validate the output of an L2 normalization function:

data_l2 = preprocessing.normalize(data, norm='l2')  # raw data is below at end of this email

output:

array([[ 0.57649683,  0.53806371,  0.61492995],
       [-0.53806371, -0.57649683, -0.61492995],
       [ 0.3359268
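One way to validate L2 normalization: after `preprocessing.normalize(data, norm='l2')`, every row should have Euclidean norm 1, i.e. the square root of its sum of squares equals 1. The original raw data was truncated from this message, so the rows below are stand-in values, not the poster's data.

```python
# Check that each row of the L2-normalized output has unit norm.
import numpy as np
from sklearn import preprocessing

data = np.array([[ 1.5,  1.4,  1.6],    # stand-in rows; the original
                 [-1.4, -1.5, -1.6],    # raw data was truncated from
                 [ 0.5,  1.1,  1.3]])   # the archived message

data_l2 = preprocessing.normalize(data, norm='l2')

# sqrt of the row-wise sum of squares should be 1.0 for every row
row_norms = np.sqrt((data_l2 ** 2).sum(axis=1))
```

Note that signs are preserved: each row is divided by its own norm, so negative entries stay negative, matching the output shown above.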