I agree. I had added something like that to the original version in mlxtend
(not sure if it was before or after we ported it to sklearn). In any case,
I'd be happy to open a PR about that later today :)
Best,
Sebastian
> On Oct 7, 2017, at 10:53 AM, Andreas Mueller wrote:
>
> For some reason I thought we had a "prefit" parameter.
> I think we should.

For some reason I thought we had a "prefit" parameter.
I think we should.
On 10/01/2017 07:39 PM, Sebastian Raschka wrote:
> Hi, Rares,
>
> vc = VotingClassifier(...)
> vc.estimators_ = [e1, e2, ...]
> vc.le_ = ...
> vc.predict(...)
>
> But I am not sure it is recommended to modify the "private" estimators_
> and le_ attributes.
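
For reference, sklearn already ships this pattern elsewhere: SelectFromModel
accepts an already-fitted estimator via a prefit flag, so a prefit parameter
on VotingClassifier would presumably look similar. A minimal sketch of that
existing API (the estimator choice here is just for illustration):

# SelectFromModel can wrap an estimator fitted ahead of time,
# skipping any refit inside the meta-estimator.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
est = LogisticRegression().fit(X, y)     # fitted before wrapping

sfm = SelectFromModel(est, prefit=True)  # no call to sfm.fit() needed
print(sfm.transform(X).shape)            # features picked by coef_ magnitude
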
Hi, Rares,

> vc = VotingClassifier(...)
> vc.estimators_ = [e1, e2, ...]
> vc.le_ = ...
> vc.predict(...)
>
> But I am not sure it is recommended to modify the "private" estimators_
> and le_ attributes.

I think that this may work if you don't call the fit method of the
VotingClassifier afterwards.
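
To make that concrete, here is a minimal runnable sketch of the workaround
(the stratified split stands in for data living on different hosts, and the
estimator choices are just for illustration; estimators_, le_, and classes_
are the private attributes that fit() would normally create, so this may
break in future releases):

from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Stand-ins for models fitted independently, e.g. on two different hosts.
# Each subset must contain every class, hence the stratified split.
X1, X2, y1, y2 = train_test_split(X, y, test_size=0.5, stratify=y,
                                  random_state=0)
e1 = LogisticRegression().fit(X1, y1)
e2 = DecisionTreeClassifier().fit(X2, y2)

vc = VotingClassifier(estimators=[("lr", e1), ("dt", e2)])

# Manually set the attributes that fit() would normally create. Note that
# fit() trains its clones on label-encoded targets, so this shortcut only
# matches when the labels are already 0..n_classes-1 (true for iris).
vc.estimators_ = [e1, e2]
vc.le_ = LabelEncoder().fit(y)
vc.classes_ = vc.le_.classes_

print(vc.predict(X[:5]))   # works without ever calling vc.fit()
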
Hi, Rares,

> I am looking at VotingClassifier but it seems that it is expected that the
> estimators are fitted when VotingClassifier.fit() is called. I don't see how
> I can have already fitted classifiers combined under a VotingClassifier.

I think the opposite is true: The classifiers provided via an `estimators`
argument are fitted when VotingClassifier.fit() is called.
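
A quick sketch showing this behavior, i.e. that fit() trains fresh clones of
the estimators it is given rather than expecting pre-fitted ones (the
estimator choices are just for illustration):

# fit() clones and fits the passed estimators; the originals stay untouched.
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
lr = LogisticRegression()
vc = VotingClassifier(estimators=[("lr", lr), ("nb", GaussianNB())]).fit(X, y)

print(hasattr(lr, "coef_"))                 # False: original left unfitted
print(hasattr(vc.estimators_[0], "coef_"))  # True: a fitted clone is stored
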
Hello,

I have a distributed setup where subsets of the data are available at
different hosts. I plan to have each host fit a model with the subset of
the data it owns. Once these individual models are fitted, how can I go
about combining them under one model?
I don't have a preference on a specific method for combining them.