Hi Manolo!

Your code looks nice, but my use case is a bit different. I have a mixed
set of parameters, some come from my wrapper,
and some from the wrapped estimator. The logic I am going for is something
like
"If you know about this parameter, then deal with it; if not, pass it
along to the wrapped estimator and hope for the best!"
which is why I was asking Andreas about the use of `super`.
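
To make the idea concrete, here is a minimal sketch of that "handle it or
pass it along" logic (`Wrapper` and `threshold` are made-up names for
illustration, not from any real library):

```python
class Wrapper:
    # parameters the wrapper itself understands
    _own_params = {"threshold"}

    def __init__(self, estimator, threshold=0.5):
        self.estimator = estimator
        self.threshold = threshold

    def set_params(self, **params):
        for name, value in params.items():
            if name in self._own_params:
                # we know about this parameter: deal with it ourselves
                setattr(self, name, value)
            else:
                # pass it along to the wrapped estimator and hope for the best
                self.estimator.set_params(**{name: value})
        return self
```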

Joel is right that a mixin would be the natural way to add
functionality, but unless I am getting something wrong, that would
require me to modify the base classes in sklearn, either by forking the
code (which sounds like a lot of trouble) or by
monkey-patching the import (also not an ideal solution).

There are several wrappers in scikit-learn that are similar in spirit to
what I am trying to do, for instance `CalibratedClassifierCV`,
but none of them handles `set_params` and `get_params` in a way
that delegates to the base estimator, which makes
them unsuitable for use in some third-party tools such as BorutaPy [1],
which expects an estimator where
`estimator.set_params(n_estimators=...)` makes sense.
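
For the `get_params` side, something along these lines might work (again a
hedged sketch with hypothetical names, not tested against BorutaPy itself):
merge the wrapped estimator's parameters into the wrapper's own, unprefixed,
so a tool that only knows the inner estimator's parameter names still sees them.

```python
class DelegatingWrapper:
    def __init__(self, estimator, threshold=0.5):
        self.estimator = estimator
        self.threshold = threshold

    def get_params(self, deep=True):
        # the wrapper's own parameters
        params = {"estimator": self.estimator, "threshold": self.threshold}
        if deep:
            # expose the wrapped estimator's parameters unprefixed, so that
            # a BorutaPy-style tool can find e.g. "n_estimators" directly
            for name, value in self.estimator.get_params(deep=True).items():
                params[name] = value
        return params
```

A matching `set_params` would then route each name back to whichever object
owns it, as in the earlier "deal with it or pass it along" logic.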

Cheers,
Javier


[1] https://github.com/scikit-learn-contrib/boruta_py

On Sat, Apr 14, 2018 at 5:10 PM Manuel CASTEJÓN LIMAS via scikit-learn <
scikit-learn@python.org> wrote:

> Hi Javier!
> You can have a look at:
>
> https://github.com/mcasl/PipeGraph/blob/master/pipegraph/adapters.py
>
> There are a few adapters there and I had to deal with that situation. I
> solved it by using __getattr__ and __setattr__.
> Best
> Manolo
>
>
_______________________________________________
scikit-learn mailing list
scikit-learn@python.org
https://mail.python.org/mailman/listinfo/scikit-learn