Correlated features tend to be similarly predictive of the outcome.

Both the L1 and L2 penalties express a preference for small coefficients.
If one coefficient can be shrunk while another picks up the slack at
similar loss, these regularization methods prefer that solution. With
correlated features, that means L2 spreads the weight across the group,
while L1 tends to keep one feature and zero out the rest; the shared,
shrunken L2 solution is why ridge gives stable coefficients under
multicollinearity.
If you use L1 or L2, you should standardize your features first (subtract
the mean, scale to unit variance), since both penalties are
scale-sensitive.
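
For intuition, here is a minimal sketch (my own toy example; the alpha
values are arbitrary, untuned assumptions) showing how the two penalties
treat a pair of nearly collinear features:

import numpy as np
from sklearn.linear_model import Lasso, Ridge
from sklearn.preprocessing import StandardScaler

rng = np.random.RandomState(0)
n = 200

# x2 is a near-exact copy of x1, so the two columns are almost collinear.
x1 = rng.randn(n)
x2 = x1 + 0.01 * rng.randn(n)
X = np.column_stack([x1, x2])
y = 3.0 * x1 + 0.1 * rng.randn(n)

# Standardize first, since both penalties are sensitive to feature scale.
X = StandardScaler().fit_transform(X)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

print("Ridge coefficients:", ridge.coef_)
print("Lasso coefficients:", lasso.coef_)

Running it, Ridge splits the weight roughly evenly across the two columns
(for a fixed total weight, the L2 penalty is smallest when it is shared),
while Lasso usually drives one of the two coefficients to zero.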

On Wed, Jan 8, 2020 at 6:24 PM lampahome <pahome.c...@mirlab.org> wrote:

> I found many blogs saying that L2 regularization solves
> multicollinearity, but they don't say how it works.
>
> I thought LASSO can select features via L1 regularization, so maybe
> it can solve this too.
>
> Can anyone tell me how ridge handles multicollinearity well?
>
> thx