Re: [scikit-learn] Why ridge regression can solve multicollinearity?

2020-01-08 Thread Brown J.B. via scikit-learn
Just for convenience:

> Marquardt, Donald W., and Ronald D. Snee. "Ridge regression in practice."
> *The American Statistician* 29, no. 1 (1975): 3-20.
>

https://amstat.tandfonline.com/doi/abs/10.1080/00031305.1975.10479105


Re: [scikit-learn] Why ridge regression can solve multicollinearity?

2020-01-08 Thread josef . pktd
On Wed, Jan 8, 2020 at 9:43 PM  wrote:

>
>
> On Wed, Jan 8, 2020 at 9:38 PM lampahome  wrote:
>
>>
>>
>> On Thu, Jan 9, 2020 at 10:33 AM, Stuart Reynolds wrote:
>>
>>> Correlated features typically have the property that they tend to be
>>> similarly predictive of the outcome.
>>>
>>> L1 and L2 both express a preference for small coefficients.
>>> If a coefficient can be reduced while another coefficient maintains a
>>> similar loss, then these regularization methods prefer that solution.
>>> If you use L1 or L2, you should mean- and variance-normalize your
>>> features.
>>>
>>>
>> You mean LASSO and RIDGE both solve multicollinearity?
>>
>
> LASSO has a reputation for not handling multicollinearity well; that's
> why the elastic net (L1 + L2) was introduced, AFAIK.
>
> With multicollinearity the squared length of the parameter vector,
> beta' beta, becomes too large, and the L2 (ridge) penalty shrinks it.
>

e.g.
Marquardt, Donald W., and Ronald D. Snee. "Ridge regression in practice." *The
American Statistician* 29, no. 1 (1975): 3-20.

I just went through it last week because of an argument about the variance
inflation factor in ridge regression.
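
For reference, the classical VIF diagnostic is available in statsmodels.
A minimal sketch with invented data (the variable names and the collinear
structure are placeholders, not from the thread):

import numpy as np
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)            # nearly collinear with x1
x3 = rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2, x3])  # intercept column included

# VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j on
# the remaining columns; values above ~10 commonly flag collinearity.
for i, name in enumerate(["const", "x1", "x2", "x3"]):
    print(name, variance_inflation_factor(X, i))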




>
> Josef
>
>
>


Re: [scikit-learn] Why ridge regression can solve multicollinearity?

2020-01-08 Thread josef . pktd
On Wed, Jan 8, 2020 at 9:38 PM lampahome  wrote:

>
>
> On Thu, Jan 9, 2020 at 10:33 AM, Stuart Reynolds wrote:
>
>> Correlated features typically have the property that they tend to be
>> similarly predictive of the outcome.
>>
>> L1 and L2 both express a preference for small coefficients.
>> If a coefficient can be reduced while another coefficient maintains a
>> similar loss, then these regularization methods prefer that solution.
>> If you use L1 or L2, you should mean- and variance-normalize your
>> features.
>>
>>
> You mean LASSO and RIDGE both solve multicollinearity?
>

LASSO has a reputation for not handling multicollinearity well; that's why
the elastic net (L1 + L2) was introduced, AFAIK.

With multicollinearity the squared length of the parameter vector,
beta' beta, becomes too large, and the L2 (ridge) penalty shrinks it.
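
A small simulation sketch of this point (the data and penalty strengths are
invented for illustration): with two nearly collinear features whose true
coefficients are (1, 1), OLS can return large offsetting coefficients,
ridge (L2) shrinks beta' beta, lasso (L1) tends to zero out one of the
pair, and the elastic net blends the two penalties:

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)      # nearly collinear with x1
X = np.column_stack([x1, x2])
y = x1 + x2 + 0.1 * rng.normal(size=n)   # true coefficients are (1, 1)

for model in (LinearRegression(), Ridge(alpha=1.0),
              Lasso(alpha=0.1), ElasticNet(alpha=0.1, l1_ratio=0.5)):
    beta = model.fit(X, y).coef_
    # beta' beta: squared length of the estimated coefficient vector
    print(type(model).__name__, beta.round(2),
          "beta'beta =", round(float(beta @ beta), 2))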

Josef





Re: [scikit-learn] Why ridge regression can solve multicollinearity?

2020-01-08 Thread lampahome
On Thu, Jan 9, 2020 at 10:33 AM, Stuart Reynolds wrote:

> Correlated features typically have the property that they tend to be
> similarly predictive of the outcome.
>
> L1 and L2 both express a preference for small coefficients.
> If a coefficient can be reduced while another coefficient maintains a
> similar loss, then these regularization methods prefer that solution.
> If you use L1 or L2, you should mean- and variance-normalize your
> features.
>
>
You mean LASSO and RIDGE both solve multicollinearity?


Re: [scikit-learn] Why ridge regression can solve multicollinearity?

2020-01-08 Thread Stuart Reynolds
Correlated features typically have the property that they tend to be
similarly predictive of the outcome.

L1 and L2 both express a preference for small coefficients.
If a coefficient can be reduced while another coefficient maintains a
similar loss, then these regularization methods prefer that solution.
If you use L1 or L2, you should mean- and variance-normalize your features.
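
A minimal sketch of that normalization advice in scikit-learn (X_train and
y_train are placeholders, not from the thread):

from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Standardize to zero mean and unit variance so the L2 penalty weighs all
# coefficients on a comparable scale, then fit the penalized model.
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
# model.fit(X_train, y_train)   # placeholder data

The same pipeline pattern applies to Lasso and ElasticNet.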

On Wed, Jan 8, 2020 at 6:24 PM lampahome  wrote:

> I found many blogs saying that L2 regularization solves
> multicollinearity, but they don't say how it works.
>
> I thought LASSO is able to select features via L1 regularization, so
> maybe it can also solve this.
>
> Can anyone explain how ridge regression handles multicollinearity so well?
>
> thx


[scikit-learn] Why ridge regression can solve multicollinearity?

2020-01-08 Thread lampahome
I found many blogs saying that L2 regularization solves multicollinearity,
but they don't say how it works.

I thought LASSO is able to select features via L1 regularization, so maybe
it can also solve this.

Can anyone explain how ridge regression handles multicollinearity so well?

thx
___
scikit-learn mailing list
scikit-learn@python.org
https://mail.python.org/mailman/listinfo/scikit-learn