Thanks for the distinction between dual and primal. Now I understand what I
was doing wrong.
G.
On Tue, Nov 18, 2014 at 11:57 AM, Michael Eickenberg <
[email protected]> wrote:
dual=True / False does not change the result, only potentially the speed of
the algorithm.
With an L2 penalty, dual=True is preferable when n_samples < n_features;
dual=False (i.e. "primal=True") is preferable when n_features < n_samples.
With an L1 penalty, dual=True is simply not implemented.
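The rule of thumb above can be sketched in a few lines. This is an illustrative example, not from the thread: the data shapes and the `use_dual` helper variable are my own, and I assume scikit-learn's `LinearSVC` as the estimator.

```python
# Sketch: pick `dual` from the shape of X, per the heuristic above.
# With an L2 penalty, dual=True is preferable when n_samples < n_features.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.RandomState(0)
X = rng.randn(50, 200)            # n_samples=50 < n_features=200
y = (X[:, 0] > 0).astype(int)     # a trivially separable target

n_samples, n_features = X.shape
use_dual = n_samples < n_features  # True here, so the dual solver is cheaper

clf = LinearSVC(penalty='l2', dual=use_dual, C=1.0, max_iter=5000)
clf.fit(X, y)
```

Either setting gives the same fitted model up to numerical tolerance; only the solve time differs.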
Hi George.
A way to avoid the error is to use a list of grids, where each grid
only contains valid parameter combinations:
param_grid = [{'dual': [False], 'penalty': ['l1'], 'C': 10. ** np.arange(-3, 3)},
              {'dual': [True], 'penalty': ['l2'], 'loss': ['l2'], 'C': 10. ** np.arange(-3, 3)}]
or something like that.
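A runnable sketch of the list-of-grids idea, under my own assumptions: I use `LinearSVC` with a small synthetic dataset, and I drop the `loss` key from the second grid so the example runs against a current scikit-learn (the exact keys to pin down depend on your version).

```python
# Sketch: GridSearchCV over a LIST of grids, so that each grid only
# contains parameter combinations the estimator actually supports.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import LinearSVC

param_grid = [
    # primal formulation: the only one that supports the L1 penalty
    {'dual': [False], 'penalty': ['l1'], 'C': 10. ** np.arange(-3, 3)},
    # dual formulation: L2 penalty only
    {'dual': [True], 'penalty': ['l2'], 'C': 10. ** np.arange(-3, 3)},
]

rng = np.random.RandomState(0)
X = rng.randn(60, 10)
y = (X[:, 0] > 0).astype(int)

search = GridSearchCV(LinearSVC(max_iter=5000), param_grid, cv=3)
search.fit(X, y)
```

Note that each grid value must be a list (e.g. `[False]`, not `False`); the search then never tries an invalid `penalty`/`dual` pair, so the `ValueError` from the original question cannot occur.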
Yes: When you instantiate `LogisticRegression`, use the keyword argument
`dual=False`
The dual formulation does not permit any fancy penalties in feature space,
such as L1.
http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html#sklearn.linear_model.LogisticR
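A minimal sketch of that suggestion. One assumption of mine: on recent scikit-learn versions the default solver does not support L1, so I pass `solver='liblinear'` explicitly; in the 2014 API of this thread, liblinear was the only backend and no `solver` argument was needed.

```python
# Sketch: an L1-penalized LogisticRegression must use the primal
# formulation (dual=False); the dual solver has no L1 implementation.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=100, n_features=20, random_state=0)

clf = LogisticRegression(penalty='l1', dual=False,
                         solver='liblinear',  # assumption: needed on newer sklearn
                         C=1.0)
clf.fit(X, y)
```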
I'm trying to perform a grid search CV for logistic regression, but I
stumble on an invalid combination of parameters:
ValueError: Unsupported set of arguments: penalty='l1' is only supported
when dual='false'., Parameters: penalty='l1', loss='lr', dual=True
Is there a way to avoid this problem?
Thanks