Hi Patrick.
The reason we don't support warm starts for LogisticRegression is that
we are using LibLinear, and no one has been in the mood to hack on that
yet ;) It is a bit of non-trivial C hackery, I'm afraid.
Using Mathieu's library is definitely a good option.
Cheers,
Andy
On 05/17/2013 12:13 AM, Mathieu Blondel wrote:
Hi,
My library lightning (https://github.com/mblondel/lightning) supports
L1-regularized logistic regression with warm start.
from lightning.primal_cd import CDClassifier

# X, y: your training data
clf = CDClassifier(loss="log", penalty="l1", warm_start=True,
                   C=1.0 / X.shape[0])
clf.tol = 1e-3     # stopping tolerance
clf.alpha = 1e-3   # regularization strength
clf.fit(X, y)
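To trace out a path, you can sweep alpha from strong to weak
regularization and refit in a loop; with warm_start=True each fit
starts from the previous solution, so it converges quickly. A rough
sketch (the numpy import and the alpha grid are my own choices, the
rest follows the snippet above):

import numpy as np
from lightning.primal_cd import CDClassifier

clf = CDClassifier(loss="log", penalty="l1", warm_start=True,
                   C=1.0 / X.shape[0])
clf.tol = 1e-3

coefs = []
for alpha in np.logspace(0, -4, 20):   # strong -> weak regularization
    clf.alpha = alpha                  # warm-started from previous fit
    clf.fit(X, y)
    coefs.append(clf.coef_.copy())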
Hi all,
I'm performing logistic regression with an L1 penalty with
sklearn.linear_model.LogisticRegression. I'd like to get the entire
regularization path (from highly regularized to not regularized), similar
to lasso_path or lars_path.
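To make the goal concrete, this is roughly what I do today (just a
sketch; the C grid is arbitrary and X, y are my training data), which
refits from scratch at every value of C:

import numpy as np
from sklearn.linear_model import LogisticRegression

clf = LogisticRegression(penalty="l1", tol=1e-3)
coef_path = []
for C in np.logspace(-4, 4, 20):   # small C = strong regularization
    clf.set_params(C=C)
    clf.fit(X, y)                  # no warm start, refits from scratch
    coef_path.append(clf.coef_.ravel().copy())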
However, I've verified that if you call train a second time