Hello all,
before attempting a detailed proposal I would like to discuss the big
picture with you. I went through the two referenced papers, and my
feeling is that glmnet's coordinate descent method could be a good
choice, especially since the connection to the strong rules approach
is already available.
I'm currently programming a lot in R, which could turn out to be
useful, but I haven't looked at the glmnet implementation yet.
I assume that proposing to implement glmnet including strong rule
filtering is not detailed enough. It would be great if you could give
me some advice on how to proceed from here.
Do you have a suggestion for where I could start in order to show you
that I'm up to the task? It should be something that can be
accomplished in a limited amount of time.
I can't start serious coding yet since I'm currently up against the
deadline for my final (diploma) thesis. On the upside, I will have no
other obligations during the whole GSoC coding period.
best,
Immanuel
coordinate descent methods:
- CDN:
  a decomposition method that solves each one-variable sub-problem
  by a Newton direction with line search (see the sketch below)
  possible speed-ups:
  - random permutation of the sub-problems
  - shrinking
  loss functions (must be convex, twice differentiable and nonnegative):
  + log loss (qualifies)
  - L1 loss (not twice differentiable, does not qualify)
  (+) can be extended to the L2 loss
  penalty terms:
  - L1
  - L2
- glmnet:
  loss functions:
  - log-linear
  - L2 (squared error)
  penalty terms:
  - L1
  - L2
  - elastic net
  (see the coordinate-wise update sketch below)
- strong rules:
  - reduce the number of features before solving
  - check the KKT conditions to guarantee the optimal solution
  apply to:
  - can be adapted to general convex optimization problems
  - lasso, elastic net, logistic regression
  (see the screening sketch below)
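
To make the CDN idea concrete, here is a minimal sketch of one
coordinate sweep for L1-regularized logistic regression, with each
one-variable sub-problem solved in closed form by a Newton direction
and then safeguarded by a backtracking line search. This is just my
reading of the method, not the paper's code; the name cdn_epoch and
the constants sigma (sufficient decrease) and beta (backtracking
factor) are my own choices.

import numpy as np
from scipy.special import expit   # numerically stable sigmoid

def cdn_epoch(X, y, w, lam, sigma=0.01, beta=0.5, max_ls=20):
    """One sweep of CDN-style coordinate descent; y must be in {-1, +1}."""
    margins = y * (X @ w)                        # y_i * x_i^T w, kept updated
    for j in np.random.permutation(X.shape[1]):  # random-permutation speed-up
        prob = expit(-margins)                   # sigmoid(-y_i x_i^T w)
        g = -np.dot(y * X[:, j], prob)           # dL/dw_j of the log loss
        h = np.dot(X[:, j] ** 2, prob * (1.0 - prob)) + 1e-12  # d^2L/dw_j^2
        # closed-form Newton direction for the one-variable L1 sub-problem
        if g + lam <= h * w[j]:
            d = -(g + lam) / h
        elif g - lam >= h * w[j]:
            d = -(g - lam) / h
        else:
            d = -w[j]
        if d == 0.0:
            continue
        # backtracking line search with a sufficient-decrease condition
        delta = g * d + lam * (abs(w[j] + d) - abs(w[j]))
        f_old = np.logaddexp(0.0, -margins).sum() + lam * abs(w[j])
        step = 1.0
        for _ in range(max_ls):
            m_new = margins + y * X[:, j] * (step * d)
            f_new = np.logaddexp(0.0, -m_new).sum() + lam * abs(w[j] + step * d)
            if f_new - f_old <= sigma * step * delta:
                w[j] += step * d
                margins = m_new
                break
            step *= beta
    return w

(Shrinking, i.e. temporarily removing variables whose optimality
conditions already hold, is left out to keep the sketch short.)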
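
The heart of glmnet is a coordinate-wise soft-thresholding update.
Below is a minimal sketch for the squared-error loss with the elastic
net penalty; it assumes the columns of X are standardized so that
(1/n) * x_j^T x_j = 1, and the function names are mine, not glmnet's
API.

import numpy as np

def soft_threshold(z, gamma):
    """S(z, gamma) = sign(z) * max(|z| - gamma, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def enet_coordinate_descent(X, y, lam, alpha=1.0, n_sweeps=100):
    """Elastic net on squared error; alpha=1 is the lasso, alpha=0 ridge."""
    n, p = X.shape
    beta = np.zeros(p)
    resid = y.astype(float).copy()       # residual y - X @ beta, kept updated
    for _ in range(n_sweeps):
        for j in range(p):
            # correlation of x_j with the partial residual (x_j's own
            # contribution added back in)
            rho = X[:, j] @ resid / n + beta[j]
            beta_new = soft_threshold(rho, lam * alpha) \
                / (1.0 + lam * (1.0 - alpha))
            if beta_new != beta[j]:
                resid += X[:, j] * (beta[j] - beta_new)  # keep residual exact
                beta[j] = beta_new
    return beta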
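
Finally, a sketch of the sequential strong rule along a decreasing
lambda path, including the KKT check on the discarded features that
makes the screening safe. Here solve_lasso is a placeholder for any
inner solver, e.g. enet_coordinate_descent above with alpha=1; warm
starts are omitted for brevity, although glmnet relies on them.

import numpy as np

def strong_rule_path(X, y, lambdas, solve_lasso, tol=1e-6):
    """lambdas must be decreasing; returns one coefficient vector each."""
    n, p = X.shape
    beta = np.zeros(p)
    lam_prev = np.max(np.abs(X.T @ y)) / n   # lambda_max: all-zero solution
    path = []
    for lam in lambdas:
        # sequential strong rule: discard j if |x_j^T r| < 2*lam - lam_prev
        c = np.abs(X.T @ (y - X @ beta)) / n
        active = np.flatnonzero(c >= 2.0 * lam - lam_prev)
        while True:
            beta = np.zeros(p)
            if active.size:
                beta[active] = solve_lasso(X[:, active], y, lam)
            # KKT check: every discarded j must satisfy |x_j^T r| <= lam
            grad = np.abs(X.T @ (y - X @ beta)) / n
            violators = np.setdiff1d(np.flatnonzero(grad > lam + tol), active)
            if violators.size == 0:
                break               # no violations, screening was safe here
            active = np.union1d(active, violators)
        path.append(beta.copy())
        lam_prev = lam
    return path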