Dear All,

Nice to meet you all! My name is Rui, and I am a PhD student at the
University of Alberta. My research interests include distributed machine
learning, low-rank matrix completion, and related topics. I have been an
mlpack user in my research for a while now; it is a great library, and I
would like to contribute to it.

I’m quite interested in “Low rank/sparse optimization using Frank-Wolfe”.
Frank-Wolfe is an old but well-known algorithm in optimization. For smooth
convex problems it achieves the same O(1/k) convergence rate as projected
gradient descent, the more popular method. When the projection onto the
constraint set is expensive to compute, though, Frank-Wolfe is one of the
best alternatives, because each iteration only needs a linear minimization
over the set instead of a projection. Examples appear in various papers,
e.g., low-rank matrix completion (where projecting onto the nuclear-norm
ball requires a full SVD, but the linear step only needs the top singular
vector pair) and orthogonal matching pursuit.
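
To make the idea concrete, here is a minimal sketch of the Frank-Wolfe
iteration over an l1 ball (a sparsity constraint), written with Armadillo
since mlpack builds on it. Everything here, including the function name, is
my own illustration, not existing mlpack code:

  #include <armadillo>

  // Sketch: minimize f(x) = 0.5 * ||A x - b||^2 subject to ||x||_1 <= tau.
  arma::vec FrankWolfeL1(const arma::mat& A, const arma::vec& b,
                         const double tau, const size_t maxIter)
  {
    arma::vec x(A.n_cols, arma::fill::zeros);
    for (size_t k = 0; k < maxIter; ++k)
    {
      // Gradient of the least-squares objective at the current iterate.
      const arma::vec g = A.t() * (A * x - b);

      // "argmin" (linear minimization) step: minimize <g, s> over the l1
      // ball. The minimizer is a signed, scaled coordinate vector, so no
      // projection is ever required.
      const arma::uword i = arma::index_max(arma::abs(g));
      arma::vec s(x.n_elem, arma::fill::zeros);
      s(i) = (g(i) > 0) ? -tau : tau;

      // Classic step size 2 / (k + 2), which gives the O(1/k) rate.
      x += (2.0 / (k + 2.0)) * (s - x);
    }
    return x;
  }

Since the iterate after k steps is a convex combination of at most k
vertices of the ball, the method naturally produces sparse (or, with the
nuclear-norm ball, low-rank) solutions.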

I think this algorithm can be implemented as a new optimizer, since it
applies to many problems. My preliminary plan is to borrow ideas from the
existing implementations of other optimizers; the main work will be the
“argmin” step, also called the “conditional gradient” step in the
literature, for each constraint set of interest. A rough sketch of what I
have in mind follows.
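
For concreteness, here is the rough class shape I imagine, modeled on how
the existing optimizers are structured. Every name below is a placeholder
of mine, not an existing mlpack API; the constraint-specific "argmin"
solver is a template parameter, so the same loop could serve the l1-ball,
nuclear-norm, and OMP-style cases:

  #include <armadillo>

  // Placeholder sketch only. FunctionType is assumed to provide
  // Evaluate(x) and Gradient(x, g), like the differentiable functions
  // mlpack's other optimizers consume.
  template<typename LinearConstrSolverType>
  class FrankWolfe
  {
   public:
    FrankWolfe(LinearConstrSolverType solver, const size_t maxIterations) :
        solver(solver), maxIterations(maxIterations) { }

    template<typename FunctionType>
    double Optimize(FunctionType& function, arma::mat& iterate)
    {
      arma::mat gradient(iterate.n_rows, iterate.n_cols);
      arma::mat s(iterate.n_rows, iterate.n_cols);
      for (size_t k = 0; k < maxIterations; ++k)
      {
        function.Gradient(iterate, gradient);

        // Conditional gradient step: the solver minimizes <gradient, s>
        // over the constraint set; this is the only problem-specific part.
        solver.Optimize(gradient, s);

        // Move toward the "argmin" point with the standard step size.
        iterate += (2.0 / (k + 2.0)) * (s - iterate);
      }
      return function.Evaluate(iterate);
    }

   private:
    LinearConstrSolverType solver;
    size_t maxIterations;
  };

A line-search or away-step variant could later be swapped in through a
second template parameter, but the plain step-size rule above should be
enough for a first version.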

Looking forward to hearing from you! Thanks and have a good day!

Best regards,
Tyler Zhu (Rui)
