[ 
https://issues.apache.org/jira/browse/SPARK-1157?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

DB Tsai updated SPARK-1157:
---------------------------

    Description: 
L-BFGS (Limited-memory BFGS) is an optimization algorithm in the BFGS family 
that uses an approximation to the inverse Hessian matrix to steer its search 
through the variable space; but where BFGS stores a dense n x n approximation 
to the inverse Hessian, L-BFGS keeps only a few vectors that represent the 
approximation implicitly.

For high-dimensional optimization problems, Newton's method or BFGS is not 
practical, since the memory needed to store the dense Hessian grows 
quadratically with the number of dimensions, while L-BFGS stores only a few 
vectors. 

One use case is training large-scale logistic regression with a very large 
number of features.

We'll use the Breeze implementation of L-BFGS.
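For reference, a minimal standalone sketch of the Breeze API (not the Spark 
integration; the objective below is a toy quadratic, and the parameter values 
are illustrative only):

{code:scala}
import breeze.linalg.DenseVector
import breeze.optimize.{DiffFunction, LBFGS}

object LbfgsSketch extends App {
  // L-BFGS keeping m = 10 correction pairs, capped at 100 iterations.
  val lbfgs = new LBFGS[DenseVector[Double]](maxIter = 100, m = 10)

  // Toy objective: f(x) = x . x, with gradient 2x. In the real feature this
  // would be the loss/gradient aggregated over the training data.
  val f = new DiffFunction[DenseVector[Double]] {
    def calculate(x: DenseVector[Double]): (Double, DenseVector[Double]) =
      (x dot x, x * 2.0)
  }

  val init = DenseVector(1.0, 2.0, 3.0)
  val solution = lbfgs.minimize(f, init)   // converges to (approximately) zero
  println(solution)
}
{code}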

  was:
L-BFGS (Limited-memory BFGS) is an optimization algorithm in the BFGS family 
that uses an approximation to the inverse Hessian matrix to steer its search 
through the variable space; but where BFGS stores a dense n x n approximation 
to the inverse Hessian, L-BFGS keeps only a few vectors that represent the 
approximation implicitly.

For high-dimensional optimization problems, Newton's method or BFGS is not 
practical, since the memory needed to store the dense Hessian grows 
quadratically with the number of dimensions, while L-BFGS stores only a few 
vectors. 

One use case is training large-scale logistic regression with a very large 
number of features.

This will use the L-BFGS Java implementation from the [RISO 
project|http://riso.sourceforge.net/] (published in Maven Central), which is a 
direct translation of the original robust Fortran implementation. (Thanks to 
the author of the L-BFGS Java implementation, Robert, who relicensed his code 
under the commercially friendly Apache 2 license.)


> L-BFGS Optimizer
> ----------------
>
>                 Key: SPARK-1157
>                 URL: https://issues.apache.org/jira/browse/SPARK-1157
>             Project: Spark
>          Issue Type: New Feature
>            Reporter: DB Tsai
>
> L-BFGS (Limited-memory BFGS) is an optimization algorithm in the BFGS family 
> that uses an approximation to the inverse Hessian matrix to steer its search 
> through the variable space; but where BFGS stores a dense n x n approximation 
> to the inverse Hessian, L-BFGS keeps only a few vectors that represent the 
> approximation implicitly.
> For high-dimensional optimization problems, Newton's method or BFGS is not 
> practical, since the memory needed to store the dense Hessian grows 
> quadratically with the number of dimensions, while L-BFGS stores only a few 
> vectors. 
> One use case is training large-scale logistic regression with a very large 
> number of features.
> We'll use the Breeze implementation of L-BFGS.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
