Hi, 

I am happy to announce three packages related to empirical risk minimization:

EmpiricalRisks <https://github.com/lindahua/EmpiricalRisks.jl>

This Julia package provides a collection of predictors, loss functions, and 
regularizers, as well as efficient computation of their gradients, mainly to 
support the implementation of (regularized) empirical risk minimization 
methods. A small sketch of the kind of computation involved follows the 
lists below.

Predictors:

   - linear prediction
   - affine prediction
   - multivariate linear prediction
   - multivariate affine prediction
   
Loss functions:

   - squared loss
   - absolute loss
   - quantile loss
   - Huber loss
   - hinge loss
   - smoothed hinge loss
   - logistic loss
   - sum squared loss (for multivariate prediction)
   - multinomial logistic loss
   
Regularizers:

   - squared L2 regularization
   - L1 regularization
   - elastic net (L1 + squared L2)
   - evaluation of proximal operators w.r.t. these regularizers
   
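To give a flavor of what these pieces compute, here is a minimal, 
self-contained sketch: the regularized empirical risk of a linear predictor 
under squared loss with squared-L2 regularization, and its gradient. It is 
written against plain Julia arrays; the names and signatures are 
illustrative, not the package's actual API.

    using LinearAlgebra

    # objective: (1/n) * sum_i 0.5 * (w'x_i - y_i)^2 + 0.5 * c * ||w||^2
    # X is d-by-n with one sample per column; y has length n
    function risk(w, X, y, c)
        r = X' * w .- y                      # per-sample residuals
        sum(abs2, r) / (2 * length(y)) + 0.5 * c * sum(abs2, w)
    end

    # gradient: (1/n) * X * (X'w - y) + c * w
    function risk_grad(w, X, y, c)
        r = X' * w .- y
        X * r ./ length(y) .+ c .* w
    end

An ERM solver then simply iterates on these two functions (or on a prox 
step, when the regularizer is non-smooth).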

Regression <https://github.com/lindahua/Regression.jl>

This package had been dormant for a while, and I recently revived it. It is 
now based on EmpiricalRisks, and provides methods for regression analysis on 
moderate-sized problems (i.e. where the data can be loaded entirely into 
memory). It supports the following problems:

   - Linear regression
   - Ridge regression
   - LASSO
   - Logistic regression
   - Multinomial logistic regression
   - Problems with customized loss functions and regularizers
   
It also provides a variety of solvers:

   - Analytical solution (for linear & ridge regression)
   - Gradient descent
   - BFGS
   - L-BFGS
   - Proximal gradient descent (recommended for LASSO & sparse regression; 
     see the sketch after this list)
   - Accelerated gradient descent (experimental)

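Since proximal gradient descent is the recommended solver for LASSO, a 
minimal sketch of the idea may help (hypothetical code, not Regression.jl's 
interface): take gradient steps on the smooth squared-loss term, then apply 
the proximal operator of the L1 term, which is elementwise soft-thresholding.

    # LASSO objective: (1/2n) * ||X'w - y||^2 + c * ||w||_1
    soft_threshold(v, t) = sign(v) * max(abs(v) - t, zero(v))

    function lasso_proxgrad(X, y, c; step = 0.01, iters = 1000)
        n = length(y)
        w = zeros(size(X, 1))
        for _ in 1:iters
            g = X * (X' * w .- y) ./ n       # gradient of the smooth part
            w = soft_threshold.(w .- step .* g, step * c)
        end
        return w
    end

The fixed step size is for brevity; for guaranteed convergence it should be 
at most the reciprocal of the Lipschitz constant of the smooth part (here 
opnorm(X * X') / n).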

SGDOptim <https://github.com/lindahua/SGDOptim.jl>

I announced this package a couple of weeks ago. It has since been 
fundamentally refactored and is now based on EmpiricalRisks. It aims to 
provide stochastic algorithms (e.g. SGD) for solving large-scale regression 
problems.
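
For intuition, here is a hypothetical sketch of plain SGD on squared loss 
for a linear model; it is illustrative only, not SGDOptim's actual 
interface. Each step evaluates the gradient of a single sample's loss and 
moves the weights against it, so the cost per step is independent of the 
dataset size.

    using LinearAlgebra, Random

    # X is d-by-n with one sample per column; y has length n
    function sgd_linreg(X, y; lr = 0.01, epochs = 10)
        w = zeros(size(X, 1))
        for _ in 1:epochs
            for i in randperm(size(X, 2))    # one pass, random order
                x = view(X, :, i)
                g = (dot(w, x) - y[i]) .* x  # gradient of 0.5 * (w'x - y_i)^2
                w .-= lr .* g
            end
        end
        return w
    end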


Cheers,
Dahua