This looks pretty awesome. In terms of simplicity, almost too good to be
true, really.
One comment: it might be faster and/or more numerically stable to use
backslash to solve the linear system instead of forming an explicit inverse,
i.e. change line 57 from
theta = xt_eta * inv(X * xt_eta .+ lambda_eye) * y
to
theta = xt_eta * ((X * xt_eta .+ lambda_eye) \ y)
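
To illustrate the difference (a standalone sketch with made-up sizes and a
stand-in for the EM weights, not code from the linked file): both forms give
the same result, but backslash factorizes the system matrix once instead of
forming its inverse, which is usually faster and loses less precision.

using LinearAlgebra              # for I and norm (current Julia; the 2014-era code predates this)

n, p = 200, 50
X = randn(n, p)
y = randn(n)
lambda = 0.1
eta = abs.(randn(p))             # stand-in for the EM reweighting vector
xt_eta = X' .* eta               # p x n: rows of X' scaled by eta
A = X * xt_eta + lambda * I      # n x n system matrix (lambda_eye in the original)

theta_inv = xt_eta * inv(A) * y  # explicit inverse
theta_bs  = xt_eta * (A \ y)     # factorize-and-solve via backslash
@show norm(theta_inv - theta_bs) # should be tiny
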
On Wednesday, July 30, 2014 2:44:21 PM UTC-7, Robert Feldt wrote:
>
> A short note and pointer to a Julia implementation of the (very recently
> proposed and beautifully simple) EM algorithm for regularized L0 regression
> (L0EM):
>
>
> https://github.com/robertfeldt/FeldtLib.jl/blob/master/spikes/L0EM_regularized_regression.jl
>
> that someone might find useful when doing regularized regression. I also
> experimented with an adaptive lambda (regularization weight) binary search
> scheme that can find interesting lambda values without having to run the
> usual grid of 100 different values. It seems useful and I will extend it
> with log-spaced binary search in the near future.
>
> Worth noting: the L0EM paper found that just using AIC or BIC to select
> lambda was very efficient, so cross-validation might not be needed, which
> gives a 500-1000 times speedup and makes the method useful on "big data".
>
> I have several other regularized regression methods implemented in Julia,
> so maybe it is time to collect them into a library. Does anyone know if
> there is something already out there (I have seen GLMNET.jl, which wraps
> the Fortran LASSO code)?
>
> Cheers,
>
> Robert Feldt
>
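
Regarding the AIC/BIC-based selection of lambda mentioned above: here is a
rough sketch of how a BIC-driven search over a log-spaced lambda grid could
look. It is only an illustration under my own assumptions: fitfn stands for
any (X, y, lambda) -> theta routine (it is not the API of the linked file),
and the BIC below is the standard Gaussian one, n*log(RSS/n) + k*log(n),
with k the number of nonzero coefficients.

# Standard Gaussian BIC for a linear model: n*log(RSS/n) + k*log(n),
# where k is the number of nonzero coefficients kept by the L0 penalty.
function bic_score(X, y, theta)
    n = length(y)
    rss = sum(abs2, y - X * theta)
    k = count(!iszero, theta)
    return n * log(rss / n) + k * log(n)
end

# Evaluate a log-spaced grid of lambdas and return the one with lowest BIC.
# fitfn is any function (X, y, lambda) -> theta; a binary / log-spaced
# search like the one described above could replace the plain grid.
function select_lambda(fitfn, X, y; lo = 1e-4, hi = 1e2, nsteps = 20)
    lambdas = exp10.(range(log10(lo), log10(hi), length = nsteps))
    scores = [bic_score(X, y, fitfn(X, y, lam)) for lam in lambdas]
    return lambdas[argmin(scores)]
end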