On Wednesday, July 30, 2014 11:08:45 PM UTC-4, Peter Simon wrote:

> Is there such a method in Julia?  Right now I'm multiplying both sides by 
> the transpose of the coefficient matrix to form and solve the normal 
> equations using \.  I suspect that this is not very efficient compared to a 
> sparse least squares solver.
>
> Thanks,
> --Peter
>

It depends on how large/sparse your overdetermined matrix A is. As 
factorizations go, you have a choice between performing the Cholesky 
factorization of A'A or the QR factorization of A. 
Note that if A = QR, then A'A = R'R, so R' is the Cholesky factor of A'A. 
However, it's often more stable to compute the QR factorization. The 
trouble with it is that if your A is large, Q is also large and is 
typically dense. In Matlab, there's a way to request the Q-less QR 
factorization:

  R = qr(A)

but I'm not seeing that in the Julia documentation. It might be there and 
just not apparent (to me). With R alone, you can solve the least-squares 
problem, but it's recommended to perform one step of iterative refinement.
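A sketch of that R-only approach (the "semi-normal equations" R'R x = A'b) in modern Julia syntax — my own example, not from the post. It assumes `qr` of a sparse matrix returns a SuiteSparse SPQR factorization whose `R` and `pcol` fields account for a column permutation:

```julia
using LinearAlgebra, SparseArrays

# Solve min ||Ax - b||_2 using only the R factor of a sparse QR
# (semi-normal equations R'R x = A'b), plus one refinement step.
A = [sparse(1.0I, 10, 10); sprand(90, 10, 0.3)]  # full column rank by construction
b = rand(100)

F = qr(A)             # SPQR: A[F.prow, F.pcol] == F.Q * F.R
R = F.R
Ap = A[:, F.pcol]     # column-permuted A, so Ap'Ap == R'R

x = zeros(10)
x[F.pcol] = R \ (R' \ (Ap' * b))   # semi-normal equations
r = b - A * x                      # one step of iterative refinement
dx = zeros(10)
dx[F.pcol] = R \ (R' \ (Ap' * r))
x += dx
```

The refinement step costs only one extra product with A and two more triangular solves, and recovers much of the accuracy lost relative to solving with Q.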

If factorization isn't an option, the first thing to try is to run the 
conjugate gradient method on the normal equations (don't form them 
explicitly; just encode the appropriate matrix-vector products). You may 
want to look at my linear operator library: https://github.com/dpo/linop.jl
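A minimal hand-rolled version of that idea (CGLS, i.e. CG applied to A'A x = A'b using only products with A and A'; my own sketch, not taken from linop.jl):

```julia
using LinearAlgebra, SparseArrays

# CGLS: conjugate gradients on the normal equations without forming A'A.
# Only A*v and A'*v are needed, so A can be any linear operator.
function cgls(A, b; maxiter=200, tol=1e-20)
    x = zeros(size(A, 2))
    r = b - A * x          # residual in the data space
    s = A' * r             # residual of the normal equations
    p = copy(s)
    γ = dot(s, s)
    for _ in 1:maxiter
        q = A * p
        α = γ / dot(q, q)
        x += α * p
        r -= α * q
        s = A' * r
        γnew = dot(s, s)
        γnew < tol && break
        p = s + (γnew / γ) * p
        γ = γnew
    end
    return x
end

A = [sparse(1.0I, 5, 5); sprand(20, 5, 0.4)]  # full column rank
b = rand(25)
x = cgls(A, b)
```

In exact arithmetic CG converges in at most n steps (n = number of columns), but on ill-conditioned problems the squared condition number of A'A is what governs the practical convergence rate.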

There are specialized iterative methods for least-squares problems that 
are typically more stable than CG (in the same way that QR is more stable 
than Cholesky). One of them, LSQR, is available in Julia 
here: https://github.com/JuliaLang/IterativeSolvers.jl/blob/master/src/lsqr.jl 
(I've never tested it). There are others.
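For reference, a hedged usage sketch — this assumes a recent IterativeSolvers.jl, which exports `lsqr`; check the package documentation for the current signature and keyword arguments:

```julia
using IterativeSolvers, SparseArrays

A = sprand(100, 10, 0.3)
b = rand(100)
x = lsqr(A, b)   # minimizes ||Ax - b||_2 without ever forming A'A
```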

I hope this helps.
