So, I've been screwing around with neural nets, and built a learner with simple linear regression output nodes. The nature of this beast is that you have to invert a big matrix which is close to singular. Being a numerics nerd, I remembered to add a small number to the diagonal to regularize it ... but QR decomposition is not so good for this.
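For anyone who wants to see the shape of the problem, here is a hedged NumPy sketch (made-up data, not my actual net; the names are NumPy's, nothing from J):

```python
import numpy as np

# Sketch of the setup: solve the normal equations (X^T X) w = X^T y
# for linear-regression output nodes, where X^T X is close to singular
# because two feature columns are nearly collinear.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
X[:, 4] = X[:, 3] + 1e-9 * rng.standard_normal(100)  # near-collinear columns
y = rng.standard_normal(100)

A = X.T @ X                              # near-singular Gram matrix
lam = 1e-6                               # the "small number" for the diagonal
A_ridge = A + lam * np.eye(A.shape[0])   # regularized system

w = np.linalg.solve(A_ridge, X.T @ y)    # LU-based solve handles this fine
```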

I spent a few hours wondering why LAPACK's gels and %. (domino) were not giving me a proper inverse, before I remembered that LU decomposition is the way to go in the near-singular case.

So, if you ever find yourself adding small diagonals to a matrix when '%.' barfs at you, reach for

inv=: [: gesv_jlapack_ ] ; [: =@i. #   NB. solve A x = I via LU (needs the math/lapack addon loaded)

It will behave much better near singularity.
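For anyone outside J: gesv solves A X = B by LU factorisation, so handing it B = identity yields the inverse. NumPy's np.linalg.solve is backed by the same LAPACK gesv routine, so the trick sketches like this (illustrative matrix, not the jlapack addon):

```python
import numpy as np

# What the verb above computes: gesv solves A X = B via LU, so with
# B = identity the solution X is A^-1.
A = np.array([[4.0, 2.0, 1.0],
              [2.0, 5.0, 3.0],
              [1.0, 3.0, 6.0]])
A += 1e-6 * np.eye(3)                  # the small diagonal nudge from the post

Ainv = np.linalg.solve(A, np.eye(3))   # solve A X = I, i.e. invert via LU
# A @ Ainv is (numerically) the identity
```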

-SL

----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
