It looks like the %. calculates the alpha and beta, and
then the +/ . * creates the fitted values: combinations of
the columns of y that are as close as possible to the given x.


I made some notes to myself on this topic that I would
like to put into the record here:



We will use the abbreviation mp for matrix product.  In J, mp =: +/ . *

For a square matrix y,   %. y  is the matrix inverse of y.

If y is not square, it must have more rows than columns, and
     %. y   is   (%. (|: y) mp y) mp (|: y)
   In math notation, %. y is   (YtY)'Yt   where ' = inverse, t = transpose.
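That (YtY)'Yt formula is the standard left pseudoinverse from least-squares theory. A small NumPy sketch (not J; just checking the algebra, with a made-up tall matrix Y):

```python
import numpy as np

# Tall matrix: more rows (observations) than columns (variables)
Y = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])

# (YtY)'Yt : inverse of (Y^T Y), times Y^T -- the left pseudoinverse
pinv = np.linalg.inv(Y.T @ Y) @ Y.T

# It is a left inverse: pinv @ Y is the 2x2 identity
print(np.allclose(pinv @ Y, np.eye(2)))   # True
```

Note that pinv here is 2x4, so it can only be an inverse from the left; Y @ pinv is 4x4 and is not the identity.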


x %. y is matrix division, which is defined as (%. y) mp x .

If x is a vector of observations and y is a matrix of explanatory variables,
x %. y gives the regression coefficients.
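The same computation in NumPy terms (an illustration, with invented data; np.linalg.lstsq plays the role of x %. y):

```python
import numpy as np

# y: intercept column plus one explanatory variable; x: noisy observations
y = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
x = np.array([2.0, 5.1, 7.9, 11.0])   # roughly 2 + 3*t

# Equivalent of J's  x %. y : the least-squares regression coefficients
coeffs, *_ = np.linalg.lstsq(y, x, rcond=None)
print(coeffs)   # close to [2, 3]: intercept ~2, slope ~3
```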

y mp x %. y  gives the projection of x onto the column space of y .
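Continuing the NumPy illustration above (same invented data), the projection y mp (x %. y) has the defining least-squares property that the residual is orthogonal to every column of y:

```python
import numpy as np

y = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
x = np.array([2.0, 5.1, 7.9, 11.0])

# coeffs is  x %. y ;  proj is  y mp x %. y
coeffs, *_ = np.linalg.lstsq(y, x, rcond=None)
proj = y @ coeffs

# The residual x - proj is orthogonal to the column space of y
print(np.allclose(y.T @ (x - proj), 0))   # True
```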


%. fails when the columns are dependent.  Regression using %. gives
poor results when the columns are almost dependent - use SVD,
part of the LAPACK addon, in that case.
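A NumPy sketch of why: with nearly dependent columns the normal-equations matrix YtY is catastrophically ill-conditioned, while an SVD-based solve (here np.linalg.pinv, standing in for the LAPACK SVD route; the data is contrived) still fits well:

```python
import numpy as np

# Two nearly dependent columns: the second is a tiny perturbation of the first
y = np.array([[1.0, 1.0 + 1e-8],
              [2.0, 2.0 - 1e-8],
              [3.0, 3.0 + 1e-8],
              [4.0, 4.0 - 1e-8]])
x = np.array([1.0, 2.0, 3.0, 4.0])

# The normal-equations route (YtY)'Yt squares the conditioning of y
print(np.linalg.cond(y.T @ y))          # very large

# An SVD-based solve still produces a good fit
coeffs = np.linalg.pinv(y) @ x          # pinv uses the SVD internally
print(np.linalg.norm(y @ coeffs - x))   # tiny residual
```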

Henry Rich 

> -----Original Message-----
> From: [EMAIL PROTECTED] 
> [mailto:[EMAIL PROTECTED] On Behalf Of June Kim
> Sent: Saturday, August 19, 2006 9:51 AM
> To: Programming forum
> Subject: [Jprogramming] Linear Least Squares
> 
> On 10E. Approximation section of J Dictionary, there is a definition
> for "linear least squares fit of x and y".
> 
> It's given as d1=: ] (] +/ .* %.) 1: ,. [
> 
> What is the expected way of using it, and what does the result mean? I
> first thought it returned the least squares estimators (beta and
> alpha), but it didn't.
> ----------------------------------------------------------------------
> For information about J forums see 
> http://www.jsoftware.com/forums.htm
