I've had a look at forsub this afternoon. I've cribbed Raul's
pre-factoring by
the matrix-diagonal, but otherwise they're my own ideas. Nothing out of
the ordinary, but fairly concise. "J-like"? - perhaps.
NB. these both assume that #x = #y
NB. Thunderbird will probably spoil the appearance
NB. successively append new elements to new vector z
forsuba=: 4 : 0
a =. x%d=.(+=&0)x|:~<0 1  NB. d: main diagonal (zeros guarded to 1); a: rows of x scaled by d
b =. y%d
z =. ''
for_i. i.#b do.
z =. z, (i{b) - +/ z * i{. i{ a
end.
)
NB. overwrite b in place
NB. saves on the number of variables but needs indexed assignment
forsubb=: 4 : 0
a =. x%d=.(+=&0)x|:~<0 1
b =. y%d
for_i. i.#b do. NB. could keep }.i.#b
b =. ((i{b) - +/ i{. b * i { a) i } b
end.
)
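For anyone following along outside J, here is a rough Python rendering of the same scheme as forsuba — pre-divide each row and the right-hand side by the diagonal, then accumulate z. The names are my own, not from the thread:

```python
def forsub(a, b):
    """Forward substitution: solve a @ z = b for lower-triangular a,
    pre-dividing each row and b by the diagonal as forsuba does."""
    d = [a[i][i] for i in range(len(a))]                      # main diagonal
    a = [[aij / d[i] for aij in row] for i, row in enumerate(a)]
    b = [bi / d[i] for i, bi in enumerate(b)]
    z = []
    for i in range(len(b)):
        # subtract the already-solved part; row i is zero beyond column i
        z.append(b[i] - sum(zj * aij for zj, aij in zip(z, a[i])))
    return z

L1 = [[1, 0, 0], [2, -8, 0], [3, -15, -12]]
print(forsub(L1, [5, 18, 6]))   # [5.0, -1.0, 2.0], matching L1 forsub b1
```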
They're both marginally faster than forsub, though nothing to
get excited about! I didn't find much change in performance
with the +/ . * form of sum-product. But they are quite
nicely concise. I feel there's a closed form hiding in there
somewhere, to replace the explicit loop, but I haven't found
it yet!
Mike
On 02/09/2015 04:42, Thomas McGuire wrote:
Thanks, I will try out your changes. Though in my implementation I was trying to
avoid multiplying the zero entries, and your routines ignore that:
z1=. (i{b) - (i{a) +/ .* z
Here you multiply by all the elements of a row and let the zeros discard the
unnecessary data in z at that particular step. It's certainly cleaner, but I
will have to run a comparison to see whether it matters.
What I was hoping for was to pull out the triangular data above the diagonal
and operate on that, but I can't find a nice terse J phrase to do it, the way
you could if you were pulling out the diagonal with oblique.
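For what it's worth, outside J the strictly-upper-triangular data per row is just a ragged slice. A minimal Python sketch of the idea (my own illustration, not an answer to the J question):

```python
def upper_strict(a):
    """Rows of the strictly-upper-triangular part of a square matrix."""
    return [row[i + 1:] for i, row in enumerate(a)]

U1 = [[1, 2, 3], [0, 1, 0], [0, 0, 1]]
print(upper_strict(U1))   # [[2, 3], [0], []]
```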
I haven't had a chance to completely work out the timing efficiencies of the
original routines. But here is a quick check using a 7x7 matrix of binomial
coefficients:
] A=: !/~ i.7x
1 1 1 1 1 1 1
0 1 2 3 4 5 6
0 0 1 3 6 10 15
0 0 0 1 4 10 20
0 0 0 0 1 5 15
0 0 0 0 0 1 6
0 0 0 0 0 0 1
The timing of the original algorithm
timespacex 'A baksub (1+i.7)'
9.6e_5 14848
timespacex '(|:)A forsub (1+i.7)'
8.9e_5 12416
9.6e_5 + 8.9e_5
0.000185
timespacex '(1+i.7) %. A'
0.00067 30976
There seems to be some divergence as the size of the matrix increases (whether
it is enough to worry about I haven't determined yet).
The real reason I am using it is that I am trying to implement a non-linear
least squares (NLLS) algorithm from R in J (see ).
They have a routine that uses the CR algorithm to implement NLLS, and then,
because you end up with upper and lower triangular matrices, uses forward
substitution and back substitution to solve the intermediate matrix equations.
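That pipeline — factor once, then one forward and one back substitution per right-hand side — can be sketched in Python using the corrected L1/U1/b1 values from elsewhere in this thread (my own sketch, assuming A1 = L1 U1 as those posts show):

```python
def forsub(a, b):
    # solve lower-triangular a @ z = b, top row downward
    z = []
    for i in range(len(b)):
        s = sum(a[i][j] * z[j] for j in range(i))
        z.append((b[i] - s) / a[i][i])
    return z

def baksub(a, b):
    # solve upper-triangular a @ z = b, bottom row upward
    n = len(b)
    z = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(a[i][j] * z[j] for j in range(i + 1, n))
        z[i] = (b[i] - s) / a[i][i]
    return z

L1 = [[1, 0, 0], [2, -8, 0], [3, -15, -12]]
U1 = [[1, 2, 3], [0, 1, 0], [0, 0, 1]]
b1 = [5, 18, 6]
x = baksub(U1, forsub(L1, b1))   # solves (L1 U1) x = b1
print(x)                          # [1.0, -1.0, 2.0]
```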
Tom McGuire
On Sep 1, 2015, at 10:54 AM, Raul Miller <rauldmil...@gmail.com> wrote:
I'm not sure I'd get rid of the for loop.
That said, here's a rephrase to make it more "j-like":
forsub=: 4 :0
d=. (<0 1)|:x
a=. x%d
b=. y%d
z=. (#b){.{.b
for_i.}.i.#b do.
z1=. (i{b) - (i{a) +/ .* z
z=. z1 i} z
end.
)
NB. Back substitution - solve an upper triangular system
NB. Usage: matrix baksub vector
baksub =: 4 : 0
D=. (<0 1)|:x
B =. x%D
Y =. y%D
z =. (-#Y){.{:Y
for_i. }. i.-#Y do.
z1 =. (i{Y) - (i{B)+/ .* z
z =. z1 i} z
end.
)
Note that I have changed the algorithm slightly from your
implementation. So try it out on some other test cases to make sure
that I have not introduced any errors...
And of course, note also:
U1 baksub L1 forsub b1
1 _1 2
(|."1 U1) forsub&.|. L1 forsub b1
1 _1 2
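The second identity above reads: back substitution on U is forward substitution on U reversed along both axes, with the right-hand side reversed and the result un-reversed. A rough Python check of that reading (my own names):

```python
def forsub(a, b):
    # solve lower-triangular a @ z = b
    z = []
    for i in range(len(b)):
        z.append((b[i] - sum(a[i][j] * z[j] for j in range(i))) / a[i][i])
    return z

def baksub_via_forsub(u, y):
    # the |."1 and &.|. in the J line: reverse u along both axes,
    # reverse y, forward-substitute, then reverse the result back
    u_rev = [row[::-1] for row in u[::-1]]
    return forsub(u_rev, y[::-1])[::-1]

U1 = [[1, 2, 3], [0, 1, 0], [0, 0, 1]]
print(baksub_via_forsub(U1, [5, -1, 2]))   # [1.0, -1.0, 2.0]
```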
That said, if you are doing this purely for efficiency reasons, those
efficiencies don't show up in your test data. (Running each timing
test twice to hint at some of the timing variances which can come from
outside influences such as OS scheduling):
timespacex 'b1 %. A1'
2.3e_5 7168
timespacex 'b1 %. A1'
3.5e_5 7168
timespacex 'U1 baksub L1 forsub b1'
6.2e_5 9344
timespacex 'U1 baksub L1 forsub b1'
3e_5 9344
Or, using your original definitions for forsub and baksub:
timespacex 'U1 baksub L1 forsub b1'
5.3e_5 11392
timespacex 'U1 baksub L1 forsub b1'
9.3e_5 11392
Thanks,
--
Raul
On Tue, Sep 1, 2015 at 1:47 AM, Thomas McGuire <tmcguir...@gmail.com> wrote:
Sorry, I should have rerun the examples rather than just picking through my
debugging session.
The original Matrix
]A1 =: 3 3$1 2 3 2 _4 6 3 _9 _3
1 2 3
2 _4 6
3 _9 _3
LU decomposition triangular matrix L1 and U1 are
L1
1 0 0
2 _8 0
3 _15 _12
U1
1 2 3
0 1 0
0 0 1
b1 =: 5 18 6 NB. this is the right-hand side of a system of equations of the form Ax = b
Matrix divide solution
b1 %. A1
1 _1 2
Solution using the LU decomposition matrices
]y1 =: L1 forsub b1
5 _1 2
U1 baksub y1
1 _1 2
Tom McGuire
On Aug 31, 2015, at 10:23 PM, Raul Miller <rauldmil...@gmail.com> wrote:
Can you double check whether you have posted what you intended?
I get
b1 %. A1
4.15625 1.375 _0.635417
L1 forsub b1
5 1.375 _0.635417
U1 baksub L1 forsub b1
0.517014 0.941667 _0.0635417
which does not correspond to the result you displayed.
Thanks,
--
Raul
On Mon, Aug 31, 2015 at 9:54 PM, Thomas McGuire <tmcguir...@gmail.com> wrote:
I have been playing around with creating forward substitution and backward
substitution J scripts to solve the lower and upper triangular systems of the
type you would get by performing LU decomposition on a square matrix. I could
certainly just use matrix divide, but that starts to slow down as the size of
the NxN matrix gets bigger. I have come up with a partly J-like solution, but
the script requires a for loop. I was wondering if someone has a more J-ish
solution? Here is what I have done so far.
BACKGROUND and SCRIPTS:
From the LU decomp paper on the J website:
http://www.jsoftware.com/jwiki/Essays/LU%20Decomposition
There is a routine to perform the LU decomposition of a matrix.
Rather than use this though I cheated and borrowed a worked example from the
web:
The original matrix before LU
]A1 =: 3 3$ 1 2 3 2 _4 6 3 _9 _3
1 2 3
2 _4 6
3 _9 _3
b1 =: 5 _1 2
]L1 =: 3 3$1 0 0 2 _8 0 3 _15 _12
1 0 0
2 _8 0
3 _15 _12
]U1 =: 3 3$3 4 5 0 2 8 0 0 10
3 4 5
0 2 8
0 0 10
The forward substitution and backward substitution routines are as follows:
NB. Forward substitution - solve a lower triangular system
NB. Usage: matrix forsub vector
forsub =: 4 : 0
a =. x
b =. y
z =. (#b) $ 0
z =. ((0 {b)% (<0 0) {a) 0} z
for_i. 1 + i. (#b)-1 do.
z1 =.((<i,i ){a)%~ ( (i{b) - (+/ (i{. (i {a))*(i{. z) ) )
z =. z1 i}z
end.
z
)
NB. Back substitution - solve an upper triangular system
NB. Usage: matrix baksub vector
baksub =: 4 : 0
B =. x
Y =. y
N =. _1 + #Y
z =. (N+1)$0
z =. ((N {Y)% (<N,N) {B) N} z
for_i. |. i. N do.
z1 =. ((<i,i){B)%~ ((i{Y) - (+/ ((i+1)}. (i {B))*((i+1)}. z)))
z =. z1 i} z
end.
z
)
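As a cross-check, here is a rough Python rendering of baksub (my own sketch; it is verified with the self-consistent U1 and y1 from the corrected post elsewhere in the thread, since the matrices quoted just above turned out to be inconsistent):

```python
def baksub(B, Y):
    """Back substitution: solve B @ z = Y for upper-triangular B,
    working from the last row upward, as in the J verb above."""
    N = len(Y) - 1
    z = [0.0] * (N + 1)
    z[N] = Y[N] / B[N][N]                 # last unknown comes directly
    for i in range(N - 1, -1, -1):
        # subtract the already-solved tail of row i, divide by the diagonal
        s = sum(B[i][j] * z[j] for j in range(i + 1, N + 1))
        z[i] = (Y[i] - s) / B[i][i]
    return z

U1 = [[1, 2, 3], [0, 1, 0], [0, 0, 1]]   # corrected U1 from the thread
print(baksub(U1, [5, -1, 2]))             # [1.0, -1.0, 2.0]
```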
Solving linear equations with just matrix divide.
b1 %. A1
1 _1 2
To solve this set of linear equations with the L and U decomps you do the
following:
]y1 =: L1 forsub b1
5 _1 2
]x1 =: U1 baksub y1
1 _1 2
which agrees with the matrix divide version
These matrices were taken from the internet paper found here:
https://www.utdallas.edu/dept/abp/PDF_Files/LinearAlgebra_Folder/LU_Decomposition.pdf
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm