Conjugate Gradient Algorithm

1. Compute r0 := b - A*x0,  p0 := r0
2. For j = 0, 1, ..., until convergence Do:
3.    aj   := (rj, rj) / (A*pj, pj)
4.    xj+1 := xj + aj*pj
5.    rj+1 := rj - aj*A*pj
6.    bj   := (rj+1, rj+1) / (rj, rj)
7.    pj+1 := rj+1 + bj*pj
8. End Do
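The loop above can be sketched in NumPy like this (the function name, tolerance and iteration cap are my own choices; A is assumed symmetric positive definite):

```python
import numpy as np

def cg(A, b, x0, tol=1e-10, maxiter=100):
    x = x0.astype(float)
    r = b - A @ x          # step 1: r0 := b - A*x0
    p = r.copy()           #         p0 := r0
    for _ in range(maxiter):
        rr = r @ r
        if np.sqrt(rr) < tol:          # "until convergence"
            break
        Ap = A @ p
        a = rr / (p @ Ap)              # aj   := (rj,rj)/(A*pj,pj)
        x = x + a * p                  # xj+1 := xj + aj*pj
        r = r - a * Ap                 # rj+1 := rj - aj*A*pj
        beta = (r @ r) / rr            # bj   := (rj+1,rj+1)/(rj,rj)
        p = r + beta * p               # pj+1 := rj+1 + bj*pj
    return x
```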


Derivation for two steps in one go

Compute r0 := b - A*x0,  p0 := r0
For j = 0, 2, 4, ..., until convergence Do:
   aj   := (rj, rj) / (A*pj, pj)
   xj+1 := xj + aj*pj
   rj+1 := rj - aj*A*pj
   bj   := (rj+1, rj+1) / (rj, rj)
   pj+1 := rj+1 + bj*pj

   aj+1 := (rj+1, rj+1) / (A*pj+1, pj+1)
   xj+2 := xj+1 + aj+1*pj+1
   rj+2 := rj+1 - aj+1*A*pj+1
   bj+1 := (rj+2, rj+2) / (rj+1, rj+1)
   pj+2 := rj+2 + bj+1*pj+1

End Do
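Numerically, the two-steps-per-iteration loop is just the single-step update applied twice per pass, which gives a reference to check any merged formulas against (a sketch; function names are my own):

```python
import numpy as np

def cg_step(A, x, r, p):
    # one classical CG update: (xj, rj, pj) -> (xj+1, rj+1, pj+1)
    Ap = A @ p
    a = (r @ r) / (p @ Ap)
    x_new = x + a * p
    r_new = r - a * Ap
    beta = (r_new @ r_new) / (r @ r)
    p_new = r_new + beta * p
    return x_new, r_new, p_new

def cg_two_steps(A, b, x0, n_outer):
    x = x0.astype(float)
    r = b - A @ x
    p = r.copy()
    for _ in range(n_outer):                # j = 0, 2, 4, ...
        x, r, p = cg_step(A, x, r, p)       # -> (xj+1, rj+1, pj+1)
        x, r, p = cg_step(A, x, r, p)       # -> (xj+2, rj+2, pj+2)
    return x
```

A merged two-step update expressed only in xj, rj, pj and A should produce exactly the same iterates as this version.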

All the equations must rely only on the following initial variables,
rj, pj, xj, and on the constant matrix A.

xj+2 has to be a function of rj, pj, xj and A; it cannot contain rj+1,
pj+1, xj+1, and the same goes for rj+2 and pj+2.
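For example, substituting step by step gives for xj+2 (where aj, bj
and aj+1 still contain inner products that would have to be expanded
the same way):

xj+2 = xj+1 + aj+1*pj+1
     = (xj + aj*pj) + aj+1*(rj+1 + bj*pj)
     = xj + aj*pj + aj+1*(rj - aj*A*pj + bj*pj)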

Some explanation:

(rj, rj) is the scalar product of the vectors rj and rj.
A*pj is a matrix-vector multiplication, where A is a matrix and pj is
a vector.
aj and bj are parameters.

I hope it is now clearer what I want to do... As I said, the
expressions that I get are pretty big, and I would like to use SAGE to
try to simplify them and make them more compact. If I do 3 or more
steps in one go, the expressions become even bigger.
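One way to sketch the substitution symbolically is with SymPy's matrix expressions (Sage's symbolic tools would work similarly). To keep the result manageable, the scalar coefficients aj, bj, aj+1 are kept here as opaque symbols rather than the inner-product ratios they actually stand for; this is my own simplification, not the full expansion:

```python
from sympy import symbols, MatrixSymbol, expand

n = symbols('n', integer=True, positive=True)
A  = MatrixSymbol('A', n, n)
xj = MatrixSymbol('xj', n, 1)
rj = MatrixSymbol('rj', n, 1)
pj = MatrixSymbol('pj', n, 1)
aj, bj, aj1 = symbols('aj bj aj1')   # scalar parameters, left unexpanded

# one CG step in terms of the base variables
xj1 = xj + aj * pj
rj1 = rj - aj * (A * pj)
pj1 = rj1 + bj * pj

# second step, substituted through the first
xj2 = expand(xj1 + aj1 * pj1)
print(xj2)   # xj+2 in terms of xj, rj, pj and A only
```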

Thanks
Branimir

On Sep 8, 3:23 pm, Jason Grout <[email protected]> wrote:
> On 9/8/10 7:45 AM, BSC-BCN wrote:
>
> I hope you can understand
>
> > me...
>
> I don't think I really do, but this thread might contain some useful
> things for you:
>
> http://groups.google.com/group/sage-devel/browse_thread/thread/cafb48...
>
> Thanks,
>
> Jason

-- 
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/sage-support
URL: http://www.sagemath.org
