This is the code where I used the gmres function in MATLAB:

function y = afun(x)
    y = [0; x(1:n-1)] + [x(2:n); 0];
end

n = 21;
x = ones(n,1);
b = afun(x);

v = gmres(@afun, -b)

but I do not know how to do this in Julia. How do I pass afun into gmres 
when the input A (resp. Pl, Pr) is a function returning A*x? I tried it as 
below. Please help me!

function afun(x)
    y = [0; x[1:n-1]] + [x[2:n]; 0]
    return y
end

n = 21
x = ones(n,1)
b = afun(x)

v = IterativeSolvers.gmres(afun(x), -b)
Output: ERROR: LoadError: MethodError: `*` has no method matching 
*(::Array{Float64,1}, ::Array{Float64,1}) 


v = IterativeSolvers.gmres(afun, -b)
Output: ERROR: LoadError: MethodError: `one` has no method matching 
one(::Type{Any})
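For reference, here is the kind of thing I was hoping would work: a small wrapper type that turns afun into an object supporting A*x (a sketch in newer Julia syntax; ShiftOp is just a name I made up, and I am only assuming gmres needs size, eltype, and *, which may not be right):

```julia
# Sketch of a matrix-free operator: ShiftOp(n) applies the same
# tridiagonal shift stencil as afun, without storing a matrix.
struct ShiftOp
    n::Int
end

Base.size(A::ShiftOp) = (A.n, A.n)
Base.size(A::ShiftOp, d::Integer) = d <= 2 ? A.n : 1
Base.eltype(::ShiftOp) = Float64
Base.:*(A::ShiftOp, x::AbstractVector) = [0; x[1:end-1]] + [x[2:end]; 0]

A = ShiftOp(21)
x = ones(A.n)
b = A * x            # same result as afun(x)

# The call I was hoping would then work (needs IterativeSolvers installed):
# v = IterativeSolvers.gmres(A, -b)
```

Alternatively, LinearMaps.jl can wrap a plain function directly, e.g. gmres(LinearMap(afun, n), -b), but I have not verified which of these the package actually supports.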

Thanks!

On Sunday, October 13, 2013 at 00:26:58 UTC+7, Jason Pries wrote:
>
> When I was working on my GMRES implementation, I noticed that calls to 
> "dot" were 13x faster if I avoided sub-indexing by storing the subspace 
> vectors as an "array of vectors" instead of in a matrix. Since most of the 
> work in GMRES goes into constructing the orthonormal basis (for a large 
> number of inner iterations), this speeds things up considerably. This will 
> probably hold true for any method which constructs an orthonormal basis.
> x    = rand(1000,2);
> y    = Array(Array{Float64,1},2);
> y[1] = rand(1000);
> y[2] = rand(1000);
> z    = rand(1000);
>
> tic();
> for i = 1:1e6
>      a = dot(z,y[1]);
> end
> toc();
>
> #elapsed time: 0.529180537 seconds
>
> tic();
> for i = 1:1e6
>      a = dot(z,sub(x,1:1000,1));
> end
> toc();
>
> #elapsed time: 6.843639198 seconds
>
>
> In terms of interface, it makes sense to me to support supplying matrices 
> and preconditioners as functions. As a trivial example, I would expect 
> both GMRES calls to be valid:
> A    = rand(1000,1000);
> L(x) = A*x;
> b    = rand(1000);
>
> x1 = gmres(A,b);
> x2 = gmres(L,b);
> The only real problem is the different syntax (A*x versus L(x)).
>
> I strongly feel the interface should support both "left" and "right" 
> preconditioners (where applicable). One thing that is a constant bother to 
> me with the MATLAB interface (to GMRES at least) is that it only supports 
> left preconditioners (i.e. solving M*b = M*A*x) and not right 
> preconditioners (i.e. b = A*M*y, x = M*y). The problem with left 
> preconditioning is that instead of minimizing norm(A*x-b), it minimizes 
> norm(M*(A*x-b)). Much of the time, norm(M*(A*x-b)) being small does not 
> translate into norm(A*x-b) being small. Right preconditioning avoids this 
> problem.
>
> I've tried to include these elements in my implementation 
> (https://github.com/priesj/KrylovSolvers.jl/blob/patch-1/src/solvers.jl), 
> but admittedly, I'm fairly new to Julia. I have a feeling I may be doing 
> some superfluous things with types and/or breaking some conventions/best 
> practices.
>
> On Saturday, October 12, 2013 5:03:05 AM UTC-4, Andreas Noack Jensen wrote:
>
>> Unfortunately I have missed this thread and some double work has been 
>> done. I also needed some of this functionality for my thesis, in which I 
>> solve some large Toeplitz systems, and I also settled on translating the 
>> MATLAB versions of the http://www.netlib.org/templates code instead of 
>> writing from scratch. I couldn't find any license information, so I wrote 
>> an email to Jack Dongarra, and it is all modified BSD. 
>>
>> I think that we should try to work together on this and find the best 
>> interface. Mine is still pretty rough, but what I have done so far is here: 
>> https://github.com/andreasnoackjensen/IterativeLinearSolvers.jl. I have 
>> tried to make the implementations slightly more memory efficient, and 
>> therefore I have introduced the allocation-free functions 
>> A_mul_B!(α,A,x,β,y), which tries to mimic BLAS' gemv, and solve!(A,b), 
>> which writes the solution to b. The problem is that my solvers don't work 
>> with Julia as it is right now because of these new functions. There are 
>> some issues open on allocation-free versions of multiplication and system 
>> solving, but no decisions have been made as far as I know.
>>
>> Regarding Steven G. Johnson's comments about AbstractMatrix, I think it 
>> would be useful to state more clearly how AbstractMatrix should be 
>> interpreted, as there has been confusion about it more than once. 
>>
>
