What if I use -pc_type asm? Will the communication be avoided in this special 
case?

Thanks,
Qin


________________________________
 From: Barry Smith <[email protected]>
To: Qin Lu <[email protected]> 
Cc: Petsc-users <[email protected]> 
Sent: Monday, August 25, 2014 8:19 PM
Subject: Re: [petsc-users] Solve of subdomains without connections
 


On Aug 25, 2014, at 5:18 PM, Qin Lu <[email protected]> wrote:

> Hello,
>  
> I am using the PETSc KSP solver to solve a problem on a physical domain. The 
> domain is split into subdomains in such a way that there are no connections 
> between them, but I still have to solve the whole domain as a single linear 
> system. My questions are:
>  
> 1. Does PETSc detect that the matrix is a block-diagonal matrix and solve it 
> efficiently?
> 2. In the parallel solve, each subdomain is assigned to a separate process. Does 
> PETSc solve the system efficiently by avoiding all unnecessary parallel 
> message passing, since there are no connections between processes?

   If you use the block Jacobi preconditioner then there will be no communication 
during the matrix-vector product nor during the application of the preconditioner. 
However, the global reductions for the default Krylov method, GMRES, will still 
occur.  To eliminate the global reductions, use for the solve

   -ksp_type preonly -pc_type bjacobi -sub_ksp_type gmres (or whatever Krylov 
method you want on each process) -sub_pc_type ilu (or whatever preconditioner 
you want on each process).

   Now there will be no communication during the linear solve.
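
   For illustration, a minimal sketch of setting the same solver configuration from 
C code rather than from the command line (this assumes the Mat A and Vecs b, x have 
already been assembled elsewhere; the function name SolveDecoupledBlocks is made up 
for this example):

   #include <petscksp.h>

   PetscErrorCode SolveDecoupledBlocks(Mat A, Vec b, Vec x)
   {
     KSP            ksp, *subksp;
     PC             pc, subpc;
     PetscInt       nlocal, first, i;
     PetscErrorCode ierr;

     PetscFunctionBeginUser;
     ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
     ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);

     /* outer "solve" just applies the preconditioner once: no outer Krylov
        iterations, hence no global reductions */
     ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);
     ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
     ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
     ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

     /* the per-block (sub) KSPs are only available after setup */
     ierr = KSPSetUp(ksp);CHKERRQ(ierr);
     ierr = PCBJacobiGetSubKSP(pc, &nlocal, &first, &subksp);CHKERRQ(ierr);
     for (i = 0; i < nlocal; i++) {
       ierr = KSPSetType(subksp[i], KSPGMRES);CHKERRQ(ierr); /* local Krylov method  */
       ierr = KSPGetPC(subksp[i], &subpc);CHKERRQ(ierr);
       ierr = PCSetType(subpc, PCILU);CHKERRQ(ierr);         /* local preconditioner */
     }

     ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
     ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
     PetscFunctionReturn(0);
   }

   Because KSPSetFromOptions() is called, the run-time options above can still be 
used to override these choices on each process.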

  Barry


>  
> Thanks,
> Qin
> 
> 
