Re: [petsc-users] PETSc with 64 bit indices and MKL Sparse BLAS fails to build

2022-09-25 Thread Bro H
Barry, thanks, it appears to be working correctly after following your latest suggestion. I used the following configuration command: ./configure --force --prefix=/opt/libs/petsc COPTFLAGS="-O3" CXXOPTFLAGS="-O3" FOPTFLAGS="-O3" --with-precision=double --with-64-bit-indices

Re: [petsc-users] PCApplySymmetricRight for PCBJACOBI (fwd)

2022-09-25 Thread Barry Smith
Thanks for the bug report; your fix is correct. I have corrected it in PETSc and also added support for multiple blocks per MPI rank in https://gitlab.com/petsc/petsc/-/merge_requests/5678 Barry > > > -- Forwarded message

Re: [petsc-users] PETSc with 64 bit indices and MKL Sparse BLAS fails to build

2022-09-25 Thread Bro H
Sorry, I forgot to send a copy to the petsc-users mailing list when I first replied. My first reply is below. On Sun, Sep 25, 2022 at 11:37 AM Bro H wrote: > > Barry, thank you for answering. I did some further testing. My MKL > version is 20220002 as detected by PETSc. I tried to compile one of > the

Re: [petsc-users] PETSc with 64 bit indices and MKL Sparse BLAS fails to build

2022-09-25 Thread Barry Smith
Likely you can fix the problem by adding #if defined(PETSC_HAVE_MKL_INTEL_ILP64) #define MKL_ILP64 #endif before the #include in src/mat/impls/aij/seq/aijmkl/aijmkl.c and src/mat/impls/baij/seq/baijmkl/baijmkl.c Please let us know if this resolves the problem. Barry > On Sep 25,

Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-25 Thread 晓峰 何
If I assign a preconditioner to A11 with these command-line options: -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type ilu -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type ilu then I get this error: "Could not locate a solver type for factorization type ILU and matrix type schurcomplement" How
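The error arises because the second split is a MATSCHURCOMPLEMENT, which is applied matrix-free and so cannot be factored by ILU. A hedged sketch of options (not taken from this thread) that instead hand ILU an explicitly assembled approximation of S built from diag(A00):

```shell
# Sketch: Schur-complement fieldsplit with an assembled approximation
# ("selfp", i.e. A11 - A10 inv(diag(A00)) A01) that ILU can factor.
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_precondition selfp
-fieldsplit_0_ksp_type gmres
-fieldsplit_0_pc_type ilu
-fieldsplit_1_ksp_type gmres
-fieldsplit_1_pc_type ilu
```

How well selfp works depends on how much diag(A00) resembles A00; as Jed notes below in the thread, a problem-specific preconditioner for S (e.g. a mass matrix) is usually better.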

Re: [petsc-users] C++ error! MPI_Finalize() could not be located!

2022-09-25 Thread Laryssa Abdala
Amazing. Thank you so much for your clear explanation, Barry! Laryssa On Sun, Sep 25, 2022 at 5:06 PM Barry Smith wrote: > >It appears you want to use MPI (if not pass --with-mpi=0 also). > >Thus you must either > > 1) have the MPI compiler wrappers in your path (mpicc, mpicxx, >

Re: [petsc-users] PCApplySymmetricRight for PCBJACOBI (fwd)

2022-09-25 Thread Abylay Zhumekenov
Great, thanks! On Sun, 25 Sep 2022, 17:55 Barry Smith, wrote: > > Thanks for the bug report; your fix is correct. I have corrected it in > PETSc and also added support for multiple blocks per MPI rank in > https://gitlab.com/petsc/petsc/-/merge_requests/5678 > > Barry > > > > > --

Re: [petsc-users] C++ error! MPI_Finalize() could not be located!

2022-09-25 Thread Barry Smith
It appears you want to use MPI (if not pass --with-mpi=0 also). Thus you must either 1) have the MPI compiler wrappers in your path (mpicc, mpicxx, mpif90) or use --with-mpi-dir=somedirectory where MPI is installed and do NOT provide the compiler names (since MPI provides compiler
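Barry's two routes, sketched as configure invocations; the MPI install path below is a placeholder, not taken from this thread:

```shell
# Route 1: MPI compiler wrappers (mpicc, mpicxx, mpif90) already in PATH
./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90

# Route 2: point configure at the MPI installation directory
# (placeholder path) and do NOT also pass compiler names,
# since MPI supplies its own wrappers
./configure --with-mpi-dir=/opt/mpich
```

Mixing the two (naming compilers while also giving --with-mpi-dir) is what typically produces the "MPI_Finalize() could not be located" configure failure.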

Re: [petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-25 Thread Jed Brown
The usual issue is that you need a preconditioner for the Schur complement S = A11 - A01 A00^{-1} A10. For incompressible elasticity, this S is spectrally equivalent to a scaled mass matrix. 晓峰 何 writes: > Hi all, > > I have a linear system formed from structural mechanics, and there exists

[petsc-users] Solve Linear System with Field Split Preconditioner

2022-09-25 Thread 晓峰 何
Hi all, I have a linear system formed from structural mechanics, and there are zeros among the diagonal entries: A = (A00 A01; A10 A11), where A00 is invertible and the diagonal entries of A11 are all zero. The GMRES method with an ILU preconditioner in PETSc was used to solve this