Re: [petsc-users] MUMPS Error 'INFOG(1)=-3 INFO(2)=3' (SPARSE MATRIX INVERSE)

2023-07-27 Thread Pierre Jolivet
MUMPS errors are documented in section 8 of 
https://mumps-solver.org/doc/userguide_5.6.1.pdf
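
(In that table, INFOG(1)=-3 means MUMPS was called with an invalid value of
JOB, e.g., a solve requested before a successful factorization.) If it helps,
the raw MUMPS diagnostics can also be queried through the PETSc factored
matrix; a minimal sketch, assuming F was obtained with
MatGetFactor(A,MATSOLVERMUMPS,MAT_FACTOR_LU,&F):

  PetscInt infog1,info2;
  ierr = MatMumpsGetInfog(F,1,&infog1);CHKERRQ(ierr); /* INFOG(1): global error code */
  ierr = MatMumpsGetInfo(F,2,&info2);CHKERRQ(ierr);   /* INFO(2): extra error detail */

Running with -mat_mumps_icntl_4 3 makes MUMPS print verbose diagnostics to
stdout as well.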

Thanks,
Pierre

> On 27 Jul 2023, at 3:50 PM, maitri ksh  wrote:
> 
> I am using 'MatMumpsGetInverse()' to get the inverse of a sparse matrix. I am
> using parts of the ex214.c code to get the inverse, but I get an error that
> seems to come from the MUMPS library. Any suggestions?
> 
> ERROR:
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------
> [0]PETSC ERROR: Error in external library
> [0]PETSC ERROR: Error reported by MUMPS in solve phase: INFOG(1)=-3 INFO(2)=3
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.19.3, unknown
> [0]PETSC ERROR: ./MatInv_MUMPS on a arch-linux-c-debug named LAPTOP-0CP4FI1T 
> by maitri Thu Jul 27 16:35:02 2023
> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ 
> --with-fc=gfortran --download-mpich --download-fblaslapack --with-matlab 
> --with-matlab-dir=/usr/local/MATLAB/R2022a --download-hdf5 --with-hdf5=1 
> --download-mumps --download-scalapack --download-parmetis --download-metis 
> --download-ptscotch --download-bison --download-cmake
> [0]PETSC ERROR: #1 MatMumpsGetInverse_MUMPS() at 
> /home/maitri/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2720
> [0]PETSC ERROR: #2 MatMumpsGetInverse() at 
> /home/maitri/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2753
> [0]PETSC ERROR: #3 main() at MatInv_MUMPS.c:74
> [0]PETSC ERROR: No PETSc Option Table entries
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-ma...@mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_SELF, 76) - process 0
> [unset]: PMIU_write error; fd=-1 buf=:cmd=abort exitcode=76 
> message=application called MPI_Abort(MPI_COMM_SELF, 76) - process 0
> :
> system msg for write_line failure : Bad file descriptor
> 
> 
> Maitri
> 



[petsc-users] MUMPS Error 'INFOG(1)=-3 INFO(2)=3' (SPARSE MATRIX INVERSE)

2023-07-27 Thread maitri ksh
I am using 'MatMumpsGetInverse()' to get the inverse of a sparse matrix. I
am using parts of the ex214.c code to get the inverse, but I get an error
that seems to come from the MUMPS library. Any suggestions?

ERROR:
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------
[0]PETSC ERROR: Error in external library
[0]PETSC ERROR: Error reported by MUMPS in solve phase: INFOG(1)=-3
INFO(2)=3
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.19.3, unknown
[0]PETSC ERROR: ./MatInv_MUMPS on a arch-linux-c-debug named
LAPTOP-0CP4FI1T by maitri Thu Jul 27 16:35:02 2023
[0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
--with-fc=gfortran --download-mpich --download-fblaslapack --with-matlab
--with-matlab-dir=/usr/local/MATLAB/R2022a --download-hdf5 --with-hdf5=1
--download-mumps --download-scalapack --download-parmetis --download-metis
--download-ptscotch --download-bison --download-cmake
[0]PETSC ERROR: #1 MatMumpsGetInverse_MUMPS() at
/home/maitri/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2720
[0]PETSC ERROR: #2 MatMumpsGetInverse() at
/home/maitri/petsc/src/mat/impls/aij/mpi/mumps/mumps.c:2753
[0]PETSC ERROR: #3 main() at MatInv_MUMPS.c:74
[0]PETSC ERROR: No PETSc Option Table entries
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-ma...@mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_SELF, 76) - process 0
[unset]: PMIU_write error; fd=-1 buf=:cmd=abort exitcode=76
message=application called MPI_Abort(MPI_COMM_SELF, 76) - process 0
:
system msg for write_line failure : Bad file descriptor


Maitri
maitri@LAPTOP-0CP4FI1T:~/my_executables$ ./MatInv_MUMPS
using LU factorization
[... same MUMPS error trace as above ...]

/* headers reconstructed from context; petscmat.h covers the Mat, viewer,
   and random APIs used below */
#include <petscmat.h>
#include <petscviewer.h>

int main(int argc,char **args)
{
  PetscErrorCode ierr;
  PetscMPIInt    size,rank;
  Mat            A,F,X,spRHST;
  PetscInt       m,n,nrhs,M,N,i,test;
  PetscScalar    v;
  PetscReal      norm,tol=PETSC_SQRT_MACHINE_EPSILON;
  PetscRandom    rand;
  PetscBool      displ=PETSC_FALSE;

  ierr = PetscInitialize(&argc,&args,NULL,NULL);if (ierr) return ierr;
  ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
  ierr = PetscOptionsGetBool(NULL,NULL,"-displ",&displ,NULL);CHKERRQ(ierr);

  /* Load matrix A from the binary file A1.petsc */
  PetscViewer viewerA;
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A1.petsc",FILE_MODE_READ,&viewerA);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatLoad(A,viewerA);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewerA);CHKERRQ(ierr);
  ierr = MatGetLocalSize(A,&m,&n);CHKERRQ(ierr);
  ierr = MatGetSize(A,&M,&N);CHKERRQ(ierr);
  if (m != n) SETERRQ(PETSC_COMM_SELF,PETSC_ERR_ARG_SIZ,"The matrix is not square (%d, %d)",m,n);

  /* Create dense matrix X (one column per right-hand side) */
  nrhs = N;
  ierr = PetscOptionsGetInt(NULL,NULL,"-nrhs",&nrhs,NULL);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD,&X);CHKERRQ(ierr);
  ierr = MatSetSizes(X,m,PETSC_DECIDE,PETSC_DECIDE,nrhs);CHKERRQ(ierr);
  ierr = MatSetType(X,MATDENSE);CHKERRQ(ierr);
  ierr = MatSetFromOptions(X);CHKERRQ(ierr);
  ierr = MatSetUp(X);CHKERRQ(ierr);
  ierr = PetscRandomCreate(PETSC_COMM_WORLD,&rand);CHKERRQ(ierr);
  ierr = PetscRandomSetFromOptions(rand);CHKERRQ(ierr);
  ierr = MatSetRandom(X,rand);CHKERRQ(ierr);

  // factorise 'A' using LU factorization
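
(The listing breaks off here.) For reference, a hedged sketch of how ex214.c
completes the computation from this point; spRHS is an additional Mat not
declared above, and the factorization must succeed before the inverse is
requested:

  Mat spRHS;
  ierr = MatGetFactor(A,MATSOLVERMUMPS,MAT_FACTOR_LU,&F);CHKERRQ(ierr);
  ierr = MatLUFactorSymbolic(F,A,NULL,NULL,NULL);CHKERRQ(ierr);
  ierr = MatLUFactorNumeric(F,A,NULL);CHKERRQ(ierr);

  /* Build the sparse RHS as an nrhs x M identity stored transposed;
     MUMPS returns the corresponding columns of inv(A). */
  ierr = MatCreate(PETSC_COMM_WORLD,&spRHST);CHKERRQ(ierr);
  ierr = MatSetSizes(spRHST,PETSC_DECIDE,PETSC_DECIDE,nrhs,M);CHKERRQ(ierr);
  ierr = MatSetType(spRHST,MATAIJ);CHKERRQ(ierr);
  ierr = MatSetUp(spRHST);CHKERRQ(ierr);
  if (!rank) {                      /* identity entries set on rank 0 */
    v = 1.0;
    for (i=0; i<nrhs; i++) {
      ierr = MatSetValues(spRHST,1,&i,1,&i,&v,INSERT_VALUES);CHKERRQ(ierr);
    }
  }
  ierr = MatAssemblyBegin(spRHST,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(spRHST,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatCreateTranspose(spRHST,&spRHS);CHKERRQ(ierr);

  /* A solve-phase INFOG(1)=-3 (invalid JOB) suggests MUMPS never completed
     a factorization before this call. */
  ierr = MatMumpsGetInverse(F,spRHS);CHKERRQ(ierr);
  ierr = MatDestroy(&spRHS);CHKERRQ(ierr);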

Re: [petsc-users] [petsc-maint] Monolithic AMG with fieldsplit as smoother

2023-07-27 Thread Mark Adams
I would not worry about the null space (if you have elasticity or the
equivalent, use hypre for now), and the strength of connections is not very
useful in my experience (it is confounded by high order, and no one has
bothered to deploy a fancy strength-of-connections method in a library that
I know of). If you have anisotropies or material discontinuities, honestly,
AMG does not do as well as advertised. That said, we could talk after you
get up and running.

If your problems are very hard, then as Matt said, old-fashioned geometric
MG using modern unstructured (FE) discretizations and mesh management is
something to consider. PETSc has support for this, and we are actively using
and developing support for this. Antony Jameson has been doing this for
decades; here is an example of a new project doing something like this:
https://arxiv.org/abs/2307.04528. Tobin Isaac, in PETSc, and many others
have done things like this, but they tend to be customized for an
application, whereas AMG strives to be general.
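
(For a flavor of the options-only geometric MG path on a structured grid, a
hedged example: run a DMDA-based code with -da_refine 4 -pc_type mg
-mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi, and PETSc builds
the grid hierarchy from the DM.)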

Mark

On Thu, Jul 27, 2023 at 1:10 AM Matthew Knepley  wrote:

> On Thu, Jul 27, 2023 at 12:48 AM Jed Brown  wrote:
>
>> AMG is subtle here. With AMG for systems, you typically feed it elements
>> of the near null space. In the case of (smoothed) aggregation, the coarse
>> space will have a regular block structure with block sizes equal to the
>> number of near-null vectors. You can use pc_fieldsplit options to select
>> which fields you want in each split.
>>
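>> For instance, a minimal hedged sketch of supplying rigid-body modes (3D
>> elasticity) as the near-null space, where coords is a Vec of nodal
>> coordinates:
>>
>>   MatNullSpace nns;
>>   ierr = MatNullSpaceCreateRigidBody(coords,&nns);CHKERRQ(ierr);
>>   ierr = MatSetNearNullSpace(A,nns);CHKERRQ(ierr);
>>   ierr = MatNullSpaceDestroy(&nns);CHKERRQ(ierr);
>>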
>> However, AMG also needs a strength of connection, and if your system is so
>> weird that you need to fieldsplit the smoothers (e.g., a saddle point
>> problem or a hyperbolic system), then it's likely that you'll also need a
>> custom strength of connection to obtain reasonable coarsening.
>>
>
> For this reason, sometimes GMG is easier for systems since you just
> rediscretize.
>
>   Thanks,
>
>  Matt
>
>
>> Barry Smith  writes:
>>
>> >   See the very end of the section
>> https://petsc.org/release/manual/ksp/#multigrid-preconditioners on how
>> to control the smoothers (and coarse grid solve) for multigrid in PETSc
>> including for algebraic multigrid.
>> >
>> >So, for example, -mg_levels_pc_type fieldsplit would be the starting
>> >point. Depending on the block size of the matrices, it may automatically do
>> >simple splits; you can control the details of the fieldsplit
>> >preconditioner with -mg_levels_pc_fieldsplit_... and the details for each
>> >split with -mg_levels_fieldsplit_...
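>> >
>> >For instance (a hedged, untested combination; the split names depend on
>> >how the fields are defined):
>> >
>> >  -pc_type gamg -mg_levels_ksp_type richardson \
>> >  -mg_levels_pc_type fieldsplit -mg_levels_pc_fieldsplit_type additive \
>> >  -mg_levels_fieldsplit_0_pc_type sor -mg_levels_fieldsplit_1_pc_type jacobi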
>> >
>> >See src/ksp/ksp/tutorials/ex42.c for example usage.
>> >
>> >Feel free to ask more specific questions once you get started.
>> >
>> >> On Jul 26, 2023, at 9:47 PM, Michael Wick 
>> wrote:
>> >>
>> >> Hello PETSc team:
>> >>
>> >> I wonder if the current PETSc implementation supports using AMG
>> monolithically for a multi-field problem and using fieldsplit in the
>> smoother.
>> >>
>> >> Thank you very much,
>> >>
>> >> Mike
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>