Based on the output below it appears that the code block you sent ran 
completely successfully, since it printed all three matrices you tried to 
print. So it is something else in your code that is generating the zero pivot. 

  BTW:  it should not be 

MatView(invM2T, PETSC_VIEWER_STDOUT_WORLD);   

it should be 

MatView(seqM2T, PETSC_VIEWER_STDOUT_WORLD); 

to display the inverse, since MatMatSolve() writes the result into seqM2T 
while invM2T contains the LU factors after MatLUFactor().

And how is it possible that the error occurred on MPI rank 1 when the 
matrices are sequential and viewed on PETSC_VIEWER_STDOUT_WORLD? 

A zero pivot means that either the matrix is singular or the ordering used to 
factor the matrix is not a good one. Perhaps the matrix was generated 
incorrectly.
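
You can test whether the ordering is the issue by asking MatGetOrdering() for 
something other than MATORDERINGNATURAL, for example MATORDERINGRCM. If the 
matrix is fine but a pivot is merely tiny, you can also ask the factorization 
to shift it. A minimal sketch using the shifttype/shiftamount fields of 
MatFactorInfo, with your matrix and index sets (the shift amount here is 
purely illustrative, not a recommendation):

  MatFactorInfo info;
  ierr = MatFactorInfoInitialize(&info); CHKERRQ(ierr);
  info.shifttype   = (PetscReal)MAT_SHIFT_NONZERO;  /* replace tiny pivots rather than erroring */
  info.shiftamount = 1.e-12;                        /* illustrative value only */
  ierr = MatLUFactor(invM2T, isr, isc, &info); CHKERRQ(ierr);

Keep in mind that if the matrix really is singular a shift only hides the 
problem; the factorization will "succeed" but the results will be meaningless.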

The code then crashed because your code most likely ignored an error code 
returned from PETSc and tried to keep running. For example, if the 
factorization generated an error code but you called MatMatSolve() on the 
result anyway, you cannot do that. 
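
The fix is to propagate every PETSc error code up the call stack, all the way 
to main(). A minimal sketch of what cal_M2() from your stack trace might look 
like with full error checking (the signature is hypothetical, inferred from 
the trace; adapt it to your actual arguments):

  /* hypothetical signature, inferred from the stack trace */
  PetscErrorCode cal_M2(Mat invM2T, Mat I, Mat seqM2T)
  {
    PetscErrorCode ierr;
    IS             isr, isc;
    MatFactorInfo  info;

    PetscFunctionBegin;
    ierr = MatGetOrdering(invM2T, MATORDERINGNATURAL, &isr, &isc); CHKERRQ(ierr);
    ierr = MatFactorInfoInitialize(&info); CHKERRQ(ierr);
    /* on a zero pivot CHKERRQ() returns immediately here, so the
       bad factor is never handed to MatMatSolve() below */
    ierr = MatLUFactor(invM2T, isr, isc, &info); CHKERRQ(ierr);
    ierr = MatMatSolve(invM2T, I, seqM2T); CHKERRQ(ierr);
    ierr = ISDestroy(&isr); CHKERRQ(ierr);
    ierr = ISDestroy(&isc); CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

and the caller must check the returned code as well:

  ierr = cal_M2(invM2T, I, seqM2T); CHKERRQ(ierr);

If any caller drops the return value, execution continues with garbage data, 
which is how you end up with the secondary SEGV in the log below.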

On Sep 27, 2013, at 7:21 AM, Joon Hee Choi <[email protected]> wrote:

> Hi.
> I am trying to get the inverse of a seqaij matrix. I followed the FAQ in the 
> PETSc manual, but I got an error.
> Could someone tell me what I am missing? I am attaching my code, results, and 
> error message.
> Thank you.
> 
> Joon
> 
> 
> 
>   PetscPrintf(PETSC_COMM_WORLD, "I \n");
>   MatView(I, PETSC_VIEWER_STDOUT_WORLD);
>   PetscPrintf(PETSC_COMM_WORLD, "invM2T \n");
>   MatView(invM2T, PETSC_VIEWER_STDOUT_WORLD);
> 
>   ierr = MatCreateSeqDense(PETSC_COMM_SELF, R, R, NULL, &seqM2T); CHKERRQ(ierr);
>   ierr = MatGetOrdering(invM2T, MATORDERINGNATURAL, &isr, &isc); CHKERRQ(ierr);
>   ierr = MatFactorInfoInitialize(&info); CHKERRQ(ierr);
>   ierr = MatLUFactor(invM2T, isr, isc, &info); CHKERRQ(ierr);
> 
>   ierr = MatMatSolve(invM2T, I, seqM2T); CHKERRQ(ierr);
>   PetscPrintf(PETSC_COMM_WORLD, "invM2T \n");
>   MatView(invM2T, PETSC_VIEWER_STDOUT_WORLD);
> 
> 
> Matrix Object: 1 MPI processes
>  type: seqdense
> 1.0000000000000000e+00 0.0000000000000000e+00 
> 0.0000000000000000e+00 1.0000000000000000e+00 
> invM2T 
> Matrix Object: 1 MPI processes
>  type: seqaij
> row 0: (0, 3.22689)  (1, 2.47603) 
> row 1: (0, 2.47603)  (1, 2.25324) 
> invM2T 
> Matrix Object: 1 MPI processes
>  type: seqaij
> row 0: (0, 3.22689)  (1, 2.47603) 
> row 1: (0, 0.76731)  (1, 0.353361) 
> 
> 
> [1]PETSC ERROR: --------------------- Error Message 
> ------------------------------------
> [1]PETSC ERROR: Detected zero pivot in LU factorization:
> see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
> [1]PETSC ERROR: Zero pivot row 0 value 0 tolerance 0!
> [1]PETSC ERROR: 
> ------------------------------------------------------------------------
> [1]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013
> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [1]PETSC ERROR: See docs/index.html for manual pages.
> [1]PETSC ERROR: 
> ------------------------------------------------------------------------
> [1]PETSC ERROR: ./tenfac on a linux named rossmann-b000.rcac.purdue.edu by 
> choi240 Fri Sep 27 08:01:31 2013
> [1]PETSC ERROR: Libraries linked from 
> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/linux/lib
> [1]PETSC ERROR: Configure run at Sun Jun 23 10:20:51 2013
> [1]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc 
> --with-fc=mpiifort --download-sowing --with-scalar-type=real 
> --with-shared-libraries=1 --with-pic=1 --with-clanguage=C++ --with-fortran 
> --with-fortran-kernels=1 --with-64-bit-indices=1 --with-debugging=0 
> --with-blas-lapack-dir=/opt/intel/composer_xe_2013.1.117/mkl/lib/intel64 
> --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --download-hdf5=no 
> --download-metis=no --download-parmetis=no --download-superlu_dist=no 
> --download-mumps=no --download-scalapack=yes --download-blacs=no 
> --download-hypre=no --download-spooles=no
> [1]PETSC ERROR: 
> ------------------------------------------------------------------------
> [1]PETSC ERROR: MatPivotCheck_none() line 589 in 
> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/include/petsc-private/matimpl.h
> [1]PETSC ERROR: MatPivotCheck() line 608 in 
> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/include/petsc-private/matimpl.h
> [1]PETSC ERROR: MatLUFactorNumeric_SeqAIJ_Inode() line 1422 in 
> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/src/mat/impls/aij/seq/inode.c
> [1]PETSC ERROR: MatLUFactorNumeric() line 2889 in 
> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/src/mat/interface/matrix.c
> [1]PETSC ERROR: MatLUFactor_SeqAIJ() line 971 in 
> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/src/mat/impls/aij/seq/aijfact.c
> [1]PETSC ERROR: MatLUFactor() line 2716 in 
> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/src/mat/interface/matrix.c
> [1]PETSC ERROR: cal_M2() line 83 in tenfac.cpp
> [1]PETSC ERROR: 
> ------------------------------------------------------------------------
> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
> probably memory access out of range
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [1]PETSC ERROR: or see 
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [1]PETSC ERROR: 
> or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory 
> corruption errors
> [1]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
> [1]PETSC ERROR: to get more information on the crash.
> [1]PETSC ERROR: --------------------- Error Message 
> ------------------------------------
> [1]PETSC ERROR: Signal received!
> [1]PETSC ERROR: 
> ------------------------------------------------------------------------
> [1]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013
> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [1]PETSC ERROR: See docs/index.html for manual pages.
> [1]PETSC ERROR: 
> ------------------------------------------------------------------------
> [1]PETSC ERROR: ./tenfac on a linux named rossmann-b000.rcac.purdue.edu by 
> choi240 Fri Sep 27 08:01:31 2013
> [1]PETSC ERROR: Libraries linked from 
> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/linux/lib
> [1]PETSC ERROR: Configure run at Sun Jun 23 10:20:51 2013
> [1]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc 
> --with-fc=mpiifort --download-sowing --with-scalar-type=real 
> --with-shared-libraries=1 --with-pic=1 --with-clanguage=C++ --with-fortran 
> --with-fortran-kernels=1 --with-64-bit-indices=1 --with-debugging=0 
> --with-blas-lapack-dir=/opt/intel/composer_xe_2013.1.117/mkl/lib/intel64 
> --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --download-hdf5=no 
> --download-metis=no --download-parmetis=no --download-superlu_dist=no 
> --download-mumps=no --download-scalapack=yes --download-blacs=no 
> --download-hypre=no --download-spooles=no
> [1]PETSC ERROR: 
> ------------------------------------------------------------------------
> [1]PETSC ERROR: User provided function() line 0 in unknown directory unknown 
> file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1
