Dear Barry,

Thank you for your fast reply. I generated a SEQAIJ matrix on MPI rank 1 from an 
MPIAIJ matrix, because I needed the inverse of that MPI matrix. The formula itself 
is very simple, but my code is quite complex. What I want is the inverse of 
(temp1 * temp2), where temp1 and temp2 are MPIAIJ matrices and * is the Hadamard 
(entrywise) product, and I want the result as an MPIAIJ matrix as well. Since 
seqM2 is not singular, I think the ordering used to factor the matrix must be the 
problem. From my code, could you let me know why the ordering is wrong? I am sorry 
to attach such long code. Thank you again.
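
To be precise, in formula form what I am computing is roughly

  M2T = (temp1 o temp2)^(-1)

where o denotes the entrywise (Hadamard) product, and temp1, temp2, and the result 
M2T are all R x R.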

Joon


  MPI_Comm_size(PETSC_COMM_WORLD, &size);
  MPI_Comm_rank(MPI_COMM_SELF, &myrank);

  ierr = MatCreate(PETSC_COMM_WORLD, &invM2); CHKERRQ(ierr);
  ierr = MatSetType(invM2, MATDENSE); CHKERRQ(ierr);
  ierr = MatSetSizes(invM2, PETSC_DECIDE, PETSC_DECIDE, R, R); CHKERRQ(ierr);
  ierr = MatSeqDenseSetPreallocation(invM2, NULL); CHKERRQ(ierr);
  ierr = MatMPIDenseSetPreallocation(invM2, NULL); CHKERRQ(ierr);

  ierr = MatGetOwnershipRange(temp1, &rstart, &rend); CHKERRQ(ierr);
  ierr = PetscMalloc((rend-rstart)*R*sizeof(PetscScalar), &cell); CHKERRQ(ierr);
  for (i=rstart; i<rend; i++) {
    for (j=0; j<R; j++) {
      ierr = MatGetValues(temp1, 1, &i, 1, &j, &cell1); CHKERRQ(ierr);
      ierr = MatGetValues(temp2, 1, &i, 1, &j, &cell2); CHKERRQ(ierr);
      cell[(i-rstart)*R+j] = cell1 * cell2;
    }
  }

  rbuf = (PetscScalar *)malloc(R*R*sizeof(PetscScalar));
  PetscInt *idx;
  idx = (PetscInt *)malloc(R*sizeof(PetscInt));
  for (i=0; i<R; i++) idx[i]=i;
  MPI_Gather(cell, (rend-rstart)*R, MPIU_SCALAR, rbuf, (rend-rstart)*R, MPIU_SCALAR, 0, MPI_COMM_WORLD);
  if (myrank == 0) {
    ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, R, R, R, NULL, &invM2); CHKERRQ(ierr);
    ierr = MatSetValues(invM2, R, idx, R, idx, rbuf, INSERT_VALUES); CHKERRQ(ierr);
    ierr = MatAssemblyBegin(invM2, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatAssemblyEnd(invM2, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatTranspose(invM2, MAT_INITIAL_MATRIX, &invM2T); CHKERRQ(ierr);

    ierr = MatCreateSeqDense(PETSC_COMM_SELF, R, R, NULL, &I); CHKERRQ(ierr);
    ierr = MatZeroEntries(I); CHKERRQ(ierr);
    ierr = MatShift(I, 1.0); CHKERRQ(ierr);

    ierr = MatCreateSeqDense(PETSC_COMM_SELF, R, R, NULL, &seqM2T); CHKERRQ(ierr);
    ierr = MatGetOrdering(invM2T, MATORDERINGNATURAL, &isr, &isc); CHKERRQ(ierr);
    ierr = MatFactorInfoInitialize(&info); CHKERRQ(ierr);

    ierr = MatLUFactor(invM2T, isr, isc, &info); CHKERRQ(ierr);
    PetscPrintf(PETSC_COMM_WORLD, "invM2T \n");
    MatView(invM2T, PETSC_VIEWER_STDOUT_WORLD);

    ierr = MatMatSolve(invM2T, I, seqM2T); CHKERRQ(ierr);
    PetscPrintf(PETSC_COMM_WORLD, "seqM2T \n");
    MatView(seqM2T, PETSC_VIEWER_STDOUT_WORLD);
    ierr = MatDenseGetArray(seqM2T, &rbuf); CHKERRQ(ierr);
  }

  MPI_Scatter(rbuf, (rend-rstart)*R, MPIU_SCALAR, cell, (rend-rstart)*R, MPIU_SCALAR, 0, MPI_COMM_WORLD);

  M = R;
  clocm = PETSC_DECIDE;
  ierr = PetscSplitOwnership(PETSC_COMM_WORLD, &clocm, &M); CHKERRQ(ierr);
  ierr = MPI_Scan(&clocm, &cend, size, MPI_INT, MPI_SUM, PETSC_COMM_WORLD); CHKERRQ(ierr);
  cstart = cend - clocm;

  ierr = MatCreate(PETSC_COMM_WORLD, M2T); CHKERRQ(ierr);
  ierr = MatSetType(*M2T, MATAIJ); CHKERRQ(ierr);
  ierr = MatSetSizes(*M2T, PETSC_DECIDE, PETSC_DECIDE, R, R); CHKERRQ(ierr);
  ierr = MatSeqAIJSetPreallocation(*M2T, R, NULL); CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(*M2T, clocm, NULL, R-clocm, NULL); CHKERRQ(ierr);

  for (i=rstart; i<rend; i++) {
    for (j=0; j<R; j++) {
      ierr = MatSetValue(*M2T, i, j, cell[(i-rstart)*R+j], INSERT_VALUES); CHKERRQ(ierr);
    }
  }
  ierr = MatAssemblyBegin(*M2T, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(*M2T, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
}
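
For reference, here is just the FAQ-style inversion step on its own, separated from 
my gather/scatter logic. This is only a sketch, with a hypothetical R x R SeqAIJ 
matrix A standing in for invM2T, and Id/Ainv standing in for I/seqM2T:

  Mat           A, Id, Ainv;   /* A: the R x R SeqAIJ matrix to invert (hypothetical name) */
  IS            isr, isc;
  MatFactorInfo info;

  ierr = MatCreateSeqDense(PETSC_COMM_SELF, R, R, NULL, &Id);   CHKERRQ(ierr);
  ierr = MatCreateSeqDense(PETSC_COMM_SELF, R, R, NULL, &Ainv); CHKERRQ(ierr);
  ierr = MatZeroEntries(Id); CHKERRQ(ierr);
  ierr = MatShift(Id, 1.0);  CHKERRQ(ierr);                     /* Id = identity */

  ierr = MatGetOrdering(A, MATORDERINGNATURAL, &isr, &isc); CHKERRQ(ierr);
  ierr = MatFactorInfoInitialize(&info); CHKERRQ(ierr);
  ierr = MatLUFactor(A, isr, isc, &info); CHKERRQ(ierr);        /* A is overwritten by its LU factors */
  ierr = MatMatSolve(A, Id, Ainv); CHKERRQ(ierr);               /* Ainv now holds A^{-1} */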

----- Original Message -----
From: Barry Smith <[email protected]>
To: Joon Hee Choi <[email protected]>
Cc: [email protected]
Sent: Fri, 27 Sep 2013 09:05:58 -0400 (EDT)
Subject: Re: [petsc-users] The inverse of seqaij matrix


>  Based on the output below it appears that the code block you sent completed 
> successfully, since it printed all three matrices you tried to print. So it is 
> something else in your code that is generating the zero pivot. 

>  BTW:  it should not be 

>MatView(invM2T, PETSC_VIEWER_STDOUT_WORLD);   

>it should be 

>MatView(seqM2T, PETSC_VIEWER_STDOUT_WORLD); 

>to display the inverse

>And how is it possible that the error occurred on MPI rank 1 but the matrices 
>are sequential matrices viewed on PETSC_VIEWER_STDOUT_WORLD? 

>A zero pivot means that either the matrix is singular or the ordering used to 
>factor the matrix is not a good ordering.  Perhaps the matrix was generated 
>incorrectly.

>The code likely crashed because your code ignored an error code returned from 
>PETSc and tried to keep running. For example, if the factorization generated an 
>error code but you called MatMatSolve on the result anyway, you cannot 
>do that. 







On Sep 27, 2013, at 7:21 AM, Joon Hee Choi <[email protected]> wrote:

>> Hi.
>> I am trying to get the inverse matrix of a seqaij matrix. I followed the FAQ in 
>> the PETSc manual, but I got an error.
>> Could someone tell me what I am missing? I am attaching my code, results, 
>> and error message.
>> Thank you.
>> 
>> Joon
>> 
>> 
>> 
>>   PetscPrintf(PETSC_COMM_WORLD, "I \n");
>>   MatView(I, PETSC_VIEWER_STDOUT_WORLD);
>>   PetscPrintf(PETSC_COMM_WORLD, "invM2T \n");
>>   MatView(invM2T, PETSC_VIEWER_STDOUT_WORLD);
>> 
>>   ierr = MatCreateSeqDense(PETSC_COMM_SELF, R, R, NULL, &seqM2T); CHKERRQ(ierr);
>>   ierr = MatGetOrdering(invM2T, MATORDERINGNATURAL, &isr, &isc); CHKERRQ(ierr);
>>   ierr = MatFactorInfoInitialize(&info); CHKERRQ(ierr);
>>   ierr = MatLUFactor(invM2T, isr, isc, &info); CHKERRQ(ierr);
>> 
>>   ierr = MatMatSolve(invM2T, I, seqM2T); CHKERRQ(ierr);
>>   PetscPrintf(PETSC_COMM_WORLD, "invM2T \n");
>>   MatView(invM2T, PETSC_VIEWER_STDOUT_WORLD);
>> 
>> 
>> Matrix Object: 1 MPI processes
>>  type: seqdense
>> 1.0000000000000000e+00 0.0000000000000000e+00 
>> 0.0000000000000000e+00 1.0000000000000000e+00 
>> invM2T 
>> Matrix Object: 1 MPI processes
>>  type: seqaij
>> row 0: (0, 3.22689)  (1, 2.47603) 
>> row 1: (0, 2.47603)  (1, 2.25324) 
>> invM2T 
>> Matrix Object: 1 MPI processes
>>  type: seqaij
>> row 0: (0, 3.22689)  (1, 2.47603) 
>> row 1: (0, 0.76731)  (1, 0.353361) 
>> 
>> 
>> [1]PETSC ERROR: --------------------- Error Message 
>> ------------------------------------
>> [1]PETSC ERROR: Detected zero pivot in LU factorization:
>> see http://www.mcs.anl.gov/petsc/documentation/faq.html#ZeroPivot!
>> [1]PETSC ERROR: Zero pivot row 0 value 0 tolerance 0!
>> [1]PETSC ERROR: 
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013
>> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>> [1]PETSC ERROR: See docs/index.html for manual pages.
>> [1]PETSC ERROR: 
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: ./tenfac on a linux named rossmann-b000.rcac.purdue.edu by 
>> choi240 Fri Sep 27 08:01:31 2013
>> [1]PETSC ERROR: Libraries linked from 
>> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/linux/lib
>> [1]PETSC ERROR: Configure run at Sun Jun 23 10:20:51 2013
>> [1]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc 
>> --with-fc=mpiifort --download-sowing --with-scalar-type=real 
>> --with-shared-libraries=1 --with-pic=1 --with-clanguage=C++ --with-fortran 
>> --with-fortran-kernels=1 --with-64-bit-indices=1 --with-debugging=0 
>> --with-blas-lapack-dir=/opt/intel/composer_xe_2013.1.117/mkl/lib/intel64 
>> --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --download-hdf5=no 
>> --download-metis=no --download-parmetis=no --download-superlu_dist=no 
>> --download-mumps=no --download-scalapack=yes --download-blacs=no 
>> --download-hypre=no --download-spooles=no
>> [1]PETSC ERROR: 
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: MatPivotCheck_none() line 589 in 
>> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/include/petsc-private/matimpl.h
>> [1]PETSC ERROR: MatPivotCheck() line 608 in 
>> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/include/petsc-private/matimpl.h
>> [1]PETSC ERROR: MatLUFactorNumeric_SeqAIJ_Inode() line 1422 in 
>> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/src/mat/impls/aij/seq/inode.c
>> [1]PETSC ERROR: MatLUFactorNumeric() line 2889 in 
>> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/src/mat/interface/matrix.c
>> [1]PETSC ERROR: MatLUFactor_SeqAIJ() line 971 in 
>> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/src/mat/impls/aij/seq/aijfact.c
>> [1]PETSC ERROR: MatLUFactor() line 2716 in 
>> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/src/mat/interface/matrix.c
>> [1]PETSC ERROR: cal_M2() line 83 in tenfac.cpp
>> [1]PETSC ERROR: 
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
>> probably memory access out of range
>> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>> [1]PETSC ERROR: or see 
>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[1]PETSC ERROR: 
>> or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory 
>> corruption errors
>> [1]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and 
>> run
>> [1]PETSC ERROR: to get more information on the crash.
>> [1]PETSC ERROR: --------------------- Error Message 
>> ------------------------------------
>> [1]PETSC ERROR: Signal received!
>> [1]PETSC ERROR: 
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013
>> [1]PETSC ERROR: See docs/changes/index.html for recent updates.
>> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>> [1]PETSC ERROR: See docs/index.html for manual pages.
>> [1]PETSC ERROR: 
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: ./tenfac on a linux named rossmann-b000.rcac.purdue.edu by 
>> choi240 Fri Sep 27 08:01:31 2013
>> [1]PETSC ERROR: Libraries linked from 
>> /apps/rhel5/petsc-3.4.1/64/impi-4.1.0.030_intel-13.0.1.117_ind64/linux/lib
>> [1]PETSC ERROR: Configure run at Sun Jun 23 10:20:51 2013
>> [1]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc 
>> --with-fc=mpiifort --download-sowing --with-scalar-type=real 
>> --with-shared-libraries=1 --with-pic=1 --with-clanguage=C++ --with-fortran 
>> --with-fortran-kernels=1 --with-64-bit-indices=1 --with-debugging=0 
>> --with-blas-lapack-dir=/opt/intel/composer_xe_2013.1.117/mkl/lib/intel64 
>> --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --download-hdf5=no 
>> --download-metis=no --download-parmetis=no --download-superlu_dist=no 
>> --download-mumps=no --download-scalapack=yes --download-blacs=no 
>> --download-hypre=no --download-spooles=no
>> [1]PETSC ERROR: 
>> ------------------------------------------------------------------------
>> [1]PETSC ERROR: User provided function() line 0 in unknown directory unknown 
>> file
>> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1

