Hi all

I want to use PETSc to solve some linear systems via the built-in Krylov
subspace methods as well as by means of UMFPACK.

The matrix under consideration is block sparse with blocks of size 6x6.

Here is what I came up with after taking a look at some of the examples:

MPI_Comm comm;
Mat A;
PetscInt n = 10000; /* dimension of matrix */
comm = PETSC_COMM_SELF;
MatCreate(comm,&A);
MatSetSizes(A,n,n,n,n);
MatSetBlockSize(A,6);
MatSetType(A,MATAIJ); /* UMFPACK-compatible format since comm = PETSC_COMM_SELF */

Questions:
1.
I work on a single node with 2-8 cores, hence comm = PETSC_COMM_SELF, I
guess. Is it correct in this context to call MatSetSizes(A,n,n,n,n); with
n given four times, i.e. local sizes equal to global sizes?
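For comparison, the alternative I would otherwise consider is giving only the global sizes and letting PETSc determine the local ones (just a sketch of what I mean; I am not sure the two are equivalent on a sequential communicator):

```c
/* Alternative: specify only the global size n x n and let PETSc pick
   the local sizes -- on PETSC_COMM_SELF both should presumably coincide. */
MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n);
```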

2.
After the above sequence of commands, do I have to use something like
  MatSeqAIJSetPreallocation(A,0,d_nnz); /* d_nnz <-> number of nonzeros per row */
or is it possible to use
  MatSeqBAIJSetPreallocation(A,6,0,db_nnz); /* db_nnz <-> number of block nonzeros per block row */

In any case, is something like
  MatSetValuesBlocked(A,1,idx_r,1,idx_c,myblockvals,INSERT_VALUES);
the right way to fill the values of one block into the matrix A?
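To be concrete, here is the full assembly sequence I have in mind (only a sketch; db_nnz, idx_r, idx_c, and myblockvals are placeholders that would be filled from my application, and error checking is omitted):

```c
/* Sketch: create a sequential blocked matrix, preallocate by block rows,
   insert one 6x6 block, and assemble. Placeholder data throughout. */
Mat         A;
PetscInt    n = 10000;                 /* global dimension, divisible by 6 */
PetscInt    idx_r[1] = {0};            /* block-row index of the block    */
PetscInt    idx_c[1] = {0};            /* block-column index of the block */
PetscScalar myblockvals[36];           /* one dense 6x6 block             */
PetscInt    *db_nnz;                   /* block nonzeros per block row    */

MatCreate(PETSC_COMM_SELF,&A);
MatSetSizes(A,n,n,n,n);
MatSetType(A,MATSEQBAIJ);
MatSetBlockSize(A,6);
MatSeqBAIJSetPreallocation(A,6,0,db_nnz);

MatSetValuesBlocked(A,1,idx_r,1,idx_c,myblockvals,INSERT_VALUES);

MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);
```

My concern is whether this blocked variant is still usable with UMFPACK, or whether I must stay with MATAIJ / MatSeqAIJSetPreallocation for that.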


Regards
Tim
