Re: [deal.II] Re: Initialization of PETSc::MPI::BlockSparseMatrix

2020-09-30 Thread Gabriel Stankiewicz
Dear Prof. Bangerth, Regarding the locally_owned_dofs_per_component() function: thank you for the detailed explanation; it clarifies the problem. I now have a better understanding of how a sequential triangulation is handled when it is partitioned and used for problems run on multiple processors.

Re: [deal.II] Re: Initialization of PETSc::MPI::BlockSparseMatrix

2020-09-29 Thread Wolfgang Bangerth
Gabriel, 1. Regarding the initialization of PETSc::MPI::BlockSparseMatrix: I have used the IndexSet::split_by_block() function and it indeed works well. Thanks for the suggestion! Unfortunately, I have encountered another issue. The PETSc::MPI::BlockSparseMatrix must be partitioned

Re: [deal.II] Re: Initialization of PETSc::MPI::BlockSparseMatrix

2020-09-29 Thread Gabriel Stankiewicz
Thank you for your input! 1. Regarding the initialization of PETSc::MPI::BlockSparseMatrix: I have used the IndexSet::split_by_block() function and it indeed works well. Thanks for the suggestion! Unfortunately, I have encountered another issue. The PETSc::MPI::BlockSparseMatrix must be

Re: [deal.II] Re: Initialization of PETSc::MPI::BlockSparseMatrix

2020-09-29 Thread Gabriel Stankiewicz
Dear Timo, Dear Wolfgang, Thank you both for your input! 1. Regarding the initialization of PETSc::MPI::BlockSparseMatrix: I have used the IndexSet::split_by_block() function and it indeed works well. Thanks for the suggestion! Unfortunately, I have encountered another issue. The
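
A minimal sketch of the split_by_block() approach mentioned above, assuming deal.II 9.2 or later, a two-block (displacement/potential) system whose DoFs have already been renumbered block-wise, and hypothetical names (initialize_block_matrix, block_component, etc.) that are not the poster's actual code; in code the class is spelled PETScWrappers::MPI::BlockSparseMatrix, and the usual sparsity-pattern distribution between processes is omitted here:

    #include <deal.II/dofs/dof_handler.h>
    #include <deal.II/dofs/dof_tools.h>
    #include <deal.II/lac/affine_constraints.h>
    #include <deal.II/lac/block_sparsity_pattern.h>
    #include <deal.II/lac/petsc_block_sparse_matrix.h>

    using namespace dealii;

    // Hypothetical helper, not from the thread: set up the block matrix for a
    // DoFHandler whose DoFs were renumbered block-wise beforehand, e.g. with
    // DoFRenumbering::component_wise(dof_handler, block_component).
    template <int dim>
    void initialize_block_matrix(const DoFHandler<dim>                 &dof_handler,
                                 const AffineConstraints<double>       &constraints,
                                 const std::vector<unsigned int>       &block_component,
                                 const MPI_Comm                         mpi_communicator,
                                 PETScWrappers::MPI::BlockSparseMatrix &system_matrix)
    {
      // Number of DoFs in each block, e.g. {displacement, potential}.
      const std::vector<types::global_dof_index> dofs_per_block =
        DoFTools::count_dofs_per_fe_block(dof_handler, block_component);

      // Split the contiguous locally owned range into one IndexSet per block.
      const std::vector<IndexSet> owned_partitioning =
        dof_handler.locally_owned_dofs().split_by_block(dofs_per_block);

      // Distributed block sparsity pattern sized by that partitioning.
      // (A full program would also exchange sparsity entries between
      // processes; that step is left out of this sketch.)
      BlockDynamicSparsityPattern bdsp(owned_partitioning);
      DoFTools::make_sparsity_pattern(dof_handler, bdsp, constraints, false);

      // Give the PETSc block matrix the same row/column partitioning.
      system_matrix.reinit(owned_partitioning, bdsp, mpi_communicator);
    }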

Re: [deal.II] Re: Initialization of PETSc::MPI::BlockSparseMatrix

2020-09-17 Thread Wolfgang Bangerth
On 9/17/20 9:07 AM, Gabriel Stankiewicz wrote: I created IndexSets by using the function DoFTools::locally_owned_dofs_per_component() and then gathering all indices corresponding to displacement DoFs (three instances of IndexSet) into one IndexSet using IndexSet::add_indices() and the fourth

Re: [deal.II] Re: Initialization of PETSc::MPI::BlockSparseMatrix

2020-09-17 Thread Gabriel Stankiewicz
I created IndexSets by using the function DoFTools::locally_owned_dofs_per_component() and then gathering all indices corresponding to displacement DoFs (three instances of IndexSet) into one IndexSet using IndexSet::add_indices(); the fourth instance corresponded to the electric potential DoFs.
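
A short, purely illustrative sketch of the per-component construction described above, with a hypothetical helper name and assuming a four-component element (three displacement components plus the electric potential):

    #include <deal.II/base/index_set.h>
    #include <deal.II/dofs/dof_handler.h>
    #include <deal.II/dofs/dof_tools.h>

    using namespace dealii;

    // Hypothetical helper, not the poster's code: merge the locally owned DoFs
    // of the three displacement components into one IndexSet and keep the
    // fourth (electric potential) component separate.
    template <int dim>
    std::vector<IndexSet> two_block_partitioning(const DoFHandler<dim> &dof_handler)
    {
      // One IndexSet of locally owned DoFs per component: u_x, u_y, u_z, phi.
      const std::vector<IndexSet> owned_per_component =
        DoFTools::locally_owned_dofs_per_component(dof_handler);

      // Union of the three displacement components.
      IndexSet owned_displacement(dof_handler.n_dofs());
      for (unsigned int c = 0; c < 3; ++c)
        owned_displacement.add_indices(owned_per_component[c]);

      // Block 0: displacement, block 1: electric potential.
      return {owned_displacement, owned_per_component[3]};
    }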

Re: [deal.II] Re: Initialization of PETSc::MPI::BlockSparseMatrix

2020-09-16 Thread Timo Heister
> This results in a failed assertion, since: rows[0].size() = 2145, > rows[1].size() = 2145, but bdsp.block(0,0).n_rows() = 1815, > bdsp.block(1,1).n_rows() = 330 etc... The size() of each IndexSet needs to correspond to the size of each block of the sparsity pattern, and this is what the
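
In code terms, the requirement pointed out here amounts to roughly the following check (using the rows and bdsp names from the quoted message; the loop itself is only an illustration, not part of the thread):

    // Each IndexSet handed to reinit() must be as large as the corresponding
    // diagonal block of the block sparsity pattern.
    for (unsigned int b = 0; b < rows.size(); ++b)
      AssertDimension(rows[b].size(), bdsp.block(b, b).n_rows());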

[deal.II] Re: Initialization of PETSc::MPI::BlockSparseMatrix

2020-09-15 Thread Gabriel Stankiewicz
Hi Timo, thanks for replying. I think I didn't describe my problem correctly. The 4 blocks have sizes: 1815 x 1815, 1815 x 330, 330 x 1815, 330 x 330. Such a division results from having 1815 displacement DoFs and 330 electric potential DoFs in the eigenvalue problem. Only the block 1815 x

[deal.II] Re: Initialization of PETSc::MPI::BlockSparseMatrix

2020-09-14 Thread Timo Heister
Hi, I don't quite understand your reasoning for a block with 0 DoFs. Try removing any rows/columns from your block sparsity pattern and from your IndexSets that have size 0. Notice that the number of blocks is independent of the number of components in your FEM. Can you post the values for
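
To illustrate that last point, a hedged sketch (hypothetical names, assuming a four-component FESystem with three displacement components and one electric potential component, a dof_handler already in scope, and the DoFRenumbering/DoFTools headers included) that groups all displacement components into block 0 and the potential into block 1:

    // Map the four FE components onto two blocks: components 0..2
    // (displacement) -> block 0, component 3 (electric potential) -> block 1.
    std::vector<unsigned int> block_component(4, 0);
    block_component[3] = 1;

    // Renumber so each block occupies a contiguous range of DoF indices.
    DoFRenumbering::component_wise(dof_handler, block_component);

    // Count DoFs per block; with the counts mentioned above (1815 displacement
    // DoFs, 330 potential DoFs) this would give {1815, 330}: two non-empty
    // blocks even though the finite element has four components.
    const std::vector<types::global_dof_index> dofs_per_block =
      DoFTools::count_dofs_per_fe_block(dof_handler, block_component);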