[deal.II] Separated domains in load balancing

2020-09-14 Thread shahab.g...@gmail.com
Dear all,
I am using load balancing, and I noticed that after load balancing the 
cells owned by a processor are sometimes separated from each other. In 
other words, a processor may own several cell domains that are not 
connected to each other.
As this increases the computational cost in my case, I was wondering 
whether it is possible to restrict the load balancing so that each 
processor only owns a connected set of adjacent cells?
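For context, the kind of weight-based repartitioning I mean looks roughly 
like the sketch below (simplified, not my exact code; it assumes a 
parallel::distributed::Triangulation and a deal.II version around 9.2, 
where the signal is called cell_weight):

  // Attach a hypothetical per-cell weight; p4est then splits the
  // space-filling curve so that the summed weights per process are
  // roughly equal.
  triangulation.signals.cell_weight.connect(
    [](const auto &cell, const auto /*status*/) -> unsigned int {
      // Placeholder weight: more refined cells are assumed to cost more.
      return 1000u * (cell->level() + 1);
    });

  // Redistribute the cells among the processes according to the weights.
  triangulation.repartition();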
Thank you for your help in advance.
Best regards,
Shahab



[deal.II] Re: Initialization of PETSc::MPI::BlockSparseMatrix

2020-09-14 Thread Timo Heister
Hi,

I don't quite understand your reasoning for a block with 0 DoFs. Try 
removing any rows/columns of size 0 from your block sparsity pattern and 
from your IndexSets. Note that the number of blocks is independent of the 
number of components in your FEM.
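For reference, the usual pattern (see, e.g., the step-55 tutorial) is to 
build one IndexSet per block, each sized to that block, rather than reusing 
the full-system index set for every block. A rough sketch, assuming two 
blocks (displacement, potential), a recent deal.II, and placeholder names:

  // Renumber so that each block occupies a contiguous index range:
  // the dim displacement components go into block 0, the potential into block 1.
  std::vector<unsigned int> block_component(dim + 1, 0);
  block_component[dim] = 1;
  DoFRenumbering::component_wise(dof_handler, block_component);

  const std::vector<types::global_dof_index> dofs_per_block =
    DoFTools::count_dofs_per_fe_block(dof_handler, block_component);
  const types::global_dof_index n_u   = dofs_per_block[0]; // displacement DoFs
  const types::global_dof_index n_phi = dofs_per_block[1]; // potential DoFs

  // One IndexSet per block; rows[r].size() then matches bdsp.block(r,c).n_rows().
  std::vector<IndexSet> owned_partitioning(2);
  owned_partitioning[0] = dof_handler.locally_owned_dofs().get_view(0, n_u);
  owned_partitioning[1] =
    dof_handler.locally_owned_dofs().get_view(n_u, n_u + n_phi);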

Can you post the values for rows[r].size() and bdsp.block(r,c).n_rows() for 
all rows r? 
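Something along these lines would do (a minimal sketch; rows and bdsp are 
whatever you currently pass to reinit()):

  for (unsigned int r = 0; r < rows.size(); ++r)
    std::cout << "block row " << r
              << ": rows[r].size() = " << rows[r].size()
              << ", bdsp.block(r,0).n_rows() = " << bdsp.block(r, 0).n_rows()
              << std::endl;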

Best,
Timo


On Monday, September 14, 2020 at 6:42:03 AM UTC-4 stankie...@gmail.com 
wrote:

>
> Dear deal.II Users,
>
> I am trying to set up an eigenvalue problem for a piezoelectric bimorph 
> cantilever, for which I need a distributed block sparse matrix. My 
> problem contains displacement and electric potential degrees of freedom. 
> Because of that, the entries of the mass matrix that correspond to the 
> potential DoFs are zero. My approach is to condense the stiffness and mass 
> matrices so that I can get rid of the potential DoFs in the system. For 
> that, I decided to use PETSc::MPI::BlockSparseMatrix.
>
> When I try to initialize the PETSc::MPI::BlockSparseMatrix, I pass an 
> instance of std::vector<IndexSet> that contains the locally owned DoFs for 
> the displacement block and the electric potential block. However, the 
> reinit() function:
>
> void BlockSparseMatrix<number>::reinit(
>   const std::vector<IndexSet> &rows,
>   const std::vector<IndexSet> &cols,
>   const BlockDynamicSparsityPattern &bdsp,
>   const MPI_Comm &com)
>
> fails the assertion that compares the individual block sizes with the 
> sizes of the IndexSets. Since the size of an IndexSet needs to equal the 
> total number of DoFs in the system (otherwise I would not be able to store 
> some of the DoF indices), while the block sizes are naturally smaller than 
> the total number of DoFs, this assertion can never hold.
>
> The violated condition was: 
> rows[r].size() == bdsp.block(r, c).n_rows()
> Additional information: 
> invalid size
>
> Could anyone tell me if my understanding is correct or if there is a 
> solution to my problem?
>
> Best regards,
> Gabriel Stankiewicz
>



[deal.II] Initialization of PETSc::MPI::BlockSparseMatrix

2020-09-14 Thread Gabriel Stankiewicz

Dear deal.II Users,

I am trying to set up an eigenvalue problem for a piezoelectric bimorph 
cantilever, for which I need a distributed block sparse matrix. My 
problem contains displacement and electric potential degrees of freedom. 
Because of that, the entries of the mass matrix that correspond to the 
potential DoFs are zero. My approach is to condense the stiffness and mass 
matrices so that I can get rid of the potential DoFs in the system. For 
that, I decided to use PETSc::MPI::BlockSparseMatrix.

When I try to initialize the PETSc::MPI::BlockSparseMatrix, I pass an 
instance of std::vector<IndexSet> that contains the locally owned DoFs for 
the displacement block and the electric potential block. However, the 
reinit() function:

void BlockSparseMatrix<number>::reinit(
  const std::vector<IndexSet> &rows,
  const std::vector<IndexSet> &cols,
  const BlockDynamicSparsityPattern &bdsp,
  const MPI_Comm &com)

fails the assertion that compares the individual block sizes with the 
sizes of the IndexSets. Since the size of an IndexSet needs to equal the 
total number of DoFs in the system (otherwise I would not be able to store 
some of the DoF indices), while the block sizes are naturally smaller than 
the total number of DoFs, this assertion can never hold.

The violated condition was: 
rows[r].size() == bdsp.block(r, c).n_rows()
Additional information: 
invalid size
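
In essence, this is what I am doing (a simplified sketch; the sparsity 
pattern and the DoFHandler are set up elsewhere, and the names are 
placeholders):

  // Both blocks receive the full-system locally owned IndexSet, whose size()
  // is the total number of DoFs rather than the size of the individual block.
  std::vector<IndexSet> owned_dofs(2, dof_handler.locally_owned_dofs());

  PETScWrappers::MPI::BlockSparseMatrix system_matrix;
  system_matrix.reinit(owned_dofs, owned_dofs, bdsp, mpi_communicator); // fails here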

Could anyone tell me if my understanding is correct or if there is a 
solution to my problem?

Best regards,
Gabriel Stankiewicz
