Sorry, I should say the relevant elements are the same as the locally 
owned ones.  I am pretty sure I don't need to do any communication when 
building my matrices.  Yes, there will always be ghost elements with 
multiple MPI processes. 

I think I need to look more at the IndexSet class and make sure I am using 
it properly.  I initially create the IndexSet with the size of the entire 
global set and then call add_range(local_matrix_start, local_matrix_end).  
Then I use this IndexSet to distribute the sparsity pattern. 

Cheers,
Zachary

On Monday, December 21, 2020 at 11:05:39 AM UTC-6 Wolfgang Bangerth wrote:

> On 12/21/20 9:57 AM, Zachary Streeter wrote:
> > 
> > Thank you for your suggestion, this does seem helpful for me to tailor 
> > things for my needs.  One clarification: is locally_relevant = 
> > locally_owned if there are no ghost elements?
>
> Are there not always ghost elements if you have more than one MPI process?
>
> Best
> W.
>
> -- 
> ------------------------------------------------------------------------
> Wolfgang Bangerth email: [email protected]
> www: http://www.math.colostate.edu/~bangerth/
>
>

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en