Hi Peter,

Thanks a lot, it seems to work. If I understand correctly, setting the 
group size equal to the number of MPI ranks means that only process 0 
reads the mesh data and distributes it. Is that right?

Is it possible to check the number of elements / memory consumption on 
each rank after the file is read but before partitioning, to confirm that 
only process 0 actually reads and stores the mesh?
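
For instance, I was thinking of printing something like the following 
inside the serial grid generator lambda, right after the mesh is read 
(just a sketch; tria_serial is the lambda's triangulation argument):

    std::cout << "rank "
              << Utilities::MPI::this_mpi_process(MPI_COMM_WORLD)
              << ": " << tria_serial.n_active_cells() << " active cells, "
              << tria_serial.memory_consumption() << " bytes"
              << std::endl;

My expectation is that with group size = MPI ranks only process 0 would 
print anything at all, since the other ranks never execute that lambda.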

Thanks,
Kumar Saurabh
On Tuesday, March 14, 2023 at 11:33:04 PM UTC-7 peterr...@gmail.com wrote:

> Hi Kumar,
>
> take a look at 
> https://www.dealii.org/developer/doxygen/deal.II/namespaceTriangulationDescription_1_1Utilities.html#aefc3e841bcfd37714a07d04e42c8ffca
>
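> A minimal sketch of how it could be used for your case (the mesh file 
> name, dimension, and the bodies of the two lambdas are placeholders, 
> not tested code):
>
>     #include <deal.II/base/mpi.h>
>     #include <deal.II/distributed/fully_distributed_tria.h>
>     #include <deal.II/grid/grid_in.h>
>     #include <deal.II/grid/grid_tools.h>
>     #include <deal.II/grid/tria_description.h>
>
>     #include <fstream>
>
>     using namespace dealii;
>
>     const MPI_Comm comm = MPI_COMM_WORLD;
>
>     const auto description = TriangulationDescription::Utilities::
>       create_description_from_triangulation_in_groups<3, 3>(
>         // Runs only on the root process of each group; with
>         // group size == number of ranks, that is process 0 only.
>         [](Triangulation<3> &tria_serial) {
>           GridIn<3> grid_in;
>           grid_in.attach_triangulation(tria_serial);
>           std::ifstream file("mesh.msh"); // placeholder file name
>           grid_in.read_msh(file);
>         },
>         // Partitions the serial mesh, i.e., assigns subdomain ids.
>         [](Triangulation<3> &tria_serial,
>            const MPI_Comm     mpi_comm,
>            const unsigned int /*group_size*/) {
>           GridTools::partition_triangulation(
>             Utilities::MPI::n_mpi_processes(mpi_comm), tria_serial);
>         },
>         comm,
>         Utilities::MPI::n_mpi_processes(comm)); // group size = all ranks
>
>     parallel::fullydistributed::Triangulation<3> tria(comm);
>     tria.create_triangulation(description);
>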
> Hope this helps,
> Peter
>
> On Wednesday, 15 March 2023 at 01:44:17 UTC+1 kumar.sau...@gmail.com 
> wrote:
>
>> Hi, 
>>
>> I am trying to partition a mesh generated with GMSH. The mesh is quite 
>> big, with around 500K elements. 
>>
>> I am new to deal.II, but I get the impression that 
>> parallel::distributed::Triangulation works only for 
>> quadrilateral/hexahedral meshes and uses the p4est backend.
>>
>> I tried GridTools::partition_triangulation, but it keeps a copy of all 
>> the elements on every processor. This is not ideal, as the number of 
>> elements is too large.
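>>
>> Roughly, the reading and partitioning part of my code looks like this 
>> (the mesh file name is a placeholder):
>>
>>     Triangulation<3> tria;
>>     GridIn<3>        grid_in;
>>     grid_in.attach_triangulation(tria);
>>     std::ifstream file("mesh.msh");
>>     grid_in.read_msh(file);
>>
>>     // This only assigns subdomain ids to the cells; every MPI rank
>>     // still stores the complete triangulation.
>>     GridTools::partition_triangulation(
>>       Utilities::MPI::n_mpi_processes(MPI_COMM_WORLD), tria);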
>>
>> I want to ask whether I can use 
>> parallel::fullydistributed::Triangulation to partition the tetrahedral 
>> mesh without replicating it on all processors. If so, are there any 
>> examples of this? All the examples I have seen work with hypercube-type 
>> meshes.
>>
>> Thanks,
>> Kumar Saurabh
>>
>
