Sure, please take a look at the modified introduction example 3
<https://github.com/bboutkov/libmesh/tree/distmesh_linearpart_fail>, which
should trip the assert in question when run with -np 2.
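For reference, the setup described below (manually attaching a LinearPartitioner to a DistributedMesh so that it is used when prepare_for_use() calls partition()) can be sketched roughly as follows. This is a minimal illustration, not the branch linked above: the mesh dimensions come from the 4x4 reproduction note in the thread, and the exact point of attachment in the real reproducer may differ.

```cpp
// Minimal sketch (assumption, not the linked branch): force a
// DistributedMesh to use LinearPartitioner instead of the default
// ParMETIS-backed partitioner.  Requires a libMesh build with MPI;
// run with e.g. `mpiexec -np 2 ./example`.
#include "libmesh/libmesh.h"
#include "libmesh/distributed_mesh.h"
#include "libmesh/linear_partitioner.h"
#include "libmesh/mesh_generation.h"

using namespace libMesh;

int main (int argc, char ** argv)
{
  LibMeshInit init (argc, argv);

  DistributedMesh mesh (init.comm());

  // Swap in the LinearPartitioner *before* the mesh is built, so that
  // prepare_for_use() (invoked inside build_square) picks it up when
  // it calls partition().
  mesh.partitioner().reset(new LinearPartitioner);

  // Per the thread, a mesh as small as 4x4 suffices for debugging.
  MeshTools::Generation::build_square (mesh, 4, 4);

  return 0;
}
```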


On Thu, Nov 9, 2017 at 12:40 PM, Paul T. Bauman <ptbau...@gmail.com> wrote:

> Boris, can you please send Roy a standalone libMesh example that he can
> just compile and run and trip the assert? Thanks,
>
> On Thu, Nov 9, 2017 at 12:22 PM, Boris Boutkov <boris...@buffalo.edu>
> wrote:
>
>> Having a simple DistributedMesh partitioner to compare vs ParMETIS would
>> certainly be very useful in driving the original issue forward.
>>
>> The LinearPartitioner assert trip does appear reproducible with
>> gcc 7.2/MPICH 3.2 when running the refinement example
>> <https://github.com/bboutkov/grins/tree/master_refine_test> from before.
>> The initial mesh can be made as small as 4x4 for easier debugging, and,
>> to be totally explicit, I'm building off libMesh master and manually
>> attaching the partitioner right before partition() in prepare_for_use().
>>
>> Thanks for the help -
>>
>> On Thu, Nov 9, 2017 at 10:50 AM, Roy Stogner <royst...@ices.utexas.edu>
>> wrote:
>>
>>>
>>> On Thu, 9 Nov 2017, Boris Boutkov wrote:
>>>
>>>> Well, I eliminated PETSc and have been linking to MPI using
>>>> --with-mpi=$MPI_DIR and playing with the refinement example I had
>>>> mentioned earlier to try to eliminate ParMETIS due to the
>>>> hang/crash issue. In these configs I attach either the
>>>> LinearPartitioner or an SFC in prepare_for_use right before calling
>>>> partition(). This causes assert trips in MeshComm::Redistribute
>>>> where elem.proc_id != proc_id while unpacking elems (stack below).
>>>>
>>>
>>> Shoot - I don't think either of those partitioners have been upgraded
>>> to be compatible with DistributedMesh use.  Just glancing at
>>> LinearPartitioner, it looks like it'll do fine for an *initial*
>>> partitioning, but then it'll scramble everything if it's ever asked to
>>> do a *repartitioning* on an already-distributed mesh.
>>>
>>> I could probably fix that pretty quickly, if you've got a test case I
>>> can replicate.
>>>
>>> SFC, on the other hand, I don't know about.  We do distributed
>>> space-filling-curve stuff elsewhere in the library with libHilbert,
>>> and it's not trivial.
>>> ---
>>> Roy
>>>
>>
>>
>> ------------------------------------------------------------------------------
>> Check out the vibrant tech community on one of the world's most
>> engaging tech sites, Slashdot.org! http://sdm.link/slashdot
>> _______________________________________________
>> Libmesh-devel mailing list
>> Libmesh-devel@lists.sourceforge.net
>> https://lists.sourceforge.net/lists/listinfo/libmesh-devel
>>
>>
>
