I also recently noticed that MUMPS sometimes segfaults (or returns NaNs) in
parallel on matrices coming from higher-order FEM (relatively dense
blocks). The fault goes away if ScaLAPACK is not used to solve the root
node of the nested dissection tree. The same happens with a MUMPS build
from a package manager instead of --download-mumps, so this is probably a
regression on their side.
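
In case it's useful, a sketch of that workaround through the PETSc options
database (assuming a recent PETSc and its usual MUMPS option names; MUMPS
factors the root node sequentially, i.e. without ScaLAPACK, when
ICNTL(13) > 0):

  -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_13 1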

On Sat, Oct 6, 2018, 04:22 David Knezevic <david.kneze...@akselos.com>
wrote:

>
> On Fri, Oct 5, 2018 at 9:16 PM Mike Wick <michael.wick.1...@gmail.com>
> wrote:
>
>> Hi Hong:
>>
>> There is no explicit error message, actually. The solver converges in 0
>> iterations and returns NaN. It looks like a zero-pivoting problem.
>>
>> Mike
>>
>
> I use MatMumpsGetInfo
> <https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatMumpsGetInfo.html>
> to get the error info returned by MUMPS, e.g.
>
> PetscInt info_1;
> MatMumpsGetInfo(pc_mat, 1, &info_1); /* pc_mat is the factored Mat, e.g. from PCFactorGetMatrix() */
>
> Then you can check the value of info_1 for error diagnostics. One error I
> run into sometimes is info_1 = -9 (the MUMPS workspace is too small), in
> which case I increase ICNTL(14) and try again.
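>
> A minimal sketch of that check-and-retry (a hypothetical fragment, not a
> complete program; it assumes pc is the PC of your KSP and that the
> factorization has already been attempted):
>
> Mat F;
> PetscInt info_1;
> PCFactorGetMatrix(pc, &F);          /* F wraps the MUMPS factorization */
> MatMumpsGetInfo(F, 1, &info_1);     /* INFO(1): 0 on success, < 0 on error */
> if (info_1 == -9) {                 /* -9: MUMPS workspace too small */
>   MatMumpsSetIcntl(F, 14, 40);      /* grow the workspace estimate by 40% */
>   /* ...then factor and solve again */
> }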
>
> Best,
> David
>
>
>
>>
>> On Fri, Oct 5, 2018 at 6:12 PM Zhang, Hong <hzh...@mcs.anl.gov> wrote:
>>
>>> Mike:
>>>
>>>> Hello PETSc team:
>>>>
>>>> I am trying to solve a PDE problem with high-order finite elements. The
>>>> matrix is getting denser and my experience is that MUMPS just outperforms
>>>> iterative solvers.
>>>>
>>>> For certain problems, MUMPS just fails in the middle for no clear
>>>> reason. I wonder if there are any suggestions to improve the robustness
>>>> of MUMPS? Or, in general, any suggestions for iterative solvers with
>>>> very high-order finite elements?
>>>>
>>>
>>> What error message do you get when MUMPS fails? Out of memory, zero
>>> pivoting, or something else?
>>>  Hong
>>>
>>
