Re: [petsc-users] Failure of MUMPS

2018-10-09 Thread Zhang, Junchao
OK, I found that -ksp_error_if_not_converged will make PETSc error out in this case.
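
A minimal programmatic sketch of the same behavior, assuming a standard KSP
setup (the ksp/A handles and the function name below are placeholders, not
from the original run):

  #include <petscksp.h>

  /* Sketch: make the solve abort when the factorization (e.g. MUMPS) or the
     KSP reports a failure, instead of continuing silently. */
  static PetscErrorCode AbortOnSolverFailure(KSP ksp, Mat A)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPSetErrorIfNotConverged(ksp, PETSC_TRUE);CHKERRQ(ierr); /* -ksp_error_if_not_converged */
    ierr = MatSetErrorIfFailure(A, PETSC_TRUE);CHKERRQ(ierr);        /* -mat_error_if_failure */
    PetscFunctionReturn(0);
  }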

--Junchao Zhang


On Tue, Oct 9, 2018 at 3:38 PM Junchao Zhang <jczh...@mcs.anl.gov> wrote:
I met a case where MUMPS returned an out-of-memory error code but PETSc continued
to run. When PETSc calls MUMPS, it checks if (A->erroriffailure). I added
-mat_error_if_failure, but it did not work since the flag was overwritten by
MatSetErrorIfFailure(pc->pmat,pc->erroriffailure).
Does this suggest we should add a new option -pc_factor_error_if_failure and
check it in PCSetFromOptions_Factor()?

--Junchao Zhang

On Fri, Oct 5, 2018 at 8:12 PM Zhang, Hong <hzh...@mcs.anl.gov> wrote:
Mike:
Hello PETSc team:

I am trying to solve a PDE problem with high-order finite elements. The matrix 
is getting denser and my experience is that MUMPS just outperforms iterative 
solvers.

For certain problems, MUMPS just fails in the middle for no clear reason. I just
wonder if there are any suggestions to improve the robustness of MUMPS? Or, in
general, any suggestions for iterative solvers with very high-order finite
elements?

What error message do you get when MUMPS fails? Out of memory, zero pivoting, 
or something?
 Hong


Re: [petsc-users] Problems about Block Jacobi and Matrix-Free Method in SNES

2018-10-09 Thread Matthew Knepley
On Mon, Oct 8, 2018 at 11:33 PM Yingjie Wu  wrote:

> Dear Petsc developer:
> Hi,
>
> I've been studying PETSc recently, in particular preconditioners and
> matrix-free methods, and I have some questions that puzzle me.
>
> 1. I want to test the block Jacobi preconditioner, so I chose
> /snes/example/tutorial/ex3.c as an example. According to the reference in
> the example, the input parameters are:
> mpiexec -n 8 ./ex3 -nox -n 1 -ksp_type fgmres -pc_type bjacobi
> -pc_bjacobi_blocks 4 -sub_ksp_type gmres -sub_ksp_max_it 3 -post_setsubksp
> -sub_ksp_rtol 1.e-16
>

You do not care about recursive blocks, so just use

  $MPIEXEC -n 8 ./ex3 -nox -n 1 -ksp_type fgmres -pc_type bjacobi
-pc_bjacobi_blocks 4 -sub_ksp_type gmres -sub_ksp_max_it 3 -snes_view
-ksp_view

and I get the attached output.
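
If you want the per-block information programmatically as well, here is a
minimal sketch (assuming the outer KSP uses -pc_type bjacobi and KSPSetUp()
has already been called; the function name is a placeholder):

  #include <petscksp.h>

  /* Sketch: view each local sub-KSP (and hence its sub-PC) of a block
     Jacobi preconditioner. */
  static PetscErrorCode ViewBJacobiSubKSPs(KSP ksp)
  {
    PC             pc;
    KSP           *subksp;
    PetscInt       nlocal, first, i;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCBJacobiGetSubKSP(pc, &nlocal, &first, &subksp);CHKERRQ(ierr);
    for (i = 0; i < nlocal; i++) {
      ierr = KSPView(subksp[i], PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr);
    }
    PetscFunctionReturn(0);
  }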


> I want to export each block of KSP and PC information :
>   -snes_view -ksp_view
> However, the procedure is wrong and the wrong information is as follows:
>
>[0]PETSC ERROR: - Error Message
> --
> [0]PETSC ERROR: Arguments must have same communicators
> [0]PETSC ERROR: Different communicators in the two objects: Argument # 1
> and 2 flag 3
> [0]PETSC ERROR: [1]PETSC ERROR: - Error Message
> --
> [2]PETSC ERROR: - Error Message
> --
> [2]PETSC ERROR: [3]PETSC ERROR: - Error Message
> --
> [3]PETSC ERROR: Arguments must have same communicators
> [3]PETSC ERROR: [6]PETSC ERROR: - Error Message
> --
> [6]PETSC ERROR: Arguments must have same communicators
> [6]PETSC ERROR: Different communicators in the two objects: Argument # 1
> and 2 flag 3
> [6]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [7]PETSC ERROR: - Error Message
> --
> [7]PETSC ERROR: Arguments must have same communicators
> [7]PETSC ERROR: Different communicators in the two objects: Argument # 1
> and 2 flag 3
> [7]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [7]PETSC ERROR: Petsc Release Version 3.10.1, Sep, 26, 2018
> See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
> shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.1, Sep, 26, 2018
> [0]PETSC ERROR: ./ex3 on a arch-linux2-c-debug named yjwu-XPS-8910 by yjwu
> Mon Oct  8 22:35:34 2018
> [0]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --download-mpich --download-fblaslapack
> [0]PETSC ERROR: #1 KSPView() line 118 in
> /home/yjwu/petsc-3.10.1/src/ksp/ksp/interface/itcreate.c
> [0]PETSC ERROR: #2 PCView_BJacobi() line 232 in
> /home/yjwu/petsc-3.10.1/src/ksp/pc/impls/bjacobi/bjacobi.c
> [0]PETSC ERROR: #3 PCView() line 1651 in
> /home/yjwu/petsc-3.10.1/src/ksp/pc/interface/precon.c
> [1]PETSC ERROR: Arguments must have same communicators
> [1]PETSC ERROR: Different communicators in the two objects: Argument # 1
> and 2 flag 3
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.10.1, Sep, 26, 2018
> [1]PETSC ERROR: ./ex3 on a arch-linux2-c-debug named yjwu-XPS-8910 by yjwu
> Mon Oct  8 22:35:34 2018
> [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++
> --with-fc=gfortran --download-mpich --download-fblaslapack
> [1]PETSC ERROR: #1 KSPView() line 118 in
> /home/yjwu/petsc-3.10.1/src/ksp/ksp/interface/itcreate.c
> Arguments must have same communicators
> [2]PETSC ERROR: Different communicators in the two objects: Argument # 1
> and 2 flag 3
> [2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
>
> If I want to get the information of KSP and PC in each block, what should
> I do?
>
> 2. There is a requirement in my program to use a matrix-free method to
> approximate the Jacobian matrix by finite differences of the residual
> function when solving the nonlinear equations. But I will also provide an
> analytic Jacobian (incomplete; some terms are missing or approximated) for
> preconditioning. Because my problem is a large set of equations composed
> of several physical fields, I want to use a block Jacobi preconditioner
> over the subfields (blocks), with an ILU sub_pc for each subfield. After
> reading the Users' Guide, I found that using -snes_mf_operator can do the
> above, so I added:
>  -snes_mf_operator
>

You need to be careful what matrix you are adding values to in your
FormJacobian() routine. The primal matrix J is of type MFFD (finite
difference)
and thus cannot accept values. You put your approximate values in the
preconditioner M.
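
A minimal sketch of that pattern (placeholder code, not from the original
program): with -snes_mf_operator, SNES passes the MFFD operator as J and the
user-assembled matrix as the preconditioning matrix B, so the approximate
analytic entries go into B only.

  #include <petscsnes.h>

  /* Sketch: assemble the approximate analytic Jacobian into B; J is the
     matrix-free (MFFD) operator and cannot accept values. */
  static PetscErrorCode FormJacobian(SNES snes, Vec x, Mat J, Mat B, void *ctx)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    /* ... compute the approximate entries and insert them into B,
       e.g. with MatSetValues(B, ...) ... */
    ierr = MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    if (J != B) { /* the MFFD operator still needs its assembly calls */
      ierr = MatAssemblyBegin(J, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = MatAssemblyEnd(J, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    }
    PetscFunctionReturn(0);
  }

The routine would be registered with SNESSetJacobian().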

  Thanks,

Matt


> after the 

Re: [petsc-users] Increasing norm with finer mesh

2018-10-09 Thread Mark Adams
To reiterate what Matt is saying, you seem to have the exact solution on a
10x10 grid. That makes no sense unless the solution can be represented
exactly by your FE space (e.g., u(x,y) = x + y).

On Mon, Oct 8, 2018 at 9:33 PM Matthew Knepley  wrote:

> On Mon, Oct 8, 2018 at 9:28 PM Weizhuo Wang  wrote:
>
>> The code is attached in case anyone wants to take a look; I will try the
>> high-frequency scenario later.
>>
>
> That is not the error. It is superconvergence at the vertices. The real
> solution is trigonometric, so your linear interpolants (or whatever you use)
> are not going to get the right value in between mesh points. You need to do
> a real integral over the whole interval to get the L_2 error (a quadrature
> sketch is appended at the end of this message).
>
>   Thanks,
>
>  Matt
>
>
>> On Mon, Oct 8, 2018 at 7:58 PM Mark Adams  wrote:
>>
>>>
>>>
>>> On Mon, Oct 8, 2018 at 6:58 PM Weizhuo Wang 
>>> wrote:
>>>
 The first plot is the norm with the flag -pc_type lu with respect to the
 number of grid points along one axis (n), and the second plot is the norm
 without the flag -pc_type lu.

>>>
>>> So you are using the default PC without LU. The default is ILU. This will
>>> reduce the high-frequency error effectively but is not effective on the
>>> low-frequency error. Don't expect your algebraic error reduction to be at
>>> the same scale as the residual reduction (which is what KSP measures).
>>>
>>>

>>
>> --
>> Wang Weizhuo
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>
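
Regarding the L_2 error point above, a minimal 1D sketch (not from the thread;
the exact solution and the piecewise-linear u_h are assumptions) of computing
the error with Gauss quadrature on each element rather than sampling only at
the vertices:

  #include <math.h>

  static double u_exact(double x) { return sin(M_PI * x); } /* assumed exact solution */

  /* L_2 error of a piecewise-linear u_h given at n equispaced nodes with
     spacing h, using 2-point Gauss quadrature on each element. */
  static double L2Error(const double *uh, int n, double h)
  {
    const double q[2] = {-1.0 / sqrt(3.0), 1.0 / sqrt(3.0)}; /* Gauss points on [-1,1] */
    double       err2 = 0.0;
    int          e, k;

    for (e = 0; e < n - 1; e++) {                 /* element [x_e, x_{e+1}] */
      for (k = 0; k < 2; k++) {
        double xi  = 0.5 * (q[k] + 1.0);          /* map Gauss point to [0,1] */
        double x   = (e + xi) * h;
        double uhx = (1.0 - xi) * uh[e] + xi * uh[e + 1]; /* linear interpolant */
        double d   = uhx - u_exact(x);
        err2 += 0.5 * h * d * d;                  /* Gauss weight 1, Jacobian h/2 */
      }
    }
    return sqrt(err2);
  }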