[petsc-users] Help using FAS as an initial guess

2023-05-02 Thread Takahashi, Tadanaga
Hi,

I want to know how to configure FAS so that it solves the problem on a
coarse grid of size 4h, interpolates the solution to the fine grid, and then stops.

Here is the context: I am using Newton LS to solve a problem on a square
domain discretized with a DMDA with step size h. I have a subroutine that
computes the initial guess. I want this subroutine to first do a Newton
solve on a coarse grid of size 4h and then interpolate that solution onto
the main mesh. I think this is achievable with one iteration of FAS. Below
is the gist of what my subroutine looks like:

PetscErrorCode InitialState(DM da, Vec u) {
   SNES snes;
   SNESCreate(PETSC_COMM_WORLD,&snes);
   SNESSetDM(snes,da);
   SNESSetType(snes,SNESFAS);      // solve with multigrid
   SNESSetTolerances(snes,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT,1,PETSC_DEFAULT); // just one iteration
   VecSet(u,0.0);                  // start with zeros
   SNESSolve(snes,NULL,u);         // cheap solve
   SNESGetSolution(snes,&u);       // extract solution
   SNESDestroy(&snes);
   return 0;
}

For some reason, my initial guess is too accurate. The initial guess
produced by this subroutine looks exactly like the final solution. My guess
is that it's doing more than what I want it to do. I'm still new to FAS.
How can I tell FAS to do just one crude solve and an interpolation?
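For reference, this is the kind of thing I am trying to reproduce, written out by
hand without SNESFAS (a rough sketch, untested; it assumes the DMDA coarsens
cleanly from h to 2h to 4h, and that the residual attached with
DMDASNESSetFunctionLocal is inherited by the coarsened DMs; if it is not, it would
have to be attached to da4h explicitly):

PetscErrorCode InitialStateByHand(DM da, Vec u) {
   DM   da2h, da4h;
   SNES csnes;
   Mat  interp;
   Vec  uc;
   DMCoarsen(da,   MPI_COMM_NULL, &da2h);   // h  -> 2h
   DMCoarsen(da2h, MPI_COMM_NULL, &da4h);   // 2h -> 4h
   SNESCreate(PETSC_COMM_WORLD,&csnes);     // plain Newton on the 4h grid
   SNESSetDM(csnes,da4h);
   SNESSetType(csnes,SNESNEWTONLS);
   SNESSetFromOptions(csnes);
   DMCreateGlobalVector(da4h,&uc);
   VecSet(uc,0.0);
   SNESSolve(csnes,NULL,uc);                // crude coarse solve
   DMCreateInterpolation(da4h,da,&interp,NULL);
   MatInterpolate(interp,uc,u);             // coarse solution -> fine-grid initial guess
   MatDestroy(&interp);
   VecDestroy(&uc);
   SNESDestroy(&csnes);
   DMDestroy(&da4h);
   DMDestroy(&da2h);
   return 0;
}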


Re: [petsc-users] Question about NASM initialization

2023-04-06 Thread Takahashi, Tadanaga
Ok, thanks for the clarification.

On Thu, Apr 6, 2023 at 10:25 AM Matthew Knepley  wrote:

> On Thu, Apr 6, 2023 at 10:21 AM Takahashi, Tadanaga  wrote:
>
>> I am following up from the last inquiry. I read the source code nasm.c
>> <https://petsc.org/release/src/snes/impls/nasm/nasm.c.html> and it looks
>> like sub-snes iteration is being initialized with a scatter call from the
>> previous solution. In other words, if I use Newton's method for the local
>> solver, then in each NASM iteration the Newton's method uses the previous
>> local solution as the initial guess. Can anyone confirm this?
>>
>
> This is the intention. There are not many tests, so it is possible there
> is a bug, but it is supposed to use the existing solution.
>
>   Thanks,
>
> Matt
>
>
>> On Sun, Apr 2, 2023 at 6:14 PM Takahashi, Tadanaga  wrote:
>>
>>> Hello PETSc devs,
>>>
>>> I am using SNES NASM with Newton LS on the sub-SNES. I was wondering how
>>> the sub-SNES chooses the initial guess during each NASM iteration. Is it
>>> using the previously computed solution or is it restarting from zero?
>>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> <http://www.cse.buffalo.edu/~knepley/>
>


Re: [petsc-users] Question about NASM initialization

2023-04-06 Thread Takahashi, Tadanaga
I am following up from the last inquiry. I read the source code nasm.c
<https://petsc.org/release/src/snes/impls/nasm/nasm.c.html> and it looks
like sub-snes iteration is being initialized with a scatter call from the
previous solution. In other words, if I use Newton's method for the local
solver, then in each NASM iteration the Newton's method uses the previous
local solution as the initial guess. Can anyone confirm this?

On Sun, Apr 2, 2023 at 6:14 PM Takahashi, Tadanaga  wrote:

> Hello PETSc devs,
>
> I am using SNES NASM with Newton LS on the sub-SNES. I was wondering how
> the sub-SNES chooses the initial guess during each NASM iteration. Is it
> using the previously computed solution or is it restarting from zero?
>


[petsc-users] Question about NASM initialization

2023-04-02 Thread Takahashi, Tadanaga
Hello PETSc devs,

I am using SNES NASM with Newton LS on the sub-SNES. I was wondering how
the sub-SNES chooses the initial guess during each NASM iteration. Is it
using the previously computed solution or is it restarting from zero?


[petsc-users] How to get total subsnes iterations

2022-10-12 Thread Takahashi, Tadanaga
Hi. I am using the snes nasm for the global solver and snes newtonls for
the local subdomain solver. I am trying to get the total number of Newton
iterations for just one subdomain. I've tried:

SNESNASMGetSNES(snes,0,&subsnes);
SNESSolve(snes,NULL,u_initial);
SNESGetNumberFunctionEvals(subsnes,&newton_its);

but this only gives me the count from the final NASM iteration. If I
understand correctly, the information in the subsnes is reset each time it
is solved during the outer SNESSolve. Is there any way to extract the total
subsnes iterations after the SNESSolve? If not, how would I extract this
information?
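One workaround I am considering (just a sketch, not an established PETSc recipe;
CountSubIts and total_sub_its are my own names) is to attach a monitor to the local
solver after SNESSetUp and accumulate the count myself:

static PetscInt total_sub_its = 0;

// called once per sub-SNES iteration; it==0 is the initial residual of each local solve
static PetscErrorCode CountSubIts(SNES subsnes, PetscInt it, PetscReal fnorm, void *ctx) {
   if (it > 0) total_sub_its++;
   return 0;
}

...
SNESSetUp(snes);                               // the sub-SNES exists after this
SNESNASMGetSNES(snes,0,&subsnes);
SNESMonitorSet(subsnes,CountSubIts,NULL,NULL);
SNESSolve(snes,NULL,u_initial);
PetscPrintf(PETSC_COMM_SELF,"subdomain Newton iterations on this rank: %d\n",(int)total_sub_its);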


Re: [petsc-users] Customizing NASM subsnes

2022-06-17 Thread Takahashi, Tadanaga
Ahh, I understand now. I got rid of the loop. Adding
SNESSetFromOptions(subsnes) right after
SNESSetOptionsPrefix(subsnes,prefix) did not fix the issue.
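For clarity, this is the consolidated shape of what I am now running (a sketch of
the loop-free version with the suggested SNESSetFromOptions call added):

   PetscMPIInt rank, size;
   char        prefix[16];
   MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
   MPI_Comm_size(PETSC_COMM_WORLD,&size);
   SNESNASMGetSNES(snes,0,&subsnes);            // the one block owned by this rank
   sprintf(prefix,"sub_%d_",rank);
   SNESSetOptionsPrefix(subsnes,prefix);
   if (rank < size-1) {
      SNESSetType(subsnes,SNESNEWTONLS);        // newton for regular domains
   } else {
      SNESSetType(subsnes,SNESFAS);             // fas for the last domain
   }
   SNESSetFromOptions(subsnes);                 // intended to pick up -sub_<rank>_... options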

On Fri, Jun 17, 2022 at 11:02 AM Barry Smith  wrote:

>
>   You do not need the loop over size. Each rank sets the options and the
> options prefix for its own local objects and never anyone else's.
>
>    char prefix[10];
>    sprintf(prefix,"sub_%d_",rank);
>
>    SNESNASMGetSNES(snes,0,&subsnes);
>    if (rank < size-1) {
>       ierr = SNESSetType(subsnes,SNESNEWTONLS); CHKERRQ(ierr); // newton for regular domains
>    } else {
>       ierr = SNESSetType(subsnes,SNESFAS); CHKERRQ(ierr);      // fas for last domain
>    }
>    SNESSetOptionsPrefix(subsnes,prefix);
>
>
>    To get the prefix to work, try calling SNESSetFromOptions(subsnes)
> immediately after your SNESSetOptionsPrefix(subsnes,prefix) call.
>
>    Matt, it looks like there may be a bug in NASM: except in one
> particular case, it never calls SNESSetFromOptions() on the sub-SNESes.
>
>   Barry
>
>
> On Jun 17, 2022, at 10:47 AM, Takahashi, Tadanaga  wrote:
>
> Thank you. I am now able to pull each subsnes, change its snes type
> through the API, and set a prefix. This is my updated code:
>
>    SNES        snes, subsnes;
>    PetscMPIInt rank, size;
>    ...
>    ierr = SNESCreate(PETSC_COMM_WORLD,&snes); CHKERRQ(ierr);
>    ierr = SNESSetType(snes,SNESNASM); CHKERRQ(ierr);
>    ierr = SNESNASMSetType(snes,PC_ASM_RESTRICT); CHKERRQ(ierr);
>    ...
>    ierr = SNESSetFromOptions(snes); CHKERRQ(ierr);
>    ierr = SNESSetUp(snes); CHKERRQ(ierr);
>    PetscPrintf(PETSC_COMM_WORLD, "Size = %d\n",size);
>    PetscBarrier(NULL);
>    for (i=0; i<size; i++) {
>       char prefix[10];
>       sprintf(prefix,"sub_%d_",i);
>       if (i==rank) {
>          ierr = SNESNASMGetNumber(snes,&Nd);
>          printf("rank = %d has %d block(s)\n",i,Nd);
>          SNESNASMGetSNES(snes,0,&subsnes);
>          if (i<size-1) {
>             ierr = SNESSetType(subsnes,SNESNEWTONLS); CHKERRQ(ierr); // newton for regular domains
>          } else {
>             ierr = SNESSetType(subsnes,SNESFAS); CHKERRQ(ierr);      // fas for last domain
>          }
>          SNESSetOptionsPrefix(subsnes,prefix);
>       }
>    }
>ierr = SNESSetFromOptions(snes); CHKERRQ(ierr);
>...
>ierr = SNESSolve(snes,NULL,u_initial); CHKERRQ(ierr);
>
> However, I still cannot change SNES, KSP, and PC types for
> individual domains through the command arguments. I checked the subdomains
> with -snes_view ::ascii_info_detail and it does show that the prefixes
> are properly changed. It also shows that the SNES type for the last domain
> was successfully changed. But for some reason, I only have access to the
> SNES viewer options during runtime. For example, if I run mpiexec -n 4
> ./test1 -sub_0_ksp_type gmres -help | grep sub_0 I get the output:
>
> Viewer (-sub_0_snes_convergence_estimate) options:
>   -sub_0_snes_convergence_estimate ascii[:[filename][:[format][:append]]]:
> Prints object to stdout or ASCII file (PetscOptionsGetViewer)
>   -sub_0_snes_convergence_estimate
> binary[:[filename][:[format][:append]]]: Saves object to a binary file
> (PetscOptionsGetViewer)
>   -sub_0_snes_convergence_estimate draw[:[drawtype][:filename|format]]
> Draws object (PetscOptionsGetViewer)
>   -sub_0_snes_convergence_estimate socket[:port]: Pushes object to a Unix
> socket (PetscOptionsGetViewer)
>   -sub_0_snes_convergence_estimate saws[:communicatorname]: Publishes
> object to SAWs (PetscOptionsGetViewer)
> Viewer (-sub_0_snes_view_pre) options:
>   -sub_0_snes_view_pre ascii[:[filename][:[format][:append]]]: Prints
> object to stdout or ASCII file (PetscOptionsGetViewer)
>   -sub_0_snes_view_pre binary[:[filename][:[format][:append]]]: Saves
> object to a binary file (PetscOptionsGetViewer)
>   -sub_0_snes_view_pre draw[:[drawtype][:filename|format]] Draws object
> (PetscOptionsGetViewer)
>   -sub_0_snes_view_pre socket[:port]: Pushes object to a Unix socket
> (PetscOptionsGetViewer)
>   -sub_0_snes_view_pre saws[:communicatorname]: Publishes object to SAWs
> (PetscOptionsGetViewer)
> Viewer (-sub_0_snes_test_jacobian_view) options:
>   -sub_0_snes_test_jacobian_view ascii[:[filename][:[format][:append]]]:
> Prints object to stdout or ASCII file (PetscOptionsGetViewer)
>   -sub_0_snes_test_jacobian_view binary[:[filename][:[format][:append]]]:
> Saves object to a binary file (PetscOptionsGetViewer)
>   -sub_0_snes_test_jacobian_view draw[:[drawtype][:filename|format]] Draws
> object (PetscOptionsGetViewer)
>   -sub_0_snes_test_jacobian_view socket[:port]: Pushes object to a Unix
> socket (PetscOptionsGetViewer)

Re: [petsc-users] Customizing NASM subsnes

2022-06-17 Thread Takahashi, Tadanaga
Thank you. I am now able to pull each subsnes, change its snes type through
the API, and set a prefix. This is my updated code:

   SNES        snes, subsnes;
   PetscMPIInt rank, size;
   ...
   ierr = SNESCreate(PETSC_COMM_WORLD,&snes); CHKERRQ(ierr);
   ierr = SNESSetType(snes,SNESNASM); CHKERRQ(ierr);
   ierr = SNESNASMSetType(snes,PC_ASM_RESTRICT); CHKERRQ(ierr);
   ...
   ierr = SNESSetFromOptions(snes); CHKERRQ(ierr);
   ierr = SNESSetUp(snes); CHKERRQ(ierr);
   PetscPrintf(PETSC_COMM_WORLD, "Size = %d\n",size);
   PetscBarrier(NULL);
   for (i=0; i<size; i++) {
      char prefix[10];
      sprintf(prefix,"sub_%d_",i);
      if (i==rank) {
         ierr = SNESNASMGetNumber(snes,&Nd);
         printf("rank = %d has %d block(s)\n",i,Nd);
         SNESNASMGetSNES(snes,0,&subsnes);
         if (i<size-1) {
            ierr = SNESSetType(subsnes,SNESNEWTONLS); CHKERRQ(ierr); // newton for regular domains
         } else {
            ierr = SNESSetType(subsnes,SNESFAS); CHKERRQ(ierr);      // fas for last domain
         }
         SNESSetOptionsPrefix(subsnes,prefix);
      }
   }
   ierr = SNESSetFromOptions(snes); CHKERRQ(ierr);
   ...
   ierr = SNESSolve(snes,NULL,u_initial); CHKERRQ(ierr);

>
>  MPI_Comm_size(PETSC_COMM_WORLD,&size);
>  MPI_Comm_rank(PETSC_COMM_WORLD,&rank);
>
>  SNESNASMGetSNES(snes,0,&subsnes);
>>  char prefix[10];
>>  sprintf(prefix,"sub_%d_",rank);
>>  SNESSetOptionsPrefix(subsnes,prefix);
>>
>
>
>
> On Jun 17, 2022, at 9:35 AM, Matthew Knepley  wrote:
>
> On Fri, Jun 17, 2022 at 9:22 AM Takahashi, Tadanaga  wrote:
>
>> I'm having some trouble pulling out the subsolver. I tried to use
>> SNESNASMGetSNES in a loop over each subdomain. However I get an error when
>> I run the code with more than one MPI processors. Here is a snippet from my
>> code:
>>
>>    SNES        snes, subsnes;
>>    PetscMPIInt rank, size;
>>    ...
>>    ierr = SNESCreate(PETSC_COMM_WORLD,&snes); CHKERRQ(ierr);
>>    ierr = SNESSetType(snes,SNESNASM); CHKERRQ(ierr);
>>    ierr = SNESNASMSetType(snes,PC_ASM_RESTRICT); CHKERRQ(ierr);
>>    ...
>>    ierr = SNESSetFromOptions(snes); CHKERRQ(ierr);
>>    ierr = SNESSetUp(snes); CHKERRQ(ierr);
>>    PetscPrintf(PETSC_COMM_WORLD, "Size = %d\n",size);
>>    for (i=0; i<size; i++) {
>>       PetscPrintf(PETSC_COMM_WORLD, "rank = %d\n",i);
>>       SNESNASMGetSNES(snes,i,&subsnes);
>>       // char prefix[10];
>>       // sprintf(prefix,"sub_%d_",i);
>>       // SNESSetOptionsPrefix(subsnes,prefix);
>>    }
>>...
>>ierr = SNESSolve(snes,NULL,u_initial); CHKERRQ(ierr);
>>
>>
>> And, here is the output of the code when I run with 2 MPI procs:
>>
>
> SNESNASMGetSNES() gets the local subsolvers. It seems you only have one
> per process.
> You can check
> https://petsc.org/main/docs/manualpages/SNES/SNESNASMGetNumber/
>
> Notice that your current code will not work because, according to your
> explanation, you only want to change
> the prefix on a single rank, so you need to check the rank when you do it.
>
>   Thanks,
>
>  Matt
>
>
>> takahashi@ubuntu:~/Desktop/MA-DDM/C/Rectangle$ mpiexec -n 2 ./test1
>> Size = 2
>> rank = 0
>> rank = 1
>> [0]PETSC ERROR: - Error Message
>> --
>> [0]PETSC ERROR: Argument out of range
>> [0]PETSC ERROR: No such subsolver
>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.17.1, unknown
>> [0]PETSC ERROR: ./test1 on a linux-gnu-c-debug named ubuntu by takahashi
>> Fri Jun 17 06:06:38 2022
>> [0]PETSC ERROR: Configure options --with-mpi-dir=/usr --with-fc=0
>> [0]PETSC ERROR: #1 SNESNASMGetSNES() at
>> /home/takahashi/Desktop/petsc/src/snes/impls/nasm/nasm.c:923
>>
>>
>> ===
>> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
>> =   RANK 0 PID 976566 RUNNING AT ubuntu
>> =   KILLED BY SIGNAL: 9 (Killed)
>>
>> ===
>>
>> This error doesn't occur when I run this without MPI. However, I tried to
>> change the prefix of the subdomain to `sub_0_` but I am not able to change
>> the snes_type using this prefix. Running ./test1 -snes_view -help | grep
>> sub_0_snes_type prints nothing.
>>
>> On Thu, Jun 16, 2022 at 6:23 PM Matthew Knepley 
>> wrote:
>>
>>> On Thu, Jun 16, 2022 at 5:57 PM tt73  wrote:
>>>
>>>>
>>>> Hi,
>>>>
>>>> I am using  NASM as the outer solver for a nonlinear problem. For one
>>>> of the subdomains, I want to run the local solve with a different set of
>>>> options from the others. Is there any way to set options for each
>>>> subdomain?
>>>>
>>>
>>> I can see two ways:
>>>
>>>   1) Pull out the subsolver and set it using the API
>>>
>>>   2) Pull out the subsolver and give it a different prefix
>>>
>>>   Thanks,
>>>
>>>  Matt
>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>> <http://www.cse.buffalo.edu/~knepley/>
>>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> <http://www.cse.buffalo.edu/~knepley/>
>
>
>


Re: [petsc-users] Customizing NASM subsnes

2022-06-17 Thread Takahashi, Tadanaga
I'm having some trouble pulling out the subsolver. I tried to use
SNESNASMGetSNES in a loop over each subdomain. However I get an error when
I run the code with more than one MPI processors. Here is a snippet from my
code:

   SNES        snes, subsnes;
   PetscMPIInt rank, size;
   ...
   ierr = SNESCreate(PETSC_COMM_WORLD,&snes); CHKERRQ(ierr);
   ierr = SNESSetType(snes,SNESNASM); CHKERRQ(ierr);
   ierr = SNESNASMSetType(snes,PC_ASM_RESTRICT); CHKERRQ(ierr);
   ...
   ierr = SNESSetFromOptions(snes); CHKERRQ(ierr);
   ierr = SNESSetUp(snes); CHKERRQ(ierr);
   PetscPrintf(PETSC_COMM_WORLD, "Size = %d\n",size);
   for (i=0; i<size; i++) {
      PetscPrintf(PETSC_COMM_WORLD, "rank = %d\n",i);
      SNESNASMGetSNES(snes,i,&subsnes);
      // char prefix[10];
      // sprintf(prefix,"sub_%d_",i);
      // SNESSetOptionsPrefix(subsnes,prefix);
   }
   ...
   ierr = SNESSolve(snes,NULL,u_initial); CHKERRQ(ierr);

And, here is the output of the code when I run with 2 MPI procs:

takahashi@ubuntu:~/Desktop/MA-DDM/C/Rectangle$ mpiexec -n 2 ./test1
Size = 2
rank = 0
rank = 1
[0]PETSC ERROR: - Error Message
--
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: No such subsolver
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.17.1, unknown
[0]PETSC ERROR: ./test1 on a linux-gnu-c-debug named ubuntu by takahashi
Fri Jun 17 06:06:38 2022
[0]PETSC ERROR: Configure options --with-mpi-dir=/usr --with-fc=0
[0]PETSC ERROR: #1 SNESNASMGetSNES() at
/home/takahashi/Desktop/petsc/src/snes/impls/nasm/nasm.c:923

===
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   RANK 0 PID 976566 RUNNING AT ubuntu
=   KILLED BY SIGNAL: 9 (Killed)
===

This error doesn't occur when I run this without MPI. However, I tried to
change the prefix of the subdomain to `sub_0_` but I am not able to change
the snes_type using this prefix. Running ./test1 -snes_view -help | grep
sub_0_snes_type prints nothing.

On Thu, Jun 16, 2022 at 6:23 PM Matthew Knepley  wrote:

> On Thu, Jun 16, 2022 at 5:57 PM tt73  wrote:
>
>>
>> Hi,
>>
>> I am using  NASM as the outer solver for a nonlinear problem. For one of
>> the subdomains, I want to run the local solve with a different set of
>> options from the others. Is there any way to set options for each
>> subdomain?
>>
>
> I can see two ways:
>
>   1) Pull out the subsolver and set it using the API
>
>   2) Pull out the subsolver and give it a different prefix
>
>   Thanks,
>
>  Matt
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


Re: [petsc-users] Convergence issues for SNES NASM

2022-05-12 Thread Takahashi, Tadanaga
I'm still relatively new to PETSc. I was using DMDASetUniformCoordinates
and DMGetBoundingBox together. In hindsight, it was a very
unnecessary thing to do. I think the simplest way to prevent anyone else
from making the same mistake is to add a caveat to the DMGetBoundingBox
documentation page.
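In case it helps anyone else, here is a sketch of the workaround we settled on
(variable names are ours): query the bounding box once, outside the residual
evaluation, reduce the per-process values to global limits by hand, and pass
gmin/gmax through the application context:

   PetscReal lmin[3], lmax[3], gmin[3], gmax[3];
   DMGetBoundingBox(da, lmin, lmax);   // local limits when the DM is a subdomain
   MPI_Allreduce(lmin, gmin, 3, MPIU_REAL, MPIU_MIN, PETSC_COMM_WORLD);
   MPI_Allreduce(lmax, gmax, 3, MPIU_REAL, MPIU_MAX, PETSC_COMM_WORLD);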

On Thu, May 12, 2022 at 2:02 PM Matthew Knepley  wrote:

> On Thu, May 12, 2022 at 1:03 PM Takahashi, Tadanaga  wrote:
>
>> Thank you for the feedback. We figured out what was causing the issue. We
>> were using DMGetBoundingBox
>> <https://petsc.org/main/docs/manualpages/DM/DMGetBoundingBox/> in order
>> to get the limits of the global domain, but gmin and gmax contained limits
>> for the local subdomains when we ran the code with NASM. Hence, our local
>> coordinates xi and yj were completely wrong. The documentation states
>> that DMGetBoundingBox gets the global limits. I believe this is a mistake.
>>
>
> I think I can explain this, and maybe you can tell us how to improve the
> documentation.
>
> I believe we make a new DM that comprises only the subdomain. Then the
> bounding box for this subdomain will only contain itself, not the original
> domain.
> Where should we say this?
>
>   Thanks,
>
>  Matt
>
>
>> This is our new output:
>> $ mpiexec -n 4 ./test1 -t1_N 20 -snes_max_it 50 -snes_monitor -snes_view
>> -da_overlap 3 -snes_type nasm -snes_nasm_type restrict
>>   0 SNES Function norm 7.244681057908e+02
>>   1 SNES Function norm 4.394913250889e+01
>>   2 SNES Function norm 1.823326663029e+01
>>   3 SNES Function norm 7.033938512358e+00
>>   4 SNES Function norm 2.797351504285e+00
>>   5 SNES Function norm 1.13061336e+00
>>   6 SNES Function norm 4.605418417192e-01
>>   7 SNES Function norm 1.882307001920e-01
>>   8 SNES Function norm 7.704148683921e-02
>>   9 SNES Function norm 3.155090858782e-02
>>  10 SNES Function norm 1.292418188473e-02
>>  11 SNES Function norm 5.294645671797e-03
>>  12 SNES Function norm 2.169143207557e-03
>>  13 SNES Function norm 8.886826738192e-04
>>  14 SNES Function norm 3.640894847145e-04
>>  15 SNES Function norm 1.491663153414e-04
>>  16 SNES Function norm 6.111303899450e-05
>>  17 SNES Function norm 2.503785968501e-05
>>  18 SNES Function norm 1.025795062417e-05
>>  19 SNES Function norm 4.202657921479e-06
>> SNES Object: 4 MPI processes
>>   type: nasm
>> total subdomain blocks = 4
>> Local solver information for first block on rank 0:
>> Use -snes_view ::ascii_info_detail to display information for all
>> blocks
>> SNES Object: (sub_) 1 MPI processes
>>   type: newtonls
>>   maximum iterations=50, maximum function evaluations=1
>>   tolerances: relative=1e-08, absolute=1e-50, solution=1e-08
>>   total number of linear solver iterations=2
>>   total number of function evaluations=3
>>   norm schedule ALWAYS
>>   Jacobian is built using a DMDA local Jacobian
>>   SNESLineSearch Object: (sub_) 1 MPI processes
>> type: bt
>>   interpolation: cubic
>>   alpha=1.00e-04
>> maxstep=1.00e+08, minlambda=1.00e-12
>> tolerances: relative=1.00e-08, absolute=1.00e-15,
>> lambda=1.00e-08
>> maximum iterations=40
>>   KSP Object: (sub_) 1 MPI processes
>> type: preonly
>> maximum iterations=1, initial guess is zero
>> tolerances:  relative=1e-05, absolute=1e-50, divergence=1.
>> left preconditioning
>> using NONE norm type for convergence test
>>   PC Object: (sub_) 1 MPI processes
>> type: lu
>>   out-of-place factorization
>>   tolerance for zero pivot 2.22045e-14
>>   matrix ordering: nd
>>   factor fill ratio given 5., needed 2.13732
>> Factored matrix follows:
>>   Mat Object: 1 MPI processes
>> type: seqaij
>> rows=169, cols=169
>> package used to perform factorization: petsc
>> total: nonzeros=13339, allocated nonzeros=13339
>>   using I-node routines: found 104 nodes, limit used is 5
>> linear system matrix = precond matrix:
>> Mat Object: 1 MPI processes
>>   type: seqaij
>>   rows=169, cols=169
>>   total: nonzeros=6241, allocated nonzeros=6241
>>   total number of mallocs used during MatSetValues calls=0
>> not using I-node routines
>>   maximum iterations=50, maxi

Re: [petsc-users] Convergence issues for SNES NASM

2022-05-12 Thread Takahashi, Tadanaga
Thank you for the feedback. We figured out what was causing the issue. We
were using DMGetBoundingBox
<https://petsc.org/main/docs/manualpages/DM/DMGetBoundingBox/> in order to
get the limits of the global domain, but gmin and gmax contained limits for
the local subdomains when we ran the code with NASM. Hence, our local
coordinates xi and yj were completely wrong. The documentation states
that DMGetBoundingBox gets the global limits. I believe this is a mistake.

This is our new output:
$ mpiexec -n 4 ./test1 -t1_N 20 -snes_max_it 50 -snes_monitor -snes_view
-da_overlap 3 -snes_type nasm -snes_nasm_type restrict
  0 SNES Function norm 7.244681057908e+02
  1 SNES Function norm 4.394913250889e+01
  2 SNES Function norm 1.823326663029e+01
  3 SNES Function norm 7.033938512358e+00
  4 SNES Function norm 2.797351504285e+00
  5 SNES Function norm 1.13061336e+00
  6 SNES Function norm 4.605418417192e-01
  7 SNES Function norm 1.882307001920e-01
  8 SNES Function norm 7.704148683921e-02
  9 SNES Function norm 3.155090858782e-02
 10 SNES Function norm 1.292418188473e-02
 11 SNES Function norm 5.294645671797e-03
 12 SNES Function norm 2.169143207557e-03
 13 SNES Function norm 8.886826738192e-04
 14 SNES Function norm 3.640894847145e-04
 15 SNES Function norm 1.491663153414e-04
 16 SNES Function norm 6.111303899450e-05
 17 SNES Function norm 2.503785968501e-05
 18 SNES Function norm 1.025795062417e-05
 19 SNES Function norm 4.202657921479e-06
SNES Object: 4 MPI processes
  type: nasm
total subdomain blocks = 4
Local solver information for first block on rank 0:
Use -snes_view ::ascii_info_detail to display information for all blocks
SNES Object: (sub_) 1 MPI processes
  type: newtonls
  maximum iterations=50, maximum function evaluations=1
  tolerances: relative=1e-08, absolute=1e-50, solution=1e-08
  total number of linear solver iterations=2
  total number of function evaluations=3
  norm schedule ALWAYS
  Jacobian is built using a DMDA local Jacobian
  SNESLineSearch Object: (sub_) 1 MPI processes
type: bt
  interpolation: cubic
  alpha=1.00e-04
maxstep=1.00e+08, minlambda=1.00e-12
tolerances: relative=1.00e-08, absolute=1.00e-15,
lambda=1.00e-08
maximum iterations=40
  KSP Object: (sub_) 1 MPI processes
type: preonly
maximum iterations=1, initial guess is zero
tolerances:  relative=1e-05, absolute=1e-50, divergence=1.
left preconditioning
using NONE norm type for convergence test
  PC Object: (sub_) 1 MPI processes
type: lu
  out-of-place factorization
  tolerance for zero pivot 2.22045e-14
  matrix ordering: nd
  factor fill ratio given 5., needed 2.13732
Factored matrix follows:
  Mat Object: 1 MPI processes
type: seqaij
rows=169, cols=169
package used to perform factorization: petsc
total: nonzeros=13339, allocated nonzeros=13339
  using I-node routines: found 104 nodes, limit used is 5
linear system matrix = precond matrix:
Mat Object: 1 MPI processes
  type: seqaij
  rows=169, cols=169
  total: nonzeros=6241, allocated nonzeros=6241
  total number of mallocs used during MatSetValues calls=0
not using I-node routines
  maximum iterations=50, maximum function evaluations=1
  tolerances: relative=1e-08, absolute=1e-50, solution=1e-08
  total number of function evaluations=20
  norm schedule ALWAYS
  Jacobian is built using a DMDA local Jacobian
problem ex10 on 20 x 20 point 2D grid with d = 3, and eps = 0.082:
  error |u-uexact|_inf = 2.879e-02, |u-uexact|_h = 1.707e-02

On Thu, May 12, 2022 at 9:37 AM Matthew Knepley  wrote:

> Your subdomain solves do not appear to be producing descent whatsoever.
> Possible reasons:
>
>   1) Your subdomain Jacobians are wrong (this is usually the problem)
>
>   2) You have some global coupling field for which local solves give no
> descent. (For this you want nonlinear elimination I think)
>
>   Thanks,
>
>  Matt
>
> On Thu, May 12, 2022 at 9:02 AM Takahashi, Tadanaga  wrote:
>
>> I ran the code with the additional options but the raw output is about
>> 75,000 lines. I cannot paste it directly in the email. The output is in the
>> attached file.
>>
>> On Wed, May 11, 2022 at 11:44 PM Jed Brown  wrote:
>>
>>> Can you add -snes_linesearch_monitor -sub_snes_linesearch_monitor
>>> -ksp_converged_reason and send the output??
>>>
>>> "Takahashi, Tadanaga"  writes:
>>>
>>> > Hello,
>>> >
>>> > We are working on a finite difference solver for a 2D nonlinear PDE
>>> with
>>> > Dirichlet Boundary

[petsc-users] Convergence issues for SNES NASM

2022-05-10 Thread Takahashi, Tadanaga
Hello,

We are working on a finite difference solver for a 2D nonlinear PDE with
Dirichlet Boundary conditions on a rectangular domain. Our goal is to solve
the problem with parallel nonlinear additive Schwarz (NASM) as the outer
solver. Our code is similar to SNES example 5
. In example 5,
the parallel NASM can be executed with a command like `mpiexec -n 4 ./ex5
-mms 3 -snes_type nasm -snes_nasm_type restrict -da_overlap 2` which gives
a convergent result. We assume this is the correct usage. A comment in the
source code for NASM mentions that NASM should be a preconditioner but
there's no documentation on the usage. The Brune paper does not cover
parallel NASM either. We observed that increasing the overlap leads to
fewer Schwarz iterations. The parallelization works seamlessly for an
arbitrary number of subdomains. This is the type of behavior we were
expecting from our code.

Our method uses box-style stencil width d = ceil(N^(1/3)) on an N by N DMDA.
The finite difference stencil consists of 4d+1 points spread out in a
diamond formation. If a stencil point is out of bounds, then it is
projected onto the boundary curve. Since the nodes on the boundary curve
would result in an irregular mesh, we chose not to treat boundary nodes as
unknowns as in Example 5. We use DMDACreate2d to create the DA for the
interior points and DMDASNESSetFunctionLocal to associate the residual
function with the SNES object.
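For concreteness, here is a sketch of that setup (N, FormFunctionLocal, and AppCtx
are our own names, and this is an outline rather than our exact code):

   PetscInt N = 20;                                  // interior grid points per direction
   PetscInt d = (PetscInt)PetscCeilReal(PetscPowReal((PetscReal)N, 1.0/3.0));
   AppCtx   ctx;                                     // problem data for the residual
   DM       da;
   DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                DMDA_STENCIL_BOX, N, N, PETSC_DECIDE, PETSC_DECIDE,
                1 /* dof */, d /* stencil width */, NULL, NULL, &da);
   DMSetFromOptions(da);                             // picks up -da_overlap for NASM
   DMSetUp(da);
   DMDASNESSetFunctionLocal(da, INSERT_VALUES,
                            (DMDASNESFunction)FormFunctionLocal, &ctx);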

Our code works serially. We have also tested our code
with Newton-Krylov-Schwarz (NKS) by running something akin to `mpiexec -n
 ./solve -snes_type newtonls`. We have tested the NKS for several
numbers of subdomains and overlaps, and the code works as expected, so we
have some confidence in its correctness. The overlapping NASM method
was also implemented in MATLAB, so we know it converges. However, the
parallel NASM does not converge with our PETSc code. We don't understand
why NKS works while NASM does not. The residual F-norm
decreases monotonically and then stagnates.

Here is an example of the output when attempting to run NASM in parallel:
takahashi@ubuntu:~/Desktop/MA-DDM/Cpp/Rectangle$ mpiexec -n 4 ./test1 -t1_N
20 -snes_max_it 50 -snes_monitor -snes_view -da_overlap 3 -snes_type nasm
-snes_nasm_type restrict
  0 SNES Function norm 7.244681057908e+02
  1 SNES Function norm 1.237688062971e+02
  2 SNES Function norm 1.068926073552e+02
  3 SNES Function norm 1.027563237834e+02
  4 SNES Function norm 1.022184806736e+02
  5 SNES Function norm 1.020818227640e+02
  6 SNES Function norm 1.020325629121e+02
  7 SNES Function norm 1.020149036595e+02
  8 SNES Function norm 1.020088110545e+02
  9 SNES Function norm 1.020067198030e+02
 10 SNES Function norm 1.020060034469e+02
 11 SNES Function norm 1.020057582380e+02
 12 SNES Function norm 1.020056743241e+02
 13 SNES Function norm 1.020056456101e+02
 14 SNES Function norm 1.020056357849e+02
 15 SNES Function norm 1.020056324231e+02
 16 SNES Function norm 1.020056312727e+02
 17 SNES Function norm 1.020056308791e+02
 18 SNES Function norm 1.020056307444e+02
 19 SNES Function norm 1.020056306983e+02
 20 SNES Function norm 1.020056306826e+02
 21 SNES Function norm 1.020056306772e+02
 22 SNES Function norm 1.020056306753e+02
 23 SNES Function norm 1.020056306747e+02
 24 SNES Function norm 1.020056306745e+02
 25 SNES Function norm 1.020056306744e+02
 26 SNES Function norm 1.020056306744e+02
 27 SNES Function norm 1.020056306744e+02
 28 SNES Function norm 1.020056306744e+02
 29 SNES Function norm 1.020056306744e+02
 30 SNES Function norm 1.020056306744e+02
 31 SNES Function norm 1.020056306744e+02
 32 SNES Function norm 1.020056306744e+02
 33 SNES Function norm 1.020056306744e+02
 34 SNES Function norm 1.020056306744e+02
 35 SNES Function norm 1.020056306744e+02
 36 SNES Function norm 1.020056306744e+02
 37 SNES Function norm 1.020056306744e+02
 38 SNES Function norm 1.020056306744e+02
 39 SNES Function norm 1.020056306744e+02
 40 SNES Function norm 1.020056306744e+02
 41 SNES Function norm 1.020056306744e+02
 42 SNES Function norm 1.020056306744e+02
 43 SNES Function norm 1.020056306744e+02
 44 SNES Function norm 1.020056306744e+02
 45 SNES Function norm 1.020056306744e+02
 46 SNES Function norm 1.020056306744e+02
 47 SNES Function norm 1.020056306744e+02
 48 SNES Function norm 1.020056306744e+02
 49 SNES Function norm 1.020056306744e+02
 50 SNES Function norm 1.020056306744e+02
SNES Object: 4 MPI processes
  type: nasm
total subdomain blocks = 4
Local solver information for first block on rank 0:
Use -snes_view ::ascii_info_detail to display information for all blocks
SNES Object: (sub_) 1 MPI processes
  type: newtonls
  maximum iterations=50, maximum function evaluations=1
  tolerances: relative=1e-08, absolute=1e-50, solution=1e-08
  total number of linear solver iterations=22
  total number of function evaluations=40
  norm schedule ALWAYS
  Jacobian is built using a DMDA local Jacobian