Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Karthikeyan Chockalingam - STFC UKRI
I tried adding the -mat_block_size 3 option but I still get the same error message. Thanks, Karthik. From: Mark Adams Date: Monday, 13 December 2021 at 19:54 To: "Chockalingam, Karthikeyan (STFC,DL,HC)" Cc: Matthew Knepley , "petsc-users@mcs.anl.gov" Subject: Re: [petsc-users] Unstructured mesh

Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Mark Adams
Try adding -mat_block_size 3. On Mon, Dec 13, 2021 at 11:57 AM Karthikeyan Chockalingam - STFC UKRI < karthikeyan.chockalin...@stfc.ac.uk> wrote: > I tried to run the problem using -pc_type hypre but it errored out: > > > > ./ex56 -cells 4,4,2 -max_conv_its 1 -lx 1. -alpha .01
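
For context, -mat_block_size sets the block size of the assembled matrix from the options database (the in-code equivalent would be MatSetBlockSize(A, 3) for a 3-DOF-per-node problem like ex56). A minimal sketch of the suggested run, with the remaining arguments carried over from the truncated command above and therefore only illustrative:

    ./ex56 -cells 4,4,2 -max_conv_its 1 -lx 1. -alpha .01 \
           -ksp_type cg -pc_type hypre -pc_hypre_type boomeramg \
           -mat_block_size 3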

Re: [petsc-users] Tips on integrating MPI ksp petsc into my application?

2021-12-13 Thread Barry Smith
Sorry, I didn't notice these emails for a long time. PETSc does provide a "simple" mechanism to redistribute your matrix that does not require you to explicitly do the redistribution. You must create an MPIAIJ matrix over all the MPI ranks, but simply provide all the rows on the first
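
A minimal C sketch of the mechanism Barry describes, assuming a square system of global size N assembled entirely on the first rank (the size and names are illustrative, and error checking is omitted):

    #include <petscmat.h>

    Mat         A;
    PetscMPIInt rank;
    PetscInt    N = 1000;                  /* hypothetical global size */
    PetscInt    m;

    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
    m = (rank == 0) ? N : 0;               /* all rows live on the first rank */
    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetSizes(A, m, m, N, N);
    MatSetType(A, MATMPIAIJ);
    MatSetUp(A);
    /* only rank 0 calls MatSetValues(); the other ranks insert nothing */
    MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

PETSc can then move the rows to a balanced layout internally, presumably via something like -pc_type redistribute, without the application doing any explicit data movement.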

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Tang, Qi
“Overallocating” is exactly what we can live with at the moment, as long as it makes it easier to work with coloring on DMStag. So it sounds like if we can provide a preallocated matrix with a proper stencil through DMCreateMatrix, then it should work with DMStag and coloring already. Most APIs are
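
A hedged sketch of that workflow, assuming dm is a DMStag whose DMCreateMatrix() returns a preallocated matrix with the right stencil, and that snes, FormFunction, and user already exist (illustrative only, error checking omitted):

    Mat           J;
    MatColoring   mc;
    ISColoring    iscoloring;
    MatFDColoring fdcoloring;

    DMCreateMatrix(dm, &J);                    /* preallocated, correct stencil */
    MatColoringCreate(J, &mc);                 /* color from the nonzero pattern */
    MatColoringSetType(mc, MATCOLORINGGREEDY);
    MatColoringApply(mc, &iscoloring);
    MatColoringDestroy(&mc);
    MatFDColoringCreate(J, iscoloring, &fdcoloring);
    MatFDColoringSetFunction(fdcoloring, (PetscErrorCode (*)(void))FormFunction, &user);
    MatFDColoringSetUp(J, iscoloring, fdcoloring);
    ISColoringDestroy(&iscoloring);
    SNESSetJacobian(snes, J, J, SNESComputeJacobianDefaultColor, fdcoloring);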

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Dave May
On Mon, 13 Dec 2021 at 20:13, Matthew Knepley wrote: > On Mon, Dec 13, 2021 at 1:52 PM Dave May wrote: > >> On Mon, 13 Dec 2021 at 19:29, Matthew Knepley wrote: >> >>> On Mon, Dec 13, 2021 at 1:16 PM Dave May >>> wrote: >>> On Sat 11. Dec 2021 at 22:28, Matthew Knepley

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Matthew Knepley
On Mon, Dec 13, 2021 at 1:52 PM Dave May wrote: > On Mon, 13 Dec 2021 at 19:29, Matthew Knepley wrote: > >> On Mon, Dec 13, 2021 at 1:16 PM Dave May wrote: >> >>> >>> >>> On Sat 11. Dec 2021 at 22:28, Matthew Knepley wrote: >>> On Sat, Dec 11, 2021 at 1:58 PM Tang, Qi wrote: >

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Dave May
On Mon, 13 Dec 2021 at 19:55, Tang, Qi wrote: > Matt and Dave, > > Thanks, this is consistent with what we found. If Patrick or someone can > add some basic coloring option with DMStag, that would be very useful for > our project. > > Colouring only requires the non-zero structure of the matrix.

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Tang, Qi
Matt and Dave, Thanks, this is consistent with what we found. If Patrick or someone can add some basic coloring option with DMStag, that would be very useful for our project. Qi On Dec 13, 2021, at 11:52 AM, Dave May <dave.mayhe...@gmail.com> wrote: On Mon, 13 Dec 2021 at 19:29,

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Dave May
On Mon, 13 Dec 2021 at 19:29, Matthew Knepley wrote: > On Mon, Dec 13, 2021 at 1:16 PM Dave May wrote: > >> >> >> On Sat 11. Dec 2021 at 22:28, Matthew Knepley wrote: >> >>> On Sat, Dec 11, 2021 at 1:58 PM Tang, Qi wrote: >>> Hi, Does anyone have comment on finite difference

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Matthew Knepley
On Mon, Dec 13, 2021 at 1:16 PM Dave May wrote: > > > On Sat 11. Dec 2021 at 22:28, Matthew Knepley wrote: > >> On Sat, Dec 11, 2021 at 1:58 PM Tang, Qi wrote: >> >>> Hi, >>> Does anyone have comment on finite difference coloring with DMStag? We >>> are using DMStag and TS to evolve some

Re: [petsc-users] Finite difference approximation of Jacobian

2021-12-13 Thread Dave May
On Sat 11. Dec 2021 at 22:28, Matthew Knepley wrote: > On Sat, Dec 11, 2021 at 1:58 PM Tang, Qi wrote: > >> Hi, >> Does anyone have comment on finite difference coloring with DMStag? We >> are using DMStag and TS to evolve some nonlinear equations implicitly. It >> would be helpful to have the

Re: [petsc-users] [EXTERNAL] Re: Finite difference approximation of Jacobian

2021-12-13 Thread Jorti, Zakariae via petsc-users
Hi Matt, Thanks for the reply. I tested the following flags. - With -snes_fd_color and -snes_fd_color_use_mat I got a segmentation violation. I was able to get the call stack as I am using a debugger: MatCreateSubmatrix_MPIAIJ_All MatGetSeqNonzeroStructure_MPIAIJ
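
For reference, a hedged sketch of the invocation being tested (the executable name is a placeholder): -snes_fd_color requests a finite-difference Jacobian built with coloring, and -snes_fd_color_use_mat asks PETSc to derive the coloring from the nonzero structure of the supplied matrix rather than from the DM:

    ./myapp -snes_fd_color -snes_fd_color_use_mat -snes_monitor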

Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Karthikeyan Chockalingam - STFC UKRI
I tried to run the problem using -pc_type hypre but it errored out: ./ex56 -cells 4,4,2 -max_conv_its 1 -lx 1. -alpha .01 -petscspace_degree 1 -ksp_type cg -ksp_monitor -ksp_rtol 1.e-8 -pc_type hypre -pc_hypre_type boomeramg -snes_monitor -use_mat_nearnullspace true -snes_rtol 1.e-10

Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Karthikeyan Chockalingam - STFC UKRI
Sorry, it looks like I have not only misunderstood your question but also your recommendation to run using -ex56_dm_mat_type. I didn’t realize one needs to add the ex56_ prefix. Kind regards, Karthik. From: Matthew Knepley Date: Monday, 13 December 2021 at 14:44 To: "Chockalingam, Karthikeyan

Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Matthew Knepley
On Mon, Dec 13, 2021 at 9:40 AM Karthikeyan Chockalingam - STFC UKRI < karthikeyan.chockalin...@stfc.ac.uk> wrote: > @Mark Adams Yes, it worked with -ex56_dm_mat_type mpiaijcusparse else it crashes with the error message > > [0]PETSC ERROR: Unknown Mat type given: cusparse > >

Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Karthikeyan Chockalingam - STFC UKRI
@Mark Adams Yes, it worked with -ex56_dm_mat_type mpiaijcusparse; otherwise it crashes with the error message: [0]PETSC ERROR: Unknown Mat type given: cusparse. @Matthew Knepley Usually PETSc -log_view reports the GPU flops. Alternatively
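
A hedged sketch of the working GPU run described above; the -ex56_dm_vec_type setting is an assumption, not something confirmed in the thread:

    ./ex56 -cells 4,4,2 -max_conv_its 1 \
           -ex56_dm_mat_type mpiaijcusparse -ex56_dm_vec_type cuda -log_view

Flops executed on the GPU then show up in the -log_view performance summary.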

Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Mark Adams
> It is not. The number of processes is specified independently using > 'mpiexec -n ' or when using the test system NP=. >> (i) Say I start with -cells 1,1,1 -max_conv_its 7; that would >> eventually leave all refinement on level 7 running on 1 MPI process? >> > I don't understand > (ii)
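
As a concrete illustration of Mark's point (the rank count here is arbitrary), the launcher fixes the number of processes independently of the mesh options:

    mpiexec -n 4 ./ex56 -cells 2,2,1 -max_conv_its 2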

Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Mark Adams
On Mon, Dec 13, 2021 at 8:35 AM Karthikeyan Chockalingam - STFC UKRI < karthikeyan.chockalin...@stfc.ac.uk> wrote: > Thanks Matt. Couple of weeks back you mentioned > > “There are many unstructured grid examples, e.g. SNES ex13, ex17, ex56. > The solver can run on the GPU, but the vector/matrix

Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Matthew Knepley
On Mon, Dec 13, 2021 at 8:35 AM Karthikeyan Chockalingam - STFC UKRI < karthikeyan.chockalin...@stfc.ac.uk> wrote: > Thanks Matt. Couple of weeks back you mentioned > > “There are many unstructured grid examples, e.g. SNES ex13, ex17, ex56. > The solver can run on the GPU, but the vector/matrix

Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Karthikeyan Chockalingam - STFC UKRI
Thanks Matt. A couple of weeks back you mentioned “There are many unstructured grid examples, e.g. SNES ex13, ex17, ex56. The solver can run on the GPU, but the vector/matrix FEM assembly does not. I am working on that now.” I am able to run other examples in ksp/tutorials on GPUs. I compiled

Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Matthew Knepley
On Mon, Dec 13, 2021 at 7:15 AM Karthikeyan Chockalingam - STFC UKRI < karthikeyan.chockalin...@stfc.ac.uk> wrote: > Thank you. I was able to confirm that both of the options below produced the same mesh > > ./ex56 -cells 2,2,1 -max_conv_its 2 > > ./ex56 -cells 4,4,2 -max_conv_its 1 > Good > But

Re: [petsc-users] Unstructured mesh

2021-12-13 Thread Karthikeyan Chockalingam - STFC UKRI
Thank you. I was able to confirm that both of the options below produced the same mesh: ./ex56 -cells 2,2,1 -max_conv_its 2 ./ex56 -cells 4,4,2 -max_conv_its 1 But I didn’t get how -cells i,j,k <1,1,1> is related to the number of MPI processes. (i) Say I start with -cells 1,1,1 -max_conv
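
To make the confirmed equivalence concrete, a sketch of the arithmetic, assuming uniform refinement that doubles the cell count in each direction per convergence iteration:

    -cells 2,2,1 -max_conv_its 2 : 2,2,1 -> refine -> 4,4,2
    -cells 4,4,2 -max_conv_its 1 : 4,4,2 (starts where the first run ends)

In general, -cells i,j,k with n iterations should match -cells 2i,2j,2k with n-1 iterations.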