Re: [petsc-users] Help using FAS as an initial guess

2023-05-02 Thread tt73
Thanks, Barry. I'll look into it.

Re: [petsc-users] Help using FAS as an initial guess

2023-05-02 Thread Barry Smith
You might consider https://petsc.org/release/manualpages/SNES/SNESSetGridSequence/; it does exactly what I think you want to do. FAS is a bit more subtle than that: the "coarse grid problem" that FAS builds and solves is dependent on the current fine grid solution, so you need an
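
In code, grid sequencing is a one-line addition. A minimal sketch, assuming a SNES already attached to a coarse DMDA (everything apart from SNESSetGridSequence and the -snes_grid_sequence option is illustrative):

  SNES snes;
  DM   da;                        /* coarse DMDA; SNES refines it */
  SNESCreate(PETSC_COMM_WORLD, &snes);
  SNESSetDM(snes, da);
  SNESSetGridSequence(snes, 2);   /* two rounds of solve-then-refine before the final solve */
  SNESSetFromOptions(snes);       /* equivalently: -snes_grid_sequence 2 */
  SNESSolve(snes, NULL, NULL);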

[petsc-users] Help using FAS as an initial guess

2023-05-02 Thread Takahashi, Tadanaga
Hi, I want to know how to configure the FAS so that it solves a problem on a coarse grid of size 4h, interpolates the solution, and then stops. Here is the context: I am using Newton LS to solve a problem on a square domain discretized with a DMDA with step size h. I have a subroutine to
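
A hand-rolled version of this (coarsen from h to 4h, solve there, interpolate back up as the initial guess) might look like the sketch below; the variable names and the coarse-grid solve are illustrative, not from the thread. da_h is the fine DMDA and u_h the fine-grid vector, both assumed created earlier:

  DM  da_2h, da_4h;
  Vec u_2h, u_4h;
  Mat P;                                     /* coarse-to-fine interpolation */
  DMCoarsen(da_h,  MPI_COMM_NULL, &da_2h);   /* h  -> 2h */
  DMCoarsen(da_2h, MPI_COMM_NULL, &da_4h);   /* 2h -> 4h */
  /* ... set up and solve the problem on da_4h with its own SNES, giving u_4h ... */
  DMCreateInterpolation(da_4h, da_2h, &P, NULL);
  DMCreateGlobalVector(da_2h, &u_2h);
  MatMult(P, u_4h, u_2h);                    /* 4h -> 2h */
  MatDestroy(&P);
  DMCreateInterpolation(da_2h, da_h, &P, NULL);
  MatMult(P, u_2h, u_h);                     /* 2h -> h: the Newton LS initial guess */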

Re: [petsc-users] Scalable Solver for Incompressible Flow

2023-05-02 Thread Matthew Knepley
On Tue, May 2, 2023 at 2:29 PM Jed Brown wrote: > Sebastian Blauth writes: > > > I agree with your comment for the Stokes equations - for these, I have > > already tried and used the pressure mass matrix as part of a (additive) > > block preconditioner and it gave mesh independent results. > >

Re: [petsc-users] Scalable Solver for Incompressible Flow

2023-05-02 Thread Jed Brown
Sebastian Blauth writes: > I agree with your comment for the Stokes equations - for these, I have > already tried and used the pressure mass matrix as part of a (additive) > block preconditioner and it gave mesh-independent results. > > However, for the Navier-Stokes equations, is the Schur
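
Supplying the pressure mass matrix as the Schur-complement preconditioner goes through PCFieldSplitSetSchurPre. A minimal sketch, assuming the KSP is already created and Qp is the assembled pressure mass matrix (both assumptions, not from the thread):

  KSP ksp;
  PC  pc;
  Mat Qp;                                  /* pressure mass matrix, assembled elsewhere */
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCFIELDSPLIT);
  PCFieldSplitSetType(pc, PC_COMPOSITE_SCHUR);
  PCFieldSplitSetSchurPre(pc, PC_FIELDSPLIT_SCHUR_PRE_USER, Qp);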

Re: [petsc-users] Scalable Solver for Incompressible Flow

2023-05-02 Thread Sebastian Blauth
On 02.05.2023 15:12, Matthew Knepley wrote: On Tue, May 2, 2023 at 9:07 AM Blauth, Sebastian wrote: Hello, I am having a problem using/configuring PETSc to obtain a scalable solver for the incompressible Navier

Re: [petsc-users] DMSWARM with DMDA and KSP

2023-05-02 Thread Matthew Young
Yup -- I realized that I had '-pc_type mg' in the script I was using to build and run as I developed. I guess that was causing the KSP to coarsen its DM, which made me think I had to force the density DM to be consistent. Still, refactoring my original grid DM into one for Vlasov components and
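 
The interaction described here comes from attaching a DM to the KSP: with -pc_type mg, PCMG calls DMCoarsen() on that DM to build its level hierarchy. A sketch of keeping the DM for geometry while assembling the operator yourself (illustrative, not the actual code from this thread):

  KSP ksp;
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetDM(ksp, da);                  /* -pc_type mg will coarsen this DM */
  KSPSetDMActive(ksp, PETSC_FALSE);   /* do not use the DM to define the system */
  KSPSetOperators(ksp, A, A);         /* user-assembled operator */
  KSPSetFromOptions(ksp);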

Re: [petsc-users] Node numbering in parallel partitioned mesh

2023-05-02 Thread Matthew Knepley
On Tue, May 2, 2023 at 11:03 AM Karthikeyan Chockalingam - STFC UKRI < karthikeyan.chockalin...@stfc.ac.uk> wrote: > Thank you Matt. > > > > I will look to find out those shared nodes. Sorry, I didn’t get it when > you say “Roots are owned, and leaves are not owned” > That is the nomenclature
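
In PetscSF terms, roots live on the owning rank and leaves are the local copies of points owned elsewhere. A sketch of querying this, assuming a DM that carries a point SF (e.g. DMPlex); the names are illustrative:

  PetscSF            sf;
  PetscInt           nroots, nleaves;
  const PetscInt    *ilocal;
  const PetscSFNode *iremote;
  DMGetPointSF(dm, &sf);
  PetscSFGetGraph(sf, &nroots, &nleaves, &ilocal, &iremote);
  /* the nleaves points (local indices in ilocal) are not owned here;
     iremote[i] gives the owning rank and the index on that rank */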

Re: [petsc-users] Node numbering in parallel partitioned mesh

2023-05-02 Thread Karthikeyan Chockalingam - STFC UKRI via petsc-users
Thank you Matt. I will look to find out those shared nodes. Sorry, I didn’t get it when you said “Roots are owned, and leaves are not owned”. My question was specifically related to numbering – how do I start numbering in a partition from where I left off in the previous partition without
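
The standard trick for "starting where the previous partition left off" is an exclusive prefix sum over the counts of owned nodes. A minimal sketch, assuming nowned counts only the nodes this rank owns (shared nodes owned by another rank excluded):

  PetscInt    nowned, offset = 0;
  PetscMPIInt rank;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Exscan(&nowned, &offset, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD);
  if (rank == 0) offset = 0;   /* MPI_Exscan leaves rank 0's result undefined */
  /* owned node i gets global number offset + i;
     shared nodes then receive their numbers from the owning rank */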

Re: [petsc-users] Node numbering in parallel partitioned mesh

2023-05-02 Thread Barry Smith
Assuming you have generated your renumbering, you can use https://petsc.org/release/manualpages/AO/AO/#ao to convert lists in the old (or new) numbering to the new (or old) numbering. Barry > On May 2, 2023, at 8:34 AM, Matthew Knepley wrote: > > On Tue, May 2, 2023 at 8:25 AM
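
A minimal sketch of the AO route, with illustrative array contents: app[] holds the old numbers of this rank's n nodes and petsc[] the corresponding new ones:

  AO ao;
  AOCreateBasic(PETSC_COMM_WORLD, n, app, petsc, &ao);
  AOApplicationToPetsc(ao, m, list);   /* converts the m old numbers in list[] to new ones, in place */
  AODestroy(&ao);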

Re: [petsc-users] 'mpirun' run not found error

2023-05-02 Thread Barry Smith
For any PETSc install you can run make getmpiexec in the PETSC_DIR directory to see how to use mpiexec for that PETSc install. Barry > On May 2, 2023, at 2:56 AM, 권승리 / Student / Aerospace Engineering wrote: > > Dear developers > > I'm trying to use MPI, but I'm encountering error messages like

Re: [petsc-users] Scalable Solver for Incompressible Flow

2023-05-02 Thread Matthew Knepley
On Tue, May 2, 2023 at 9:07 AM Blauth, Sebastian <sebastian.bla...@itwm.fraunhofer.de> wrote: > Hello, > > > > I am having a problem using/configuring PETSc to obtain a scalable > solver for the incompressible Navier-Stokes equations. I am discretizing > the equations using FEM (with the

[petsc-users] Scalable Solver for Incompressible Flow

2023-05-02 Thread Blauth, Sebastian
Hello, I am having a problem using/configuring PETSc to obtain a scalable solver for the incompressible Navier-Stokes equations. I am discretizing the equations using FEM (with the library FEniCS) and I am using the stable P2-P1 Taylor-Hood elements. I have read and tried a lot regarding
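
As a point of reference for the discussion in this thread (a generic starting configuration, not a recommendation made by any of the participants), a Schur-complement fieldsplit for a Taylor-Hood discretization is usually driven entirely by options; set programmatically here, though the same flags can be given on the command line:

  PetscOptionsSetValue(NULL, "-ksp_type", "fgmres");
  PetscOptionsSetValue(NULL, "-pc_type", "fieldsplit");
  PetscOptionsSetValue(NULL, "-pc_fieldsplit_type", "schur");
  PetscOptionsSetValue(NULL, "-pc_fieldsplit_schur_fact_type", "full");
  PetscOptionsSetValue(NULL, "-fieldsplit_0_pc_type", "gamg");     /* velocity block */
  PetscOptionsSetValue(NULL, "-fieldsplit_1_ksp_type", "gmres");   /* Schur complement */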

Re: [petsc-users] Node numbering in parallel partitioned mesh

2023-05-02 Thread Matthew Knepley
On Tue, May 2, 2023 at 8:25 AM Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote: > Hello, > > > > This is not exactly a PETSc question. I have a parallel partitioned finite > element mesh. What are the steps involved in having a contiguous but unique > set of node numbering from one

[petsc-users] Node numbering in parallel partitioned mesh

2023-05-02 Thread Karthikeyan Chockalingam - STFC UKRI via petsc-users
Hello, This is not exactly a PETSc question. I have a parallel partitioned finite element mesh. What are the steps involved in obtaining a contiguous but unique node numbering from one partition to the next? There are nodes which are shared between different partitions. Moreover, this

Re: [petsc-users] 'mpirun' run not found error

2023-05-02 Thread Pierre Jolivet
> On 2 May 2023, at 8:56 AM, 권승리 / Student / Aerospace Engineering wrote: > > Dear developers > > I'm trying to use MPI, but I'm encountering error messages like below: > > > Command 'mpirun' not found, but can be installed with: > sudo apt install lam-runtime # version 7.1.4-6build2, or >

[petsc-users] 'mpirun' run not found error

2023-05-02 Thread 권승리 / Student / Aerospace Engineering
Dear developers I'm trying to use MPI, but I'm encountering error messages like the one below: Command 'mpirun' not found, but can be installed with: sudo apt install lam-runtime # version 7.1.4-6build2, or sudo apt install mpich # version 3.3.2-2build1 sudo apt install