Re: [petsc-users] parallel computing error

2023-05-05 Thread Pierre Jolivet
> On 5 May 2023, at 2:00 PM, 권승리 / Student / Dept. of Aerospace Engineering wrote: > > Dear Pierre Jolivet > > Thank you for your explanation. > > I will try converting the matrix. > > I know it's really inefficient, but I need the inverse matrix (inv(A)) itself > for my research. > > If parallel computing is

Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-05-05 Thread Barry Smith
To expand on what Matt said slightly. When you have a preconditioner based on (possibly nested) sub solves one generally "tunes" the solves to minimize time to solution. We recommend doing this by first using very accurate subsolves (when possible using direct solves inside); this tells us
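Barry's tuning strategy above can be sketched with PETSc command-line options. This is a hypothetical fieldsplit setup: `./myapp` stands in for the user's executable, and the actual split layout and inner preconditioners depend on the problem; the option names themselves are standard PETSc.

```shell
# Step 1: use exact (direct) inner solves to verify the outer iteration
# converges well when the subsolves are essentially perfect:
./myapp -ksp_type fgmres -pc_type fieldsplit \
        -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu \
        -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu \
        -ksp_monitor_true_residual

# Step 2: progressively relax the subsolves and watch outer iterations
# vs. total time to find the sweet spot:
./myapp -ksp_type fgmres -pc_type fieldsplit \
        -fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_rtol 1e-2 \
        -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_rtol 1e-2 \
        -ksp_monitor_true_residual
```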

Re: [petsc-users] sources of floating point randomness in JFNK in serial

2023-05-05 Thread Barry Smith
Mark, Thank you. You do have aggressive optimizations: -O3 -march=native, which means out-of-order instructions may be performed; thus, two runs may have a different order of operations and possibly different round-off values. You could try turning all of this off with -O0 for an

Re: [petsc-users] Understanding index sets for PCGASM

2023-05-05 Thread LEONARDO MUTTI
Thanks a lot. If this can help, we should need (not much more than) the functionality from https://petsc.org/release/src/ksp/ksp/tutorials/ex62.c.html: - PCGASMSetSubdomains - PCGASMDestroySubdomains - PCGASMGetSubKSP Best, Leonardo On Fri, May 5, 2023 at 5:00 PM Barry Smith

Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-05-05 Thread Edoardo alinovi
Hello Barry, Welcome to the party! Thank you guys for your precious suggestions, they are really helpful! I've been messing around for a while now and have tested many combinations. Schur + selfp is the best preconditioner; it converges within 5 iterations using GMRES for the inner solvers, but it is

Re: [petsc-users] parallel computing error

2023-05-05 Thread Pierre Jolivet
> On 5 May 2023, at 1:25 PM, 권승리 / Student / Dept. of Aerospace Engineering wrote: > > Dear Matthew Knepley > > However, I've already installed ScaLAPACK. > cd $PETSC_DIR > ./configure --download-mpich --with-debugging=0 COPTFLAGS='-O3 -march=native > -mtune=native' CXXOPTFLAGS='-O3 -march=native -mtune=native'

[petsc-users] Issues creating DMPlex from higher order mesh generated by gmsh

2023-05-05 Thread Vilmer Dahlberg via petsc-users
Hi. I'm trying to read a mesh of higher element order, in this example a mesh consisting of 10-node tetrahedral elements, from gmsh into PETSc. But it looks like the mesh is not being properly loaded and converted into a DMPlex. gmsh tells me it has generated a mesh with 7087 nodes, but when

Re: [petsc-users] Understanding index sets for PCGASM

2023-05-05 Thread Barry Smith
I will add the two interfaces you requested today. (Likely you may need more also). Barry > On May 4, 2023, at 6:01 PM, Matthew Knepley wrote: > > On Thu, May 4, 2023 at 1:43 PM LEONARDO MUTTI > > wrote: >> Of course, I'll try to explain.

Re: [petsc-users] parallel computing error

2023-05-05 Thread 권승리 / Student / Dept. of Aerospace Engineering
Dear Pierre Jolivet Thank you for your explanation. I will try converting the matrix. I know it's really inefficient, but I need the inverse matrix (inv(A)) itself for my research. If it is difficult to get inv(A) with parallel computing, can I run the part related to MatMatSolve with a single
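The idea behind using MatMatSolve instead of forming inv(A) explicitly can be shown with a small NumPy sketch (NumPy here is only a stand-in for illustration; in PETSc this corresponds to factoring A once and solving against a dense identity right-hand side):

```python
import numpy as np

# Small well-conditioned test matrix standing in for the user's A.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonally dominant

# Instead of computing inv(A) directly, solve A X = I. One factorization
# plus n triangular solves leaves X holding the inverse; this is the
# pattern MatLUFactor + MatMatSolve implements in PETSc.
I = np.eye(n)
X = np.linalg.solve(A, I)

# Check that X is the inverse without ever calling np.linalg.inv.
assert np.allclose(A @ X, I)
assert np.allclose(X @ A, I)
print("max residual:", np.abs(A @ X - I).max())
```

Whether this is practical in parallel depends on having a parallel dense factorization available, which is the point of the rest of this thread.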

Re: [petsc-users] sources of floating point randomness in JFNK in serial

2023-05-05 Thread Mark Lohry
wow. leaving -O3 on and turning off -march=native seems to have made it repeatable. this is on an avx2 cpu if it matters. > out-of-order instructions may be performed; thus, two runs may have > a different order of operations this is terrifying if true. the source code path is exactly the same every

Re: [petsc-users] sources of floating point randomness in JFNK in serial

2023-05-05 Thread Mark Lohry
are there any safe subsets of -march=whatever? I had it on to take advantage of SIMD ops on avx512 chips but never looked so closely at the exact results. On Fri, May 5, 2023 at 4:58 PM Barry Smith wrote: > > > On May 5, 2023, at 4:45 PM, Mark Lohry wrote: > > wow. leaving -O3 and turning off

Re: [petsc-users] Understanding index sets for PCGASM

2023-05-05 Thread Barry Smith
Added in barry/2023-05-04/add-pcgasm-set-subdomains see also https://gitlab.com/petsc/petsc/-/merge_requests/6419 Barry > On May 4, 2023, at 11:23 AM, LEONARDO MUTTI > wrote: > > Thank you for the help. > Adding to my example: > call PCGASMSetSubdomains(pc,NSub, subdomains_IS,

Re: [petsc-users] sources of floating point randomness in JFNK in serial

2023-05-05 Thread Barry Smith
> On May 5, 2023, at 4:45 PM, Mark Lohry wrote: > > wow. leaving -O3 and turning off -march=native seems to have made it > repeatable. this is on an avx2 cpu if it matters. > >> out-of-order instructions may be performed thus, two runs may have different >> order of operations >> > >
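The round-off effect discussed in this thread is ordinary floating-point non-associativity, which a minimal pure-Python example (no PETSc involved) makes concrete:

```python
# Floating-point addition is not associative: reassociating the same three
# numbers changes the last bits of the result. Code generated with aggressive
# flags can legally change the evaluation order of reductions (e.g. via
# vectorization with -ffast-math, or FMA contraction under -march=native),
# so two runs or two builds may differ in the last digits.
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c    # 0.6000000000000001
right = a + (b + c)   # 0.6

assert left != right
print(left, right)
```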

Re: [petsc-users] parallel computing error

2023-05-05 Thread 권승리 / Student / Dept. of Aerospace Engineering
Dear Matthew Knepley However, I've already installed ScaLAPACK. cd $PETSC_DIR ./configure --download-mpich --with-debugging=0 COPTFLAGS='-O3 -march=native -mtune=native' CXXOPTFLAGS='-O3 -march=native -mtune=native' FOPTFLAGS='-O3 -march=native -mtune=native' --download-mumps --

Re: [petsc-users] Understanding index sets for PCGASM

2023-05-05 Thread LEONARDO MUTTI
Interesting; a priori I'm not sure this will work better, mainly because I'd lose the compact band structure. As for waveform relaxation: I excluded it at first since it appears to require more CPUs than I have in order to beat sequential solvers, plus it is more complicated and I have very

Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-05-05 Thread Edoardo alinovi
Hi Matt, I have some more questions on the fieldsplit saga :) I am running a 1M cell ahmed body case using the following options: "solver": "fgmres", "preconditioner": "fieldsplit", "absTol": 1e-6, "relTol": 0.0, "options":{ "pc_fieldsplit_type": "multiplicative",

Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-05-05 Thread Matthew Knepley
On Fri, May 5, 2023 at 5:13 AM Edoardo alinovi wrote: > Hi Matt, > > I have some more questions on the fieldsplit saga :) > > I am running a 1M cell ahmed body case using the following options: > > "solver": "fgmres", > "preconditioner": "fieldsplit", > "absTol": 1e-6, >
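The Schur + selfp combination discussed in this thread maps onto standard PETSc options roughly as follows. This is a hedged sketch: `./myapp` is a placeholder, the split names (0/1) depend on how the fields were registered, and the inner preconditioner choices are illustrative only.

```shell
# Schur-complement fieldsplit with the selfp approximation, as discussed above:
./myapp -ksp_type fgmres -pc_type fieldsplit \
        -pc_fieldsplit_type schur \
        -pc_fieldsplit_schur_fact_type full \
        -pc_fieldsplit_schur_precondition selfp \
        -fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_rtol 1e-2 \
        -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_rtol 1e-2 \
        -ksp_monitor
```

Running with `-ksp_view` afterwards shows the nested solver configuration PETSc actually assembled, which is the easiest way to confirm the options took effect.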

Re: [petsc-users] parallel computing error

2023-05-05 Thread Matthew Knepley
On Fri, May 5, 2023 at 3:49 AM 권승리 / Student / Dept. of Aerospace Engineering wrote: > Dear Barry Smith > > Thanks to you, I now know the difference between MATAIJ and MATDENSE. > > However, I still have some problems. > > There is no problem when I run with a single core. But a MatGetFactor error > occurs when using

Re: [petsc-users] Understanding index sets for PCGASM

2023-05-05 Thread Matthew Knepley
On Fri, May 5, 2023 at 2:45 AM LEONARDO MUTTI <leonardo.mutt...@universitadipavia.it> wrote: > Interesting; a priori I'm not sure this will work better, mainly because > I'd lose the compact band structure. > > As for waveform relaxation: I excluded it at first since it appears to > require

Re: [petsc-users] parallel computing error

2023-05-05 Thread 권승리 / Student / Dept. of Aerospace Engineering
Dear Barry Smith Thanks to you, I now know the difference between MATAIJ and MATDENSE. However, I still have some problems. There is no problem when I run with a single core, but a MatGetFactor error occurs when using multiple cores. Could you give me some advice? The error message is [0]PETSC ERROR:
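A MatGetFactor failure that appears only in parallel usually means the requested factorization has no parallel implementation for that matrix type: PETSc's built-in dense LU is sequential, so a parallel dense factorization needs an external package. A hedged sketch of one way to set this up (the configure flags and solver names are real PETSc options; the application name is a placeholder):

```shell
# Reconfigure PETSc with a parallel dense direct solver, e.g. ScaLAPACK
# and/or Elemental:
cd $PETSC_DIR
./configure --download-mpich --download-scalapack --download-elemental

# Then request the parallel dense type and solver at run time
# (equivalently, pass MATSOLVERSCALAPACK to MatGetFactor in code):
mpiexec -n 4 ./myapp -mat_type scalapack -pc_factor_mat_solver_type scalapack
```

Running with `-help | grep mat_solver` (or checking the MatGetFactor error message itself) lists which factorization packages the current build actually supports.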