Re: [petsc-users] Gmsh 8-noded quadrilateral

2022-02-11 Thread Matthew Knepley
Jed is right about the numerics. However, this does not look hard. Here is my try at it:

https://gitlab.com/petsc/petsc/-/merge_requests/4838

Please tell me if this works and I will make a test and merge.

Thanks,

Matt

On Thu, Feb 10, 2022 at 6:47 PM Jed Brown wrote:
> Susanne, do …
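A minimal sketch of how one might load the mesh with DMPlex to try that branch; the file name quad8.msh is a placeholder, and PetscCall() assumes a recent PETSc (older releases use the ierr/CHKERRQ idiom instead):

  /* Read a Gmsh mesh into DMPlex and print a summary of it.
     Run with: ./loadmesh -dm_plex_filename quad8.msh -dm_view */
  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM dm;

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
    PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
    PetscCall(DMSetType(dm, DMPLEX));
    PetscCall(DMSetFromOptions(dm));                    /* picks up -dm_plex_filename */
    PetscCall(DMViewFromOptions(dm, NULL, "-dm_view")); /* print the mesh summary    */
    PetscCall(DMDestroy(&dm));
    PetscCall(PetscFinalize());
    return 0;
  }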

Re: [petsc-users] GAMG crash during setup when using multiple GPUs

2022-02-11 Thread Sajid Ali Syed
Hi Mark,

Thanks for the information.

@Junchao: Given that there are known issues with GPU-aware MPI, it might be best to wait until there is an updated version of cray-mpich (which hopefully contains the relevant fixes).

Thank You,
Sajid Ali (he/him) | Research Associate Scientific Computing …

Re: [petsc-users] Gmsh 8-noded quadrilateral

2022-02-11 Thread Jed Brown
Sounds good. Note that if you use direct solvers, that extra node is basically free because the vertex separators are unchanged. It's a marginal cost in the storage of assembled matrices and the length of state vectors, and in 3D it is even less significant.

Susanne Claus writes:
> Dear Matthew …
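A small sketch of why the extra node stays cheap for a direct solver; the block labels below are illustrative, not taken from the thread. The ninth node sits in the interior of its quadrilateral and couples only to the other nodes of that element, so its degree of freedom can be eliminated locally (static condensation) before or during the sparse factorization:

$$
K_e = \begin{pmatrix} K_{bb} & K_{bi} \\ K_{ib} & K_{ii} \end{pmatrix},
\qquad
S_{bb} = K_{bb} - K_{bi}\,K_{ii}^{-1}K_{ib},
$$

where $b$ denotes the eight boundary nodes and $i$ the single interior node of one element. Because $K_{ii}$ involves no other element, this elimination creates fill only among boundary nodes that are already coupled, and it never enters a vertex separator, so the fill and cost of the factorization are essentially unchanged.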

Re: [petsc-users] Gmsh 8-noded quadrilateral

2022-02-11 Thread Susanne Claus
Dear Matthew and Jed,

Brilliant. Thank you so much! Your changes work like a charm, Matthew (I tested your branch on the Gmsh file I sent), and thank you so much for your advice, Jed. The loss of one order of convergence for an inf-sup stable pressure discretization seems indeed a very high …