Re: [petsc-users] Scalable Solver for Incompressible Flow

2023-06-07 Thread Jed Brown
Alexander Lindsay  writes:

> This has been a great discussion to follow. Regarding
>
>> when time stepping, you have enough mass matrix that cheaper preconditioners 
>> are good enough
>
> I'm curious what some algebraic recommendations might be for high Re in
> transients. 

What mesh aspect ratio and streamline CFL number? Assuming your model is
turbulent, can you say anything about the momentum-thickness Reynolds number
Re_θ? What is your wall-normal spacing in plus units? (Wall-resolved or
wall-modeled?)

And to confirm, are you doing a nonlinearly implicit velocity-pressure solve?

> I've found one-level DD to be ineffective when applied monolithically or to 
> the momentum block of a split, as it scales with the mesh size. 

I wouldn't put too much weight on "scaling with mesh size" per se. You want an 
efficient solver for the coarsest mesh that delivers sufficient accuracy in 
your flow regime. Constants matter.

Refining the mesh while holding the time step constant changes the advective
CFL number as well as the cell Peclet/cell Reynolds numbers. A meaningful
scaling study is to increase the Reynolds number (e.g., by growing the domain)
while keeping the mesh size matched in terms of plus units in the viscous
sublayer and Kolmogorov length in the outer boundary layer. That turns out not
to be a very automatic study to do, but it's what matters, and you can spend a
lot of time chasing ghosts with naive scaling studies.
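For reference, the standard definitions of the numbers above (added for
clarity; u is a local velocity scale, Δx the cell size, Δt the time step,
ν the kinematic viscosity, y the wall distance, and u_τ the friction
velocity):

  CFL = u Δt / Δx,    Re_cell = u Δx / ν,    y+ = y u_τ / ν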


Re: [petsc-users] Initializing kokkos before petsc causes a problem

2023-06-07 Thread Junchao Zhang
Hi, Philip,
  Thanks for reporting. I will have a look at the issue.
--Junchao Zhang


On Wed, Jun 7, 2023 at 9:30 AM Fackler, Philip via petsc-users
<petsc-users@mcs.anl.gov> wrote:

> I'm encountering a problem in xolotl. We initialize kokkos before
> initializing petsc. Therefore...
>
> The pointer referenced here:
>
> https://gitlab.com/petsc/petsc/-/blob/main/src/vec/is/sf/impls/basic/kokkos/sfkok.kokkos.cxx#L363
> 
>
> from here:
> https://gitlab.com/petsc/petsc/-/blob/main/include/petsc_kokkos.hpp
>
> remains null because the code to initialize it is skipped here:
>
> https://gitlab.com/petsc/petsc/-/blob/main/src/sys/objects/kokkos/kinit.kokkos.cxx#L28
> See line 71.
>
> Can this be modified to allow for kokkos to have been initialized by the
> application before initializing petsc?
>
> Thank you for your help,
>
>
> Philip Fackler
> Research Software Engineer, Application Engineering Group
> Advanced Computing Systems Research Section
> Computer Science and Mathematics Division
> Oak Ridge National Laboratory
>


Re: [petsc-users] Scalable Solver for Incompressible Flow

2023-06-07 Thread Alexander Lindsay
This has been a great discussion to follow. Regarding

> when time stepping, you have enough mass matrix that cheaper
> preconditioners are good enough

I'm curious what some algebraic recommendations might be for high Re in
transients. I've found one-level DD to be ineffective when applied
monolithically or to the momentum block of a split, as it scales with the
mesh size. For high Re, boomeramg is ineffective, perhaps until
https://gitlab.com/petsc/petsc/-/issues/1362 is resolved. I should try
fiddling around again with Pierre's work in HPDDM, but I'm curious whether
there are other PETSc PC recommendations, or if I need to overcome my
inertia/laziness and move beyond command line options.
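For concreteness, the kind of command-line configuration I mean is a Schur
fieldsplit along these lines (an illustrative sketch only, assuming an unnamed
0 = velocity / 1 = pressure split; not a recommendation from this thread):

  -ksp_type fgmres
  -pc_type fieldsplit
  -pc_fieldsplit_type schur
  -pc_fieldsplit_schur_fact_type full
  -pc_fieldsplit_schur_precondition selfp
  -fieldsplit_0_pc_type hypre
  -fieldsplit_1_ksp_type gmres
  -fieldsplit_1_pc_type lsc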

On Mon, May 8, 2023 at 6:46 AM Jed Brown  wrote:

> Sebastian Blauth  writes:
>
> > Hello everyone,
> >
> > I wanted to briefly follow up on my question (see my last reply).
> > Does anyone know / have an idea why the LSC preconditioner in PETSc does
> > not seem to scale well with the problem size (the outer fgmres solver I
> > am using scales nearly linearly with the problem size in my example)?
>
> The implementation was tested on heterogeneous Stokes problems from
> geodynamics, and perhaps not on NS (or not with the discretization you're
> using).
>
> https://doi.org/10.1016/j.pepi.2008.07.036
>
> There is a comment about not having plumbing to provide a mass matrix. A
> few lines earlier there is code using PetscObjectQuery, and that same
> pattern could be applied for the mass matrix. If you're on a roughly
> uniform mesh, including the mass scaling will probably have little effect,
> but it could have a big impact in the presence of highly anisotropic
> elements or a broad range of scales.
>
> I don't think LSC has gotten a lot of use since everyone I know who tried
> it has been sort of disappointed relative to other methods (e.g., inverse
> viscosity scaled mass matrix for heterogeneous Stokes, PCD for moderate Re
> Navier-Stokes). Of course there are no steady solutions at high Re, so you
> either have a turbulence model or are time stepping. I'm not aware of work
> with LSC with turbulence models, and when time stepping, you have enough
> mass matrix that cheaper preconditioners are good enough. That said, it
> would be a great contribution to support this scaling.
>
> > I have also already tried using -ksp_diagonal_scale but the results are
> > identical.
>
> That's expected, and likely to mess up some MG implementations so I
> wouldn't recommend it.
>
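For reference, the PetscObjectCompose/PetscObjectQuery pattern mentioned above
looks roughly like the sketch below. The "LSC_L"/"LSC_Lp" keys are the ones
PCLSC queries today; the "LSC_M" mass-matrix key is hypothetical and only
illustrates how the scaling discussed above could be plumbed in.

#include <petscksp.h>

/* Attach the operators PCLSC looks for on the preconditioning Schur complement
   matrix S. L is a velocity Laplacian (or similar); M is a pressure mass
   matrix. The "LSC_M" key does not exist in PETSc today. */
static PetscErrorCode AttachLSCOperators(Mat S, Mat L, Mat M)
{
  PetscFunctionBeginUser;
  PetscCall(PetscObjectCompose((PetscObject)S, "LSC_L", (PetscObject)L));
  PetscCall(PetscObjectCompose((PetscObject)S, "LSC_Lp", (PetscObject)L));
  PetscCall(PetscObjectCompose((PetscObject)S, "LSC_M", (PetscObject)M)); /* hypothetical key */
  PetscFunctionReturn(0);
}

/* The matching retrieval inside a PC implementation would be:
   Mat M = NULL;
   PetscCall(PetscObjectQuery((PetscObject)S, "LSC_M", (PetscObject *)&M)); */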


[petsc-users] Interpolation Between DMSTAG Objects

2023-06-07 Thread Colton Bryant
Hello,

I am new to PETSc so apologies in advance if there is an easy answer to
this question I've overlooked.

I have a problem in which the computational domain is divided into two
overlapping regions (overset grids). I would like to discretize each region
as a separate DMSTAG object. What I do not understand is how to go about
interpolating a vector from say DM1 onto nodes in DM2. My current (likely
inefficient) idea is to create vectors of query points on DM2, share these
vectors among all processes, perform the interpolations on DM1, and then
insert the results into the vector on DM2.
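A rough sketch of that workflow (illustrative only; v_dm2, nfound, idx, and
vals are placeholders and error checking is omitted):

Vec        q_local;   /* query-point coordinates this rank needs, built from DM2 */
Vec        q_all;     /* the same points replicated on every rank */
VecScatter toall;

PetscCall(VecScatterCreateToAll(q_local, &toall, &q_all));
PetscCall(VecScatterBegin(toall, q_local, q_all, INSERT_VALUES, SCATTER_FORWARD));
PetscCall(VecScatterEnd(toall, q_local, q_all, INSERT_VALUES, SCATTER_FORWARD));

/* Each rank evaluates the DM1 field at the replicated points that lie in its
   subdomain (user interpolation code), collecting global indices idx[] and
   values vals[] of length nfound. */

PetscCall(VecSetValues(v_dm2, nfound, idx, vals, INSERT_VALUES));
PetscCall(VecAssemblyBegin(v_dm2));
PetscCall(VecAssemblyEnd(v_dm2));
PetscCall(VecScatterDestroy(&toall));
PetscCall(VecDestroy(&q_all));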

Before I embark on manually setting up the communication here I wanted to
just ask if there is any native support for this kind of operation in PETSc
I may be missing.

Thanks in advance for any advice!

Best,
Colton Bryant


[petsc-users] PETSc :: FEM Help

2023-06-07 Thread Brandon Denton via petsc-users
Good Morning,

I'm trying to verify that the CAD -> PETSc/DMPlex methods I've developed
can be used for FEM analyses using PETSc. Attached is my current attempt,
in which I import a CAD STEP file to create a volumetric tetrahedral
discretization (DMPlex), designate boundary condition points using
DMLabels, and solve the Laplace (heat) problem with Dirichlet conditions on
each end. At the command line I specify the STEP file with the -filename
option and the dual space degree with -petscspace_degree 2. The run ends
with either a SEGV fault or a general MPI communication error.

Could you please look over the attached file to tell me if what I'm doing
to set up the FEM problem is wrong?

Thank you in advance for your time and help.
-Brandon

TYPICAL ERROR MESSAGE
[0]PETSC ERROR: - Error Message
--
[0]PETSC ERROR: General MPI error
[0]PETSC ERROR: MPI error 605109765 Invalid communicator, error stack:
PMPI_Comm_get_attr(344): MPI_Comm_get_attr(comm=0x0,
comm_keyval=-1539309568, attribute_val=0x7ffe75a58848, flag=0x7ffe75a58844)
failed
MPII_Comm_get_attr(257): MPIR_Comm_get_attr(comm=0x0,
comm_keyval=-1539309568, attribute_val=0x7ffe75a58848, flag=0x7ffe75a58844)
failed
MPII_Comm_get_attr(53).: Invalid communicator
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR:   Option left: name:-dm_plex_refine_without_snap_to_geom
value: 0 source: command line
[0]PETSC ERROR:   Option left: name:-dm_refine value: 1 source: command line
[0]PETSC ERROR:   Option left: name:-snes_monitor (no value) source:
command line
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.5-1817-gd2497b8de4c
GIT Date: 2023-05-22 18:44:03 +
[0]PETSC ERROR: ./thermal on a  named XPS. by bdenton Wed Jun  7 11:03:43
2023
[0]PETSC ERROR: Configure options --with-make-np=16
--prefix=/mnt/c/Users/Brandon/software/libs/petsc/3.19.1-gitlab/gcc/11.2.0/mpich/3.4.2/openblas/0.3.17/opt
--with-debugging=false --COPTFLAGS="-O3 -mavx" --CXXOPTFLAGS="-O3 -mavx"
--FOPTFLAGS=-O3 --with-shared-libraries=1
--with-mpi-dir=/mnt/c/Users/Brandon/software/libs/mpich/3.4.2/gcc/11.2.0
--with-mumps=true --download-mumps=1 --with-metis=true --download-metis=1
--with-parmetis=true --download-parmetis=1 --with-superlu=true
--download-superlu=1 --with-superludir=true --download-superlu_dist=1
--with-blacs=true --download-blacs=1 --with-scalapack=true
--download-scalapack=1 --with-hypre=true --download-hypre=1
--with-hdf5-dir=/mnt/c/Users/Brandon/software/libs/hdf5/1.12.1/gcc/11.2.0
--with-valgrind-dir=/mnt/c/Users/Brandon/software/apps/valgrind/3.14.0
--with-blas-lib="[/mnt/c/Users/Brandon/software/libs/openblas/0.3.17/gcc/11.2.0/lib/libopenblas.so]"
--with-lapack-lib="[/mnt/c/Users/Brandon/software/libs/openblas/0.3.17/gcc/11.2.0/lib/libopenblas.so]"
--LDFLAGS= --with-tetgen=true --download-tetgen=1 --download-ctetgen=1
--download-opencascade=1 --download-egads
[0]PETSC ERROR: #1 PetscObjectName() at
/mnt/c/Users/Brandon/software/builddir/petsc-3.19.1-gitlab/src/sys/objects/pname.c:119
[0]PETSC ERROR: #2 PetscObjectGetName() at
/mnt/c/Users/Brandon/software/builddir/petsc-3.19.1-gitlab/src/sys/objects/pgname.c:27
[0]PETSC ERROR: #3 PetscDSAddBoundary() at
/mnt/c/Users/Brandon/software/builddir/petsc-3.19.1-gitlab/src/dm/dt/interface/dtds.c:3404
[0]PETSC ERROR: #4 DMAddBoundary() at
/mnt/c/Users/Brandon/software/builddir/petsc-3.19.1-gitlab/src/dm/interface/dm.c:7828
[0]PETSC ERROR: #5 main() at
/mnt/c/Users/Brandon/Documents/School/Dissertation/Software/EGADS-dev/thermal_v319/thermal_nozzle.c:173
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -dm_plex_geom_print_model 1 (source: command line)
[0]PETSC ERROR: -dm_plex_geom_shape_opt 0 (source: command line)
[0]PETSC ERROR: -dm_plex_refine_without_snap_to_geom 0 (source: command
line)
[0]PETSC ERROR: -dm_refine 1 (source: command line)
[0]PETSC ERROR: -filename ./examples/Nozzle_example.stp (source: command
line)
[0]PETSC ERROR: -petscspace_degree 2 (source: command line)
[0]PETSC ERROR: -snes_monitor (source: command line)
[0]PETSC ERROR: End of Error Message ---send entire
error message to petsc-ma...@mcs.anl.gov--
application called MPI_Abort(MPI_COMM_SELF, 98) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=98
:
system msg for write_line failure : Bad file descriptor
static const char help[] = "Test of FEA DMPlex w/CAD functionality";

#include <petsc.h>

/* User-defined Work Context  */
typedef struct {
PetscInt  dummy;
char filename[PETSC_MAX_PATH_LEN]; // context containing CAD filename from command line
} AppCtx;


void f0_function(PetscInt dim, PetscInt Nf, PetscInt NfAux,
  const PetscInt uOff[], const PetscInt uOff_x[], const 
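For reference (this is not the attached code), PETSc pointwise callbacks such
as the one above follow the standard PetscPointFunc signature; a minimal
Laplace f1 term looks like the following sketch:

#include <petsc.h>

static void f1_laplace(PetscInt dim, PetscInt Nf, PetscInt NfAux,
                       const PetscInt uOff[], const PetscInt uOff_x[], const PetscScalar u[],
                       const PetscScalar u_t[], const PetscScalar u_x[],
                       const PetscInt aOff[], const PetscInt aOff_x[], const PetscScalar a[],
                       const PetscScalar a_t[], const PetscScalar a_x[],
                       PetscReal t, const PetscReal x[], PetscInt numConstants,
                       const PetscScalar constants[], PetscScalar f1[])
{
  /* weak form of the Laplacian: f1 = grad u */
  for (PetscInt d = 0; d < dim; ++d) f1[d] = u_x[d];
}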

[petsc-users] Initializing kokkos before petsc causes a problem

2023-06-07 Thread Fackler, Philip via petsc-users
I'm encountering a problem in xolotl. We initialize kokkos before initializing 
petsc. Therefore...

The pointer referenced here:
https://gitlab.com/petsc/petsc/-/blob/main/src/vec/is/sf/impls/basic/kokkos/sfkok.kokkos.cxx#L363


from here:
https://gitlab.com/petsc/petsc/-/blob/main/include/petsc_kokkos.hpp

remains null because the code to initialize it is skipped here:
https://gitlab.com/petsc/petsc/-/blob/main/src/sys/objects/kokkos/kinit.kokkos.cxx#L28
See line 71.

Can this be modified to allow for kokkos to have been initialized by the 
application before initializing petsc?
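A minimal sketch of that ordering (illustrative only, not the actual xolotl
code):

#include <Kokkos_Core.hpp>
#include <petsc.h>

int main(int argc, char **argv)
{
  Kokkos::initialize(argc, argv);                       /* the application owns Kokkos     */
  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL)); /* PETSc is initialized afterwards */
  /* ... create Kokkos-backed PETSc objects, e.g. with -vec_type kokkos ... */
  PetscCall(PetscFinalize());
  Kokkos::finalize();                                   /* Kokkos outlives PETSc           */
  return 0;
}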

Thank you for your help,

Philip Fackler
Research Software Engineer, Application Engineering Group
Advanced Computing Systems Research Section
Computer Science and Mathematics Division
Oak Ridge National Laboratory


Re: [petsc-users] Petsc using VecDuplicate in solution process

2023-06-07 Thread Barry Smith

  Can you please present all the output that callgrind is giving you that
provides this indication?

> On Jun 7, 2023, at 4:11 AM, Pichler, Franz  wrote:
> 
> Hello, thanks for the reply,
> 
> I created a working minimal example (as minimal as I can think of) that I
> include here, even though I am not sure which is the best format to do this;
> I just add some plain text:
> //##
> #include <petscksp.h>  /*I "petscksp.h" I*/
> #include <petscsys.h>
> #include <petscmat.h>
> #include <petscvec.h>
> #include <petscpc.h>
> #include <vector>
> #include <iostream>
> #include <cmath>
>  
> class petsc_solver{
> Vec petsc_x, petsc_b;
> Mat petsc_A;
> KSP petsc_ksp;
> PC petsc_pc;
> int linear_iter;
> KSPConvergedReason reason;
> bool first_time;
> int n_rows;
> int number_of_pc_rebuilds=0;
> public:
> petsc_solver() {
> KSPCreate(PETSC_COMM_WORLD, &petsc_ksp);
> KSPSetFromOptions(petsc_ksp);
> KSPGetPC(petsc_ksp, &petsc_pc);
> KSPSetType(petsc_ksp, KSPBCGS);
> PCSetType(petsc_pc, PCILU);
> PCFactorSetLevels(petsc_pc, 0);
> KSPSetInitialGuessNonzero(petsc_ksp, PETSC_TRUE);
> KSPSetTolerances(petsc_ksp, 1.e-12, 1.e-8, 1e14, 1000);
> }
> void set_matrix(std::vector<int>& dsp, std::vector<int>& col, std::vector<double>& val){
> int *ptr_dsp = dsp.data();
> int *ptr_col = col.data();
> double *ptr_ele = val.data();
> n_rows=dsp.size()-1;
> std::cout<<"set petsc matrix, n_rows:"<<n_rows<<"\n";
> MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD, n_rows, n_rows, ptr_dsp, ptr_col, ptr_ele, &petsc_A);
> MatSetOption(petsc_A, MAT_NEW_NONZERO_LOCATIONS, PETSC_FALSE);
> MatSetOption(petsc_A, MAT_NO_OFF_PROC_ZERO_ROWS, PETSC_TRUE);
> KSPSetOperators(petsc_ksp, petsc_A, petsc_A);
> }
> void set_rhs(std::vector<double>& val){
> double *ptr_ele = val.data();
> VecCreateSeqWithArray(PETSC_COMM_WORLD, 1, n_rows, NULL, &petsc_b);
> VecPlaceArray(petsc_b, ptr_ele);
> }
> void set_sol(std::vector<double>& val){
> double *ptr_ele = val.data();
> VecCreateSeqWithArray(PETSC_COMM_WORLD, 1, n_rows, NULL, &petsc_x);
> VecPlaceArray(petsc_x, ptr_ele);
> }
>
> int solve(bool force_rebuild) {
> int solver_stat = 0;
> KSPGetPC(petsc_ksp, &petsc_pc);
> int ierr;
> // ierr = PetscObjectStateIncrease((PetscObject)petsc_A);
> // ierr = PetscObjectStateIncrease((PetscObject)petsc_b);
> MatAssemblyBegin(petsc_A,MAT_FINAL_ASSEMBLY);
> MatAssemblyEnd(petsc_A,MAT_FINAL_ASSEMBLY);
> VecAssemblyBegin(petsc_b);
> VecAssemblyEnd(petsc_b);
>
> // KSPSetOperators(petsc_ksp, petsc_A, petsc_A);
> ierr = KSPSolve(petsc_ksp, petsc_b, petsc_x);
> KSPGetConvergedReason(petsc_ksp, &reason);
> if (reason < 0){
> KSPGetIterationNumber(petsc_ksp, &linear_iter);
> std::cout<<"NOT CONVERGED!\n";
> // PetscPrintf(PETSC_COMM_WORLD,"KSPConvergedReason _aux: %D  PCREUSE: %D (%D=False %D=True) IERR:%i ITERS:%i\n",reason, reuse, PETSC_FALSE, PETSC_TRUE, ierr, linear_iter);
> return -1;
> }
> KSPGetIterationNumber(petsc_ksp, &linear_iter);
> return linear_iter;
> }
> };
> void change_rhs(int i, int n_rows, std::vector<double>& rhs){
> for(int row=0; row<n_rows; row++) rhs[row] = 2 + cos(i+row); // pseudo random something
> }
> void change_matrix(int i, int n_rows, std::vector<double>& vals){
> int nnz= n_rows*3-2;
> for(int row=0; row<n_rows; row++){
> if(row==0) {
> vals[0]=3+cos(i+row);//pseudo random something
> vals[1]=-1+0.3*cos(i+row);//pseudo random something
> }else if(row==n_rows-1){
> vals[nnz-1]=3+cos(i+row);//pseudo random something
> vals[nnz-2]=-1+0.3*cos(i+row);//pseudo random something
> }else{
> vals[2+(row-1)*3]   =-1+0.1*cos(i+row);//pseudo random something
> vals[2+(row-1)*3+1] = 4+0.3*cos(i+row);//pseudo random something
> vals[2+(row-1)*3+2] =-1+0.2*cos(i+row);//pseudo random something
> }
> }
> }
> void set_crs_structure(int n_rows, std::vector<int>& dsp, std::vector<int>& col, std::vector<double>& val ){
> int nnz= n_rows*3-2;
> std::cout<<"SETCRS ROWS:"<<n_rows<<"\n";
> dsp.resize(n_rows+1);
> col.resize(nnz);
> val.resize(nnz);
> for(int row=0; row<n_rows; row++){
> if(row==0) {
> col[0]=0;
> col[1]=1;
> dsp[row+1]=dsp[row]+2;
> }else if(row==n_rows-1){
> col[2+(row-1)*3+0]=row-1;
> col[2+(row-1)*3+1]=row;
> dsp[row+1]=dsp[row]+2;
> }
> else{
> dsp[row+1]=dsp[row]+3;
> col[2+(row-1)*3+0]=row-1;
> col[2+(row-1)*3+1]=row;
> col[2+(row-1)*3+2]=row+1;
> }
> }
> }
> double check_res(std::vector<int>& dsp, std::vector<int>& col, std::vector<double>& val, std::vector<double>& sol, std::vector<double> rhs){
> int 

Re: [petsc-users] Petsc using VecDuplicate in solution process

2023-06-07 Thread Pichler, Franz
Hello, thanks for the reply,

I created a working minimal example (as minimal as I can think of) that I
include here, even though I am not sure which is the best format to do this;
I just add some plain text:
//##
#include <petscksp.h>  /*I "petscksp.h" I*/
#include <petscsys.h>
#include <petscmat.h>
#include <petscvec.h>
#include <petscpc.h>
#include <vector>
#include <iostream>
#include <cmath>

class petsc_solver{
Vec petsc_x, petsc_b;
Mat petsc_A;
KSP petsc_ksp;
PC petsc_pc;
int linear_iter;
KSPConvergedReason reason;
bool first_time;
int n_rows;
int number_of_pc_rebuilds=0;
public:
petsc_solver() {
KSPCreate(PETSC_COMM_WORLD, &petsc_ksp);
KSPSetFromOptions(petsc_ksp);
KSPGetPC(petsc_ksp, &petsc_pc);
KSPSetType(petsc_ksp, KSPBCGS);
PCSetType(petsc_pc, PCILU);
PCFactorSetLevels(petsc_pc, 0);
KSPSetInitialGuessNonzero(petsc_ksp, PETSC_TRUE);
KSPSetTolerances(petsc_ksp, 1.e-12, 1.e-8, 1e14, 1000);
}
void set_matrix(std::vector<int>& dsp, std::vector<int>& col, std::vector<double>& val){
int *ptr_dsp = dsp.data();
int *ptr_col = col.data();
double *ptr_ele = val.data();
n_rows=dsp.size()-1;
std::cout<<"set petsc matrix, n_rows:"<<n_rows<<"\n";
MatCreateSeqAIJWithArrays(PETSC_COMM_WORLD, n_rows, n_rows, ptr_dsp, ptr_col, ptr_ele, &petsc_A);
MatSetOption(petsc_A, MAT_NEW_NONZERO_LOCATIONS, PETSC_FALSE);
MatSetOption(petsc_A, MAT_NO_OFF_PROC_ZERO_ROWS, PETSC_TRUE);
KSPSetOperators(petsc_ksp, petsc_A, petsc_A);
}
void set_rhs(std::vector<double>& val){
double *ptr_ele = val.data();
VecCreateSeqWithArray(PETSC_COMM_WORLD, 1, n_rows, NULL, &petsc_b);
VecPlaceArray(petsc_b, ptr_ele);
}
void set_sol(std::vector<double>& val){
double *ptr_ele = val.data();
VecCreateSeqWithArray(PETSC_COMM_WORLD, 1, n_rows, NULL, &petsc_x);
VecPlaceArray(petsc_x, ptr_ele);
}

int solve(bool force_rebuild) {
int solver_stat = 0;
KSPGetPC(petsc_ksp, &petsc_pc);
int ierr;
// ierr = PetscObjectStateIncrease((PetscObject)petsc_A);
// ierr = PetscObjectStateIncrease((PetscObject)petsc_b);
MatAssemblyBegin(petsc_A,MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(petsc_A,MAT_FINAL_ASSEMBLY);
VecAssemblyBegin(petsc_b);
VecAssemblyEnd(petsc_b);

// KSPSetOperators(petsc_ksp, petsc_A, petsc_A);
ierr = KSPSolve(petsc_ksp, petsc_b, petsc_x);
KSPGetConvergedReason(petsc_ksp, &reason);
if (reason < 0){
KSPGetIterationNumber(petsc_ksp, &linear_iter);
std::cout<<"NOT CONVERGED!\n";
// PetscPrintf(PETSC_COMM_WORLD,"KSPConvergedReason _aux: %D  PCREUSE: %D (%D=False %D=True) IERR:%i ITERS:%i\n",reason, reuse, PETSC_FALSE, PETSC_TRUE, ierr, linear_iter);
return -1;
}
KSPGetIterationNumber(petsc_ksp, &linear_iter);
return linear_iter;
}
};
void change_rhs(int i, int n_rows, std::vector<double>& rhs){
for(int row=0; row<n_rows; row++) rhs[row] = 2 + cos(i+row); // pseudo random something
}
void change_matrix(int i, int n_rows, std::vector<double>& vals){
int nnz= n_rows*3-2;
for(int row=0; row<n_rows; row++){
if(row==0) {
vals[0]=3+cos(i+row);//pseudo random something
vals[1]=-1+0.3*cos(i+row);//pseudo random something
}else if(row==n_rows-1){
vals[nnz-1]=3+cos(i+row);//pseudo random something
vals[nnz-2]=-1+0.3*cos(i+row);//pseudo random something
}else{
vals[2+(row-1)*3]   =-1+0.1*cos(i+row);//pseudo random something
vals[2+(row-1)*3+1] = 4+0.3*cos(i+row);//pseudo random something
vals[2+(row-1)*3+2] =-1+0.2*cos(i+row);//pseudo random something
}
}
}
void set_crs_structure(int n_rows, std::vector<int>& dsp, std::vector<int>& col, std::vector<double>& val ){
int nnz= n_rows*3-2;
std::cout<<"SETCRS ROWS:"<<n_rows<<"\n";
dsp.resize(n_rows+1);
col.resize(nnz);
val.resize(nnz);
for(int row=0; row<n_rows; row++){
if(row==0) {
col[0]=0;
col[1]=1;
dsp[row+1]=dsp[row]+2;
}else if(row==n_rows-1){
col[2+(row-1)*3+0]=row-1;
col[2+(row-1)*3+1]=row;
dsp[row+1]=dsp[row]+2;
}
else{
dsp[row+1]=dsp[row]+3;
col[2+(row-1)*3+0]=row-1;
col[2+(row-1)*3+1]=row;
col[2+(row-1)*3+2]=row+1;
}
}
}
double check_res(std::vector<int>& dsp, std::vector<int>& col, std::vector<double>& val, std::vector<double>& sol, std::vector<double> rhs){
int n_rows=dsp.size()-1;
double res=0;
for (int row=0; row<n_rows; row++){ // assumption: residual computed as ||A*sol - rhs||_2
double ax=0;
for(int j=dsp[row]; j<dsp[row+1]; j++) ax += val[j]*sol[col[j]];
res += (ax-rhs[row])*(ax-rhs[row]);
}
return sqrt(res);
}
int main(int argc, char** argv){
int n_rows=1000; // assumption: illustrative problem size
std::vector<double> rhs(n_rows);
std::vector<double> sol(n_rows);
std::vector<int> dsp;
std::vector<int> cols;
std::vector<double> vals;
set_crs_structure(n_rows,dsp,cols,vals);
PetscInitializeNoArguments();
petsc_solver p;
p.set_matrix(dsp,cols,vals);
p.set_rhs(rhs);
p.set_sol(sol);
for (int i=0;i<100;i++){
change_rhs(i,n_rows,rhs);
change_matrix(i,n_rows,vals);
// std::cout<<"RES BEFORE:"<
Sent: Tuesday, June 6, 2023 4:40 PM
To: Pichler, Franz 
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Petsc using VecDuplicate in solution process



On Tue, 6 Jun 2023 at 09:24, Pichler, Franz <franz.pich...@v2c2.at> wrote:
Hello,
I was just investigating my KSP_Solve_BCGS routine with valgrind (callgrind).
I see there that PETSc is using a VecDuplicate (involving a malloc and a copy)
every time it is called.

Do you mean KSPSolve_BCGS?

There's only one VecDuplicate in there and it is called only once. An example 
code showing the problem would help



I call it several thousand times (time evolution problem with rather small
matrices).

I am not quite sure which vector is copied there, but I guess it is the initial
guess or the rhs.
Is there a way in PETSc to avoid this VecDuplicate by providing fixed
memory for this vector?

Some key facts about my routine:
I assemble the matrices (CRS, serial) and vectors myself and then use
MatCreateSeqAIJWithArrays and VecCreateSeqWithArray
to wrap PETSc around them.

I use an ILU preconditioner; the sparsity patterns do not change between the
calls, only the values do.

Thank you for any hint on how to avoid the VecDuplicate,

Best regards

Franz


Dr. Franz Pichler
Lead Researcher Area E


Virtual Vehicle Research GmbH

Inffeldgasse 21a, 8010 Graz, Austria
Phone: +43 316 873 9818
franz.pich...@v2c2.at
www.v2c2.at

Company name: Virtual Vehicle Research GmbH
Legal form: Gesellschaft mit beschränkter Haftung (limited liability company)