Re: [petsc-users] Using DMPlexDistribute for a parallel FEM code

2023-05-19 Thread neil liu
Thanks, Matt. Following your explanations, my understanding is this: "If we use multiple MPI processes, the global numbering of the vertices (global domain) will be different from that with only one process, right?" If this is the case, will it be easy for us to check the assembled matrix
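
A minimal sketch of the kind of check being discussed (this is not code from the thread; the mesh options, the use of DMPlexGetVertexNumbering, and the PetscCall error-checking macro, which requires PETSc 3.18 or later, are my own assumptions):

    /* Sketch: build a DMPlex from options, distribute it, and print the
     * global vertex numbering, which changes with the number of ranks. */
    #include <petscdmplex.h>

    int main(int argc, char **argv)
    {
      DM dm, dmDist;
      IS globalVertices;

      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

      /* Example run options: -dm_plex_simplex 0 -dm_plex_box_faces 4,4 */
      PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
      PetscCall(DMSetType(dm, DMPLEX));
      PetscCall(DMSetFromOptions(dm));

      /* Redistribute over all ranks; point ownership, and hence the global
       * numbering, now depends on the partition. */
      PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
      if (dmDist) {
        PetscCall(DMDestroy(&dm));
        dm = dmDist;
      }

      /* Global vertex numbers per local vertex; negative entries mark
       * vertices owned by another rank. */
      PetscCall(DMPlexGetVertexNumbering(dm, &globalVertices));
      PetscCall(ISView(globalVertices, PETSC_VIEWER_STDOUT_WORLD));

      PetscCall(DMDestroy(&dm));
      PetscCall(PetscFinalize());
      return 0;
    }

Because the distributed mesh renumbers points, the matrix assembled on several ranks is the same operator as the serial one only up to the row/column permutation given by the local-to-global mapping, so a direct entrywise comparison of -mat_view output against a single-process run will not line up until that permutation is accounted for.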

Re: [petsc-users] Error in building PETSc

2023-05-19 Thread Satish Balay via petsc-users
Use "make OMAKE_PRINTDIR=gmake all" instead of "make all" or use latest release Satish On Fri, 19 May 2023, Jau-Uei Chen wrote: > To whom it may concern, > > Currently, I am trying to build PETSc-3.17.4 on my own laptop (MacPro Late > 2019) but encounter an error when performing "make all".

Re: [petsc-users] Error in building PETSc

2023-05-19 Thread Jau-Uei Chen
Thanks for your prompt reply! It works perfectly. Best Regards, Jau-Uei Chen, Graduate Student, Department of Aerospace Engineering and Engineering Mechanics, The University of Texas at Austin. On Fri, May 19, 2023 at 2:02 PM Satish Balay wrote: > Use "make OMAKE_PRINTDIR=gmake all" instead of