Re: [petsc-users] About PC hpddm

2024-05-09 Thread Pierre Jolivet
> On 9 May 2024, at 6:31 AM, Ng, Cho-Kuen via petsc-users > wrote: > > This Message Is From an External Sender > This message came from outside your organization. > I used spack to install petsc with hpddm as follows. > > o spack install petsc+hpddm > o spack install slepc ^petsc+hpddm > >

Re: [petsc-users] Reasons for breakdown in preconditioned LSQR

2024-05-07 Thread Pierre Jolivet
down from my side? > > > Best regards, > > Marco > > - Original Message - > >> From: Pierre Jolivet mailto:pie...@joliv.et>> > >> To: Marco Seiz mailto:ma...@kit.ac.jp>> > >> Cc: petsc-users@mcs.anl.gov <mailto:petsc-users@mcs.an

Re: [petsc-users] Reasons for breakdown in preconditioned LSQR

2024-05-07 Thread Pierre Jolivet
> On 7 May 2024, at 9:10 AM, Marco Seiz wrote: > > Thanks for the quick response! > > On 07.05.24 14:24, Pierre Jolivet wrote: >> >> >>> On 7 May 2024, at 7:04 AM, Marco Seiz wrote: >>> >>> This Message Is From an External Sende

Re: [petsc-users] Reasons for breakdown in preconditioned LSQR

2024-05-06 Thread Pierre Jolivet
> On 7 May 2024, at 7:04 AM, Marco Seiz wrote: > > This Message Is From an External Sender > This message came from outside your organization. > Hello, > > something a bit different from my last question, since that didn't > progress so well: > I have a related model which generally produces

Re: [petsc-users] PETSc options

2024-05-06 Thread Pierre Jolivet
> On 6 May 2024, at 3:14 PM, Matthew Knepley wrote: > > This Message Is From an External Sender > This message came from outside your organization. > On Mon, May 6, 2024 at 1:04 AM Adrian Croucher > wrote: >> This Message Is From an External Sender >> This

Re: [petsc-users] Parallelism of the Mat.convert() function

2024-04-23 Thread Pierre Jolivet
The code is behaving as it should, IMHO. Here is a way to have the Mat stored the same independently of the number of processes. […] global_rows, global_cols = input_array.T.shape size = ((None, global_rows), (0 if COMM_WORLD.Get_rank() < COMM_WORLD.Get_size() - 1 else global_cols, global_cols))
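A minimal petsc4py sketch built around the sizing tuple quoted above, assuming the NumPy array is replicated on every rank; names such as input_array are illustrative, not taken from the thread:

import numpy as np
from mpi4py import MPI
from petsc4py import PETSc

comm = MPI.COMM_WORLD
input_array = np.arange(12.0).reshape(4, 3)   # same array on every rank
global_rows, global_cols = input_array.shape

# Let PETSc pick the local row split (None), but pin every column to the
# last rank so the layout does not depend on the number of processes.
last = comm.Get_rank() == comm.Get_size() - 1
size = ((None, global_rows), (global_cols if last else 0, global_cols))

A = PETSc.Mat().createDense(size, comm=PETSc.COMM_WORLD)
A.setUp()
rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):                 # fill locally owned rows only
    for j in range(global_cols):
        A.setValue(i, j, input_array[i, j])
A.assemble()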

Re: [petsc-users] MatCreateTranspose

2024-04-12 Thread Pierre Jolivet
> On 12 Apr 2024, at 11:51 AM, Carl-Johan Thore > wrote: > > On Fri, Apr 12, 2024 at 11:16 AM Pierre Jolivet <mailto:pie...@joliv.et>> wrote: >> >> >>> On 12 Apr 2024, at 11:10 AM, Carl-Johan Thore >> <mailto:carljohanth...@gmail.com>&g

Re: [petsc-users] MatCreateTranspose

2024-04-12 Thread Pierre Jolivet
10, 2024 at 8:24 AM Carl-Johan Thore <mailto:carljohanth...@gmail.com>> wrote: >> >> >> On Tue, Apr 9, 2024 at 5:31 PM Pierre Jolivet > <mailto:pie...@joliv.et>> wrote: >>> >>>> On 9 Apr 2024, at 4:19 PM, Carl-Johan Thore >>>

Re: [petsc-users] MatCreateTranspose

2024-04-09 Thread Pierre Jolivet
> On 9 Apr 2024, at 4:19 PM, Carl-Johan Thore wrote: > > This Message Is From an External Sender > This message came from outside your organization. > Thanks for the suggestion. I don't have a factored matrix (and can't really > use direct linear solvers) so MatSolveTranspose doesn't seem to

Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-18 Thread Pierre Jolivet
> On 18 Mar 2024, at 7:59 PM, Satish Balay via petsc-users > wrote: > > On Mon, 18 Mar 2024, Satish Balay via petsc-users wrote: > >> On Mon, 18 Mar 2024, Pierre Jolivet wrote: >> >>> >>> >>>> On 18 Mar 2024, at 5:13 PM,

Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-18 Thread Pierre Jolivet
LOC=2048 -Wall -DF_INTERFACE_GFORT -fPIC -DNO_WARMUP >>>> -DMAX_CPU_NUMBER=24 -DMAX_PARALLEL_NUMBER=1 -DBUILD_SINGLE=1 >>>> -DBUILD_DOUBLE=1 -DBUILD_COMPLEX=1 -DBUILD_COMPLEX16=1 >>>> -DVERSION=\"0.3.21\" -march=armv8-a -UASMNAME -UASMFNAME -UNAME -UCNAME >>>> -UCHAR_NAME -UCHAR_CNAME -DASMNAME=_lapack_wrappers >>>&

Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-17 Thread Pierre Jolivet
> ^ > 4 errors generated. > ``` > > Best wishes, > Zongze > > > >> On 17 Mar 2024, at 18:48, Pierre

Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-17 Thread Pierre Jolivet
You need this MR https://gitlab.com/petsc/petsc/-/merge_requests/7365, main has been broken for macOS since

Re: [petsc-users] Help Needed Debugging Installation Issue for PETSc with SLEPc

2024-03-15 Thread Pierre Jolivet
This was fixed 5 days ago in https://gitlab.com/slepc/slepc/-/merge_requests/638, so you need to use an up-to-date release branch of SLEPc. Thanks, Pierre > On

Re: [petsc-users] Help with SLEPc eigenvectors convergence.

2024-03-06 Thread Pierre Jolivet
It seems your A is rank-deficient. If you slightly regularize the GEVP, e.g., -st_target 1.0E-6, you’ll get errors closer to 0. Thanks, Pierre > On 6 Mar 2024, at 8:57 PM, Eric Chamberland via petsc-users > wrote: > > This Message Is From an External Sender > This message came from outside
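A small slepc4py sketch of the suggestion above, to be run on a single process; using EPS.setTarget with a shift-and-invert ST is my assumption for the API-level counterpart of the quoted -st_target option, and the 4x4 rank-deficient A is purely illustrative:

import sys
import numpy as np
import slepc4py
slepc4py.init(sys.argv)
from petsc4py import PETSc
from slepc4py import SLEPc

n = 4
A = PETSc.Mat().createDense((n, n), array=np.diag([2.0, 1.0, 0.0, 0.0]))  # rank-deficient
B = PETSc.Mat().createDense((n, n), array=np.eye(n))
A.assemble()
B.assemble()

eps = SLEPc.EPS().create()
eps.setOperators(A, B)
eps.setProblemType(SLEPc.EPS.ProblemType.GHEP)
eps.setTarget(1.0e-6)                               # slight regularization of the GEVP
eps.setWhichEigenpairs(SLEPc.EPS.Which.TARGET_MAGNITUDE)
eps.getST().setType(SLEPc.ST.Type.SINVERT)          # factor A - sigma*B, which is nonsingular
eps.setFromOptions()
eps.solve()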

Re: [petsc-users] MUMPS Metis options

2024-03-05 Thread Pierre Jolivet
> On 5 Mar 2024, at 2:20 PM, Michal Habera wrote: > > This Message Is From an External Sender > This message came from outside your organization. > Dear all, > > MUMPS allows custom configuration of the METIS library which it uses > for symmetric permutations using "mumps_par%METIS OPTIONS", >

Re: [petsc-users] PAMI error on Summit

2024-02-29 Thread Pierre Jolivet
> On 29 Feb 2024, at 5:06 PM, Matthew Knepley wrote: > > This Message Is From an External Sender > This message came from outside your organization. > On Thu, Feb 29, 2024 at 11:03 AM Blondel, Sophie via petsc-users > mailto:petsc-users@mcs.anl.gov>> wrote: >> This Message Is From an External

Re: [petsc-users] question on PCLSC with matrix blocks of type MATNEST

2024-02-13 Thread Pierre Jolivet
TSC ERROR: Unspecified symbolic phase for product AB with A nest, B > nest. Call MatProductSetFromOptions() first or the product is not supported > > If you agree with this fix, I'll create a MR for it. > Hong > > > From: Pierre Jolivet > Sent: Tuesday, February 13, 2024 12:

Re: [petsc-users] question on PCLSC with matrix blocks of type MATNEST

2024-02-12 Thread Pierre Jolivet
his object type [0]PETSC ERROR: MatProduct AB not supported for nest and nest [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. Thanks, Pierre > Hong > > From: Pierre Jolivet > Sent: Sunday, February 11, 2024 7:43 AM > To: Zhang, Hong > Cc: Hana Honnerová ; p

Re: [petsc-users] question on PCLSC with matrix blocks of type MATNEST

2024-02-11 Thread Pierre Jolivet
> On 8 Feb 2024, at 5:37 PM, Zhang, Hong via petsc-users > wrote: > > Hana, > "product AB with A nest, B nest" is not supported by PETSc. I do not know why > PETSc does not display such an error message. I'll check it. Did you? A naive fix is to simply add the missing PetscCheck() in

Re: [petsc-users] PETSc init question

2024-01-31 Thread Pierre Jolivet
> On 31 Jan 2024, at 11:31 AM, Alain O' Miniussi wrote: > > Hi, > > It is indicated in: > https://petsc.org/release/manualpages/Sys/PetscInitialize/ > that the init function will call MPI_Init. > > What if MPI_Init was already called (as it is the case in my application) and > what about

Re: [petsc-users] Bug in VecNorm, 3.20.3

2024-01-26 Thread Pierre Jolivet
> On 26 Jan 2024, at 3:11 PM, Pierre Jolivet wrote: > >> >> On 26 Jan 2024, at 3:03 PM, mich...@paraffinalia.co.uk wrote: >> >> On 2024-01-23 18:09, Junchao Zhang wrote: >>> Do you have an example to reproduce it? >>> --Junchao Zhang >> &

Re: [petsc-users] Bug in VecNorm, 3.20.3

2024-01-26 Thread Pierre Jolivet
> On 26 Jan 2024, at 3:03 PM, mich...@paraffinalia.co.uk wrote: > > On 2024-01-23 18:09, Junchao Zhang wrote: >> Do you have an example to reproduce it? >> --Junchao Zhang > > I have put a minimum example on github: > > https://github.com/mjcarley/petsc-test > > It does seem that the problem

Re: [petsc-users] Hypre BoomerAMG settings options database

2024-01-06 Thread Pierre Jolivet
> On 6 Jan 2024, at 3:15 PM, Mark Adams wrote: > > Does this work for you? > -pc_hypre_boomeramg_grid_sweeps_all 2 > The comment in our code says SSOR is the default but it looks like it is > really "hSGS" > I thought it was an L1 Jacobi, but you would want to ask Hypre about this. HYPRE’s
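A hedged petsc4py sketch of driving such BoomerAMG settings through the options database; it assumes a PETSc build configured with hypre, and the option name is taken from the message above:

from petsc4py import PETSc

opts = PETSc.Options()
opts["pc_hypre_boomeramg_grid_sweeps_all"] = 2   # option quoted above

pc = PETSc.PC().create()
pc.setType(PETSc.PC.Type.HYPRE)
pc.setHYPREType("boomeramg")
pc.setFromOptions()                              # picks up the option set above
pc.view()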

Re: [petsc-users] Matvecs and KSPSolves with multiple vectors

2023-12-21 Thread Pierre Jolivet
ls/dense/seq/cupm/matseqdensecupm.hpp?ref_type=heads#L368 >> >> Thanks, >> >> Matt >> >>> Thanks, >>> Sreeram >>> >>> On Sat, Dec 16, 2023 at 10:56 PM Pierre Jolivet >> <mailto:pie...@joliv.et>> wrote: >

Re: [petsc-users] Matvecs and KSPSolves with multiple vectors

2023-12-19 Thread Pierre Jolivet
l hopefully give you a more insightful answer. Thanks, Pierre > Thanks, > Sreeram > > On Sat, Dec 16, 2023 at 10:56 PM Pierre Jolivet <mailto:pie...@joliv.et>> wrote: >> Unfortunately, I am not able to reproduce such a failure with your input >> matrix. >>

Re: [petsc-users] Matvecs and KSPSolves with multiple vectors

2023-12-16 Thread Pierre Jolivet
different options and also try to get the MWE made (this > KSPMatSolve is pretty performance critical for us). > > Thanks for all your help, > Sreeram > > On Fri, Dec 15, 2023 at 1:01 AM Pierre Jolivet <mailto:pie...@joliv.et>> wrote: >> >>> On 14 Dec

Re: [petsc-users] Matvecs and KSPSolves with multiple vectors

2023-12-14 Thread Pierre Jolivet
f the KSPMatSolve() is performance-critical for you (see point b) from above). Thanks, Pierre > Thanks, > Sreeram > > On Thu, Dec 14, 2023, 1:12 PM Pierre Jolivet <mailto:pie...@joliv.et>> wrote: >> >> >>> On 14 Dec 2023, at 8:02 PM, Sreeram R V

Re: [petsc-users] Matvecs and KSPSolves with multiple vectors

2023-12-14 Thread Pierre Jolivet
> On Thu, Dec 14, 2023 at 12:42 AM Pierre Jolivet <mailto:pie...@joliv.et>> wrote: >> Hello Sreeram, >> KSPCG (PETSc implementation of CG) does not handle solves with multiple >> columns at once. >> There is only a single native PETSc KSP implementation which
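A minimal petsc4py sketch of solving a block of right-hand sides in one call with KSPMatSolve, to be run on a single process; the tridiagonal system and random right-hand sides are illustrative. With a KSP that supports blocks natively (for instance KSPHPDDM, assuming PETSc was configured with --download-hpddm), the columns are handled at once; otherwise PETSc falls back to one solve per column:

import numpy as np
from petsc4py import PETSc

n, nrhs = 10, 4
A = PETSc.Mat().createAIJ((n, n))
A.setUp()
for i in range(n):                       # simple tridiagonal test matrix
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

B = PETSc.Mat().createDense((n, nrhs), array=np.random.rand(n, nrhs))
B.assemble()
X = PETSc.Mat().createDense((n, nrhs))   # solution block, filled by the solver
X.setUp()
X.assemble()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setFromOptions()
ksp.matSolve(B, X)                       # all nrhs columns solved in one call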

Re: [petsc-users] Some question about compiling c++ program including PETSc using cmake

2023-12-13 Thread Pierre Jolivet
> On 14 Dec 2023, at 4:13 AM, 291--- via petsc-users > wrote: > > Dear SLEPc Developers, > > I am a student from Tongji University. Recently I am trying to write a c++ > program for matrix solving, which requires importing the PETSc library that > you have developed. However a lot of

Re: [petsc-users] Matvecs and KSPSolves with multiple vectors

2023-12-13 Thread Pierre Jolivet
will try it out and >>>>> see how it performs. >>>> >>>> Just FYI, AMGX does not handle systems with multiple RHS, and thus has no >>>> PCMatApply() implementation. >>>> BoomerAMG does, and there is a PCMatApply_HYPRE_BoomerAMG()

Re: [petsc-users] Bug report VecNorm

2023-12-10 Thread Pierre Jolivet
> On 10 Dec 2023, at 8:40 AM, Stephan Köhler > wrote: > > Dear PETSc/Tao team, > > there is a bug in the vector interface: In the function > VecNorm, see, e.g., > https://petsc.org/release/src/vec/vec/interface/rvector.c.html#VecNorm line > 197 the check for consistency in line 214 is

Re: [petsc-users] Matvecs and KSPSolves with multiple vectors

2023-12-07 Thread Pierre Jolivet
iple RHS, and thus has no PCMatApply() implementation. BoomerAMG does, and there is a PCMatApply_HYPRE_BoomerAMG() implementation. But let us know if you need assistance figuring things out. Thanks, Pierre > Thanks, > Sreeram > > On Thu, Dec 7, 2023 at 2:02 PM Pierre Jolivet &

Re: [petsc-users] Matvecs and KSPSolves with multiple vectors

2023-12-07 Thread Pierre Jolivet
To expand on Barry’s answer, we have observed repeatedly that MatMatMult with MatAIJ performs better than MatMult with MatMAIJ, you can reproduce this on your own with https://petsc.org/release/src/mat/tests/ex237.c.html. Also, I’m guessing you are using some sort of preconditioner within your
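A short petsc4py illustration of the MatAIJ MatMatMult path mentioned above, applying a sparse A to a block of vectors stored as a dense matrix; sizes and values are illustrative, and the example assumes a single process:

import numpy as np
from petsc4py import PETSc

n, k = 10, 3
A = PETSc.Mat().createAIJ((n, n))
A.setUp()
for i in range(n):                       # simple diagonal test matrix
    A.setValue(i, i, float(i + 1))
A.assemble()

X = PETSc.Mat().createDense((n, k), array=np.random.rand(n, k))
X.assemble()

Y = A.matMult(X)                         # MatMatMult: all k columns in one product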

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-07 Thread Pierre Jolivet
ot;metis.h" > > int main() { > #if (IDXTYPEWIDTH != 32) > #error incompatible IDXTYPEWIDTH > #endif; > return 0; > } > > > How could I proceed? I would use --download-metis and then have your code use METIS from PETSc, not the other way around. Thanks

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-03 Thread Pierre Jolivet
should I configure PETSc linking ParMetis to the same library used by > my main code? Yes. Thanks, Pierre > Thanks, > Victoria > > On Thu, 2 Nov 2023 at 09:35, Pierre Jolivet <mailto:pie...@joliv.et>> wrote: >> >>> On 2 Nov 2023, at

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-02 Thread Pierre Jolivet
ion is > causing the crash. > Abort(59) on node 0 (rank 0 in comm 0): application called > MPI_Abort(MPI_COMM_WORLD, 59) - process 0 > > > Thanks, > Victoria > > On Wed, 1 Nov 2023 at 10:33, Pierre Jolivet <mailto:pie...@joliv.et>> wrote: &

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-02 Thread Pierre Jolivet
atible with the given > matrix state. But I could easily be wrong. > > Barry > > >> On Nov 1, 2023, at 1:33 PM, Pierre Jolivet wrote: >> >> Victoria, please keep the list in copy. >> >>> I am not understanding how can I switch to ParMeti

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-01 Thread Pierre Jolivet
igger the confusing warning message from MUMPS? > > Barry > >> On Nov 1, 2023, at 12:17 PM, Pierre Jolivet wrote: >> >> >> >>> On 1 Nov 2023, at 3:33 PM, Zhang, Hong via petsc-users >>> wrote: >>> >>> Victoria,

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-01 Thread Pierre Jolivet
> On 1 Nov 2023, at 3:33 PM, Zhang, Hong via petsc-users > wrote: > > Victoria, > "** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed > Ordering based on METIS" This warning is benign and appears for every run using a sequential partitioner in MUMPS with a

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Pierre Jolivet
Thanks, Pierre > Thanks again, > Thanos > >> On 11 Oct 2023, at 09:04, Pierre Jolivet wrote: >> >> That’s because: >> size = ((None, global_rows), (global_cols, global_cols)) >> should be: >> size = ((None, global_rows), (None, global_cols)) >
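A minimal petsc4py sketch of the corrected sizing quoted above for a tall-and-skinny Phi; the dimensions are illustrative:

from petsc4py import PETSc

global_rows, global_cols = 100, 7
# Only the global sizes are fixed; both local sizes are left to PETSc (None).
size = ((None, global_rows), (None, global_cols))
Phi = PETSc.Mat().createDense(size)
Phi.setUp()
Phi.assemble()
print(Phi.getSizes())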

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Pierre Jolivet
src/mat/interface/matrix.c:9896 > [2] MatProductSetFromOptions() at > /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:541 > [2] MatProductSetFromOptions_Private() at > /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:435 > [2] MatProductSe

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-10 Thread Pierre Jolivet
ws [0, 34) >>> Rank 1: Rows [34, 67) >>> Rank 2: Rows [67, 100) >>> >>> Traceback (most recent call last): >>> File "/Users/boutsitron/work/galerkin_projection.py", line 87, in >>> A_prime = A.ptap(Phi) >>>

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-05 Thread Pierre Jolivet
False) > > # Create an empty PETSc matrix object to store the result of the PtAP > operation. > # This will hold the result A' = Phi.T * A * Phi after the computation. > A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=False) > > # Perform the PtAP (Phi Transpose times A

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-05 Thread Pierre Jolivet
How about using ptap which will use MatPtAP? It will be more efficient (and it will help you bypass the issue). Thanks, Pierre > On 5 Oct 2023, at 1:18 PM, Thanasis Boutsikakis > wrote: > > Sorry, forgot function create_petsc_matrix() > > def create_petsc_matrix(input_array, sparse=True): >
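A small petsc4py sketch of the ptap() suggestion above, computing A' = Phi^T * A * Phi through MatPtAP; the dense matrices and sizes are illustrative, and the example assumes a single process and a PETSc version whose MatPtAP supports these matrix types:

import numpy as np
from petsc4py import PETSc

n, k = 8, 3
A = PETSc.Mat().createDense((n, n), array=np.random.rand(n, n))
A.assemble()
Phi = PETSc.Mat().createDense((n, k), array=np.random.rand(n, k))
Phi.assemble()

A_prime = A.ptap(Phi)        # Phi^T * A * Phi via MatPtAP
print(A_prime.getSize())     # (k, k)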

Re: [petsc-users] Error when configure cmake

2023-09-29 Thread Pierre Jolivet
You are using g++ (GCC) 4.4.7 20120313 (Red Hat 4.4.7-23). You need to use a less ancient C++ compiler. Please send logs to petsc-maint, not petsc-users. Thanks, Pierre > On 30 Sep 2023, at 7:00 AM, Ivan Luthfi wrote: > > I am trying to configure my petsc-3.13.6, but I have an error when

Re: [petsc-users] [petsc-maint] PETSc with Xcode 15

2023-09-21 Thread Pierre Jolivet
This is enough to bypass most of the warnings. There is still the "ld: warning: ignoring duplicate libraries:", but I think these should be filtered (unlike the other warnings which should be fixed). Thanks, Pierre diff --git a/config/BuildSystem/config/setCompilers.py

Re: [petsc-users] PCFIELDSPLIT with MATSBAIJ

2023-09-06 Thread Pierre Jolivet
IS, and RHS Vec, feel free to send them at petsc-ma...@mcs.anl.gov, I can have a look." Yes, this is reproducible on smaller problems. In case I send you Mat, IS, and RHS, which format is preferable? Kind regards, Carl-Johan On Wed, Sep 6, 2023 at 8:21 AM Pierre Jolivet <pierre.joli...@

Re: [petsc-users] PCFIELDSPLIT with MATSBAIJ

2023-09-06 Thread Pierre Jolivet
erence, as long as you tell me the scalar type and whether you are using 32- or 64-bit indices. Thanks, Pierre > Kind regards, > Carl-Johan > > > On Wed, Sep 6, 2023 at 8:21 AM Pierre Jolivet wrote: >> Naïve question, but is your problem really symmetric? >> For

Re: [petsc-users] PCFIELDSPLIT with MATSBAIJ

2023-09-06 Thread Pierre Jolivet
ons? > /Carl-Johan > >> On Mon, Sep 4, 2023 at 9:31 PM Pierre Jolivet wrote: >> The branch should now be good to go >> (https://gitlab.com/petsc/petsc/-/merge_requests/6841). >> Sorry, I made a mistake before, hence the error on PetscObjectQuery(). >> I’m no

Re: [petsc-users] PCFIELDSPLIT with MATSBAIJ

2023-09-04 Thread Pierre Jolivet
processes type: mpiaij rows=15828, cols=15828 total: nonzeros=407340, allocated nonzeros=407340 total number of mallocs used during MatSetValues calls=0 using I-node (on process 0) routines: found 965 nodes, limit used is 5 On 28 Aug 2023, at 12:12 PM, Pierre Jolivet wrote: On 28

Re: [petsc-users] PCFIELDSPLIT with MATSBAIJ

2023-08-28 Thread Pierre Jolivet
t; -dm_mat_type sbaij > > > Kind regards, > Carl-Johan > > > On Sat, Aug 26, 2023 at 5:21 PM Pierre Jolivet via petsc-users > mailto:petsc-users@mcs.anl.gov>> wrote: >> >> >>> On 27 Aug 2023, at 12:14 AM, Carl-Johan Thore >> <mai

Re: [petsc-users] PCFIELDSPLIT with MATSBAIJ

2023-08-26 Thread Pierre Jolivet via petsc-users
/merge_requests/6841. I need to add a new code path in MatCreateRedundantMatrix() to make sure the resulting Mat is indeed SBAIJ, but that is orthogonal to the PCFIELDSPLIT issue. The branch should be usable in its current state. Thanks, Pierre > > From: Pierre Jolivet > Sent: Saturday, August 26,

Re: [petsc-users] PCFIELDSPLIT with MATSBAIJ

2023-08-26 Thread Pierre Jolivet
> On 26 Aug 2023, at 11:16 PM, Carl-Johan Thore wrote: > > "(Sadly) MATSBAIJ is extremely broken, in particular, it cannot be used to > retrieve rectangular blocks in MatCreateSubMatrices, thus you cannot get the > A01 and A10 blocks in PCFIELDSPLIT. > I have a branch that fixes this, but I

Re: [petsc-users] PCFIELDSPLIT with MATSBAIJ

2023-08-26 Thread Pierre Jolivet
(Sadly) MATSBAIJ is extremely broken, in particular, it cannot be used to retrieve rectangular blocks in MatCreateSubMatrices, thus you cannot get the A01 and A10 blocks in PCFIELDSPLIT. I have a branch that fixes this, but I haven’t rebased in a while (and I’m AFK right now), would you want me

Re: [petsc-users] Multiplication of partitioned with non-partitioned (sparse) PETSc matrices

2023-08-23 Thread Pierre Jolivet
rows, cols) > > # Assemble the matrix to compute the final structure > local_A.assemblyBegin() > local_A.assemblyEnd() > > Print(local_A.getType()) > Print(local_A.getSizes()) > > # pdb.set_trace() > > # Multiply the two matrices > local_C = local_A.matMult(B_se

Re: [petsc-users] Multiplication of partitioned with non-partitioned (sparse) PETSc matrices

2023-08-23 Thread Pierre Jolivet
 > On 23 Aug 2023, at 5:35 PM, Thanasis Boutsikakis > wrote: > Hi all, > > I am trying to multiply two Petsc matrices as C = A * B, where A is a tall > matrix and B is a relatively small matrix. > > I have taken the decision to create A as (row-)partitioned matrix and B as a >

Re: [petsc-users] eigenvalue problem involving inverse of a matrix

2023-08-14 Thread Pierre Jolivet
> On 14 Aug 2023, at 10:39 AM, maitri ksh wrote: > > Hi, > I need to solve an eigenvalue problem Ax = lambda*x, where A = (B^-H)*Q*B^-1 is a > Hermitian matrix; 'B^-H' refers to the Hermitian transpose of the inverse of the matrix > B. Theoretically it would take around 1.8TB to explicitly compute
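A worked reformulation (an editorial sketch, not quoted from the reply) that avoids forming B^-1 explicitly by substituting y = B^-1 x:

\[
  B^{-H} Q B^{-1} x = \lambda x
  \;\Longleftrightarrow\;
  B^{-H} Q y = \lambda B y
  \;\Longleftrightarrow\;
  Q y = \lambda\, B^{H} B\, y,
  \qquad x = B y,
\]

so the problem can be treated as a generalized Hermitian eigenproblem with matrices Q and B^H B, neither of which requires inverting B.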

Re: [petsc-users] performance regression with GAMG

2023-08-10 Thread Pierre Jolivet
> On 11 Aug 2023, at 1:14 AM, Mark Adams wrote: > > BTW, nice bug report ... >> >> So in the first step it coarsens from 150e6 to 5.4e6 DOFs instead of to >> 2.6e6 DOFs. > > Yes, this is the critical place to see what is different and going wrong. > > My 3D tests were not that different

Re: [petsc-users] compiler related error (configuring Petsc)

2023-08-01 Thread Pierre Jolivet
Right, so configure did the proper job and told you that your compiler does not (fully) work with C++11, there is no point in trying to add extra flags to bypass this limitation. As Satish suggested: either use a newer g++ or configure --with-cxx=0 Thanks, Pierre > On 2 Aug 2023, at 6:42 AM,

Re: [petsc-users] MUMPS Error 'INFOG(1)=-3 INFO(2)=3' (SPARSE MATRIX INVERSE)

2023-07-27 Thread Pierre Jolivet
MUMPS errors are documented in section 8 of https://mumps-solver.org/doc/userguide_5.6.1.pdf Thanks, Pierre > On 27 Jul 2023, at 3:50 PM, maitri ksh wrote: > > I am using 'MatMumpsGetInverse()' to get the inverse of a sparse matrix. I am > using parts of ex214.c >

Re: [petsc-users] MPICH C++ compilers when using PETSC --with-cxx=0

2023-07-21 Thread Pierre Jolivet
> On 21 Jul 2023, at 5:11 PM, Robert Crockett via petsc-users > wrote: > > Hello, > I built PETSc with –with-cxx=0 in order to get around a likely Intel C++ > compiler bug. > However, the MPICH that also gets built by PETSc then picks up the wrong C++ > compiler; mpicxx -show indicates that

Re: [petsc-users] [EXTERNAL] PETSc Installation Assistance

2023-07-17 Thread Pierre Jolivet
--- > mpiexec detected that one or more processes exited with non-zero status, thus > causing > the job to be terminated. The first process to do so was: > > Process name: [[33478,1],2] > Exit code:15 > ==

Re: [petsc-users] PETSc Installation Assistance

2023-07-17 Thread Pierre Jolivet
https://petsc.org/release/faq/#what-does-the-message-hwloc-linux-ignoring-pci-device-with-non-16bit-domain-mean Thanks, Pierre > On 17 Jul 2023, at 7:51 PM, Ferrand, Jesus A. wrote: > > Greetings. > > I recently changed operating systems (Ubuntu 20.04 -> Debian 12 "Bookworm") > and tried to

Re: [petsc-users] Near null space for a fieldsplit in petsc4py

2023-07-13 Thread Pierre Jolivet
Dear Nicolas, > On 13 Jul 2023, at 10:17 AM, TARDIEU Nicolas wrote: > > Dear Pierre, > > You are absolutely right. I was using a --with-debugging=0 (aka release) > install and this is definitely an error. > Once I used my debug install, I found the way to fix my problem. The solution > is in

Re: [petsc-users] Near null space for a fieldsplit in petsc4py

2023-07-12 Thread Pierre Jolivet
> On 12 Jul 2023, at 6:04 PM, TARDIEU Nicolas via petsc-users > wrote: > > Dear PETSc team, > > In the attached example, I set up a block pc for a saddle-point problem in > petsc4py. The IS define the unknowns, namely some physical quantity (phys) > and a Lagrange multiplier (lags). > I

Re: [petsc-users] Scalable Solver for Incompressible Flow

2023-06-23 Thread Pierre Jolivet
> On 23 Jun 2023, at 10:06 PM, Pierre Jolivet wrote: > > >> On 23 Jun 2023, at 9:39 PM, Alexander Lindsay >> wrote: >> >> Ah, I see that if I use Pierre's new 'full' option for >> -mat_schur_complement_ainv_type > > That was not initially done

Re: [petsc-users] Scalable Solver for Incompressible Flow

2023-06-23 Thread Pierre Jolivet
> On 23 Jun 2023, at 9:39 PM, Alexander Lindsay > wrote: > > Ah, I see that if I use Pierre's new 'full' option for > -mat_schur_complement_ainv_type That was not initially done by me (though I recently tweaked MatSchurComplementComputeExplicitOperator() a bit to use KSPMatSolve(), so that

Re: [petsc-users] parallel computing error

2023-05-05 Thread Pierre Jolivet
> On 5 May 2023, at 2:00 PM, 권승리 (student, Aerospace Engineering) wrote: > > Dear Pierre Jolivet > > Thank you for your explanation. > > I will try to use a converting matrix. > > I know it's really inefficient, but I need an inverse matrix (inv(A)) itself > for my resear

Re: [petsc-users] parallel computing error

2023-05-05 Thread Pierre Jolivet
> On 5 May 2023, at 1:25 PM, 권승리 (student, Aerospace Engineering) wrote: > > Dear Matthew Knepley > > However, I've already installed ScaLAPACK. > cd $PETSC_DIR > ./configure --download-mpich --with-debugging=0 COPTFLAGS='-O3 -march=native > -mtune=native' CXXOPTFLAGS='-O3 -march=native -mtune=native'

Re: [petsc-users] 'mpirun' run not found error

2023-05-02 Thread Pierre Jolivet
> On 2 May 2023, at 8:56 AM, 권승리 (student, Aerospace Engineering) wrote: > > Dear developers > > I'm trying to use MPI, but I'm encountering error messages like below: > > > Command 'mpirun' not found, but can be installed with: > sudo apt install lam-runtime # version 7.1.4-6build2, or >

Re: [petsc-users] Question about linking LAPACK library

2023-04-25 Thread Pierre Jolivet
> On 25 Apr 2023, at 11:43 AM, Matthew Knepley wrote: > > On Mon, Apr 24, 2023 at 11:47 PM 권승리 (student, Aerospace Engineering) > wrote: >> Dear all >> >> It depends on the problem. It can have hundreds of thousands of degrees of >> freedom. > > Suppose your matrix was dense and
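An editorial back-of-the-envelope to complete that line of reasoning (the 300,000-DOF figure is illustrative, not from the thread):

\[
  N = 3\times 10^{5}:\qquad N^{2}\times 8~\text{bytes} = 7.2\times 10^{11}~\text{bytes} \approx 720~\text{GB},
\]

which is why a problem of that size cannot reasonably be stored or factored as a dense matrix.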

Re: [petsc-users] question about MatSetLocalToGlobalMapping

2023-04-20 Thread Pierre Jolivet
> On 20 Apr 2023, at 10:28 PM, Zhang, Hong wrote: > > Pierre, > 1) Is there any hope to get PDIPDM to use a MatNest? > > KKT matrix is indefinite and ill-conditioned, which must be solved using a > direct matrix factorization method. But you are using PCBJACOBI in the paper you attached? In

Re: [petsc-users] question about MatSetLocalToGlobalMapping

2023-04-20 Thread Pierre Jolivet
Hong, 1) Is there any hope to get PDIPDM to use a MatNest? 2) Is this fixed https://lists.mcs.anl.gov/pipermail/petsc-dev/2020-September/026398.html ? I cannot get users to transition away from Ipopt because of these two missing features. Thanks, Pierre > On 20 Apr 2023, at 5:47 PM, Zhang,

Re: [petsc-users] CG fails to converge in parallel

2023-04-20 Thread Pierre Jolivet
> On 20 Apr 2023, at 7:53 AM, Bojan Niceno > wrote: > > Dear all, > > > I am solving a Laplace equation with finite volume method on an unstructured > grid with a Fortran code I have developed, and PETSc 3.19 library. > > I first used cg solver with asm preconditioner, which converges

Re: [petsc-users] PCHPDDM and matrix type

2023-04-17 Thread Pierre Jolivet
1) PCHPDDM handles AIJ, BAIJ, SBAIJ, IS, NORMAL, NORMALHERMITIAN, SCHURCOMPLEMENT, HTOOL 2) This PC is based on domain decomposition, with no support yet for “over decomposition”. If you run with a single process, it’s like PCASM or PCBJACOBI, you’ll get the same behavior as if you were just
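A hedged petsc4py sketch of attaching PCHPDDM to a KSP; it assumes PETSc configured with --download-hpddm, and the small AIJ matrix is illustrative. On a single process it behaves like PCASM/PCBJACOBI, as explained above:

from petsc4py import PETSc

n = 16
A = PETSc.Mat().createAIJ((n, n))
A.setUp()
for i in range(n):                       # simple tridiagonal test matrix
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = A.createVecLeft()
b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.getPC().setType("hpddm")             # PCHPDDM, handles AIJ among the types listed above
ksp.setFromOptions()
ksp.solve(b, x)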

Re: [petsc-users] PETSc error only in debug build

2023-04-17 Thread Pierre Jolivet
> On 17 Apr 2023, at 6:22 PM, Matteo Semplice > wrote: > > Dear PETSc users, > > I am investigating a strange error occurring when using my code on a > cluster; I managed to reproduce it on my machine as well and it's weird: > > - on petsc3.19, optimized build, the code runs fine,

Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-16 Thread Pierre Jolivet
creating a PR. Thanks, Pierre > (Independently I’m excited to test HPDDM out tomorrow) > >> On Apr 13, 2023, at 10:29 PM, Pierre Jolivet wrote: >> >>  >>> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay >>> wrote: >>> >>> Pierre, >>

Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-13 Thread Pierre Jolivet
y require some extra work (need a couple of extra options in PETSc ./configure), and this is not very related to the problem at hand, so best not to spam the mailing list. Thanks, Pierre >> On Apr 13, 2023, at 9:54 PM, Pierre Jolivet wrote: >> >>  >> >>> On 1

Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-13 Thread Pierre Jolivet
> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay > wrote: > > Hi, I'm trying to solve steady Navier-Stokes for different Reynolds numbers. > My options table > > -dm_moose_fieldsplit_names u,p > -dm_moose_nfieldsplits 2 > -fieldsplit_p_dm_moose_vars pressure > -fieldsplit_p_ksp_type preonly

Re: [petsc-users] PETSc build asks for network connections

2023-03-20 Thread Pierre Jolivet
> On 20 Mar 2023, at 2:45 AM, Barry Smith wrote: > > > I found a bit more information in gmakefile.test which has the magic sauce > used by make test to stop the firewall popups while running the test suite. > > # MACOS FIREWALL HANDLING > # - if run with MACOS_FIREWALL=1 > #

Re: [petsc-users] Random Error of mumps: out of memory: INFOG(1)=-9

2023-03-04 Thread Pierre Jolivet
> On 4 Mar 2023, at 3:26 PM, Zongze Yang wrote: > >  > > >> On Sat, 4 Mar 2023 at 22:03, Pierre Jolivet wrote: >> >> >>>> On 4 Mar 2023, at 2:51 PM, Zongze Yang wrote: >>>> >>>> >>>> >>>> On

Re: [petsc-users] Random Error of mumps: out of memory: INFOG(1)=-9

2023-03-04 Thread Pierre Jolivet
> On 4 Mar 2023, at 2:51 PM, Zongze Yang wrote: > > > > On Sat, 4 Mar 2023 at 21:37, Pierre Jolivet <mailto:pie...@joliv.et>> wrote: >> >> >> > On 4 Mar 2023, at 2:30 PM, Zongze Yang > > <mailto:yangzon...@gmail.com>> wrote:

Re: [petsc-users] Random Error of mumps: out of memory: INFOG(1)=-9

2023-03-04 Thread Pierre Jolivet
> On 4 Mar 2023, at 2:30 PM, Zongze Yang wrote: > > Hi, > > I am writing to seek your advice regarding a problem I encountered while > using multigrid to solve a certain issue. > I am currently using multigrid with the coarse problem solved by PCLU. > However, the PC failed randomly with

Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Pierre Jolivet
> On 27 Feb 2023, at 4:42 PM, Matthew Knepley wrote: > > On Mon, Feb 27, 2023 at 10:26 AM Pierre Jolivet <mailto:pie...@joliv.et>> wrote: >>> On 27 Feb 2023, at 4:16 PM, Matthew Knepley >> <mailto:knep...@gmail.com>> wrote: >>>

Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Pierre Jolivet
> On 27 Feb 2023, at 4:16 PM, Matthew Knepley wrote: > > On Mon, Feb 27, 2023 at 10:13 AM Pierre Jolivet <mailto:pie...@joliv.et>> wrote: >>> On 27 Feb 2023, at 3:59 PM, Matthew Knepley >> <mailto:knep...@gmail.com>> wrote: >>&g

Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Pierre Jolivet
> On 27 Feb 2023, at 3:59 PM, Matthew Knepley wrote: > > On Mon, Feb 27, 2023 at 9:53 AM Zongze Yang > wrote: >> Hi, Matt, >> >> I tested coarsening a mesh by using ParMMg without firedrake, and found some >> issues: >> see the code and results here:

Re: [petsc-users] DMPlex Halo Communication or Graph Partitioner Issue

2023-02-26 Thread Pierre Jolivet
> On 26 Feb 2023, at 8:07 PM, Mike Michell wrote: > > I cannot agree with this argument, unless you also tested with petsc 3.18.4 > tarball from https://petsc.org/release/install/download/. > If the library has an issue, I will trivially see an error from my code. > > I ran my code with

Re: [petsc-users] petsc compiled without MPI

2023-02-25 Thread Pierre Jolivet
> On 25 Feb 2023, at 11:44 PM, Long, Jianbo wrote: > > Hello, > > For some of my applications, I need to use petsc without mpi, or use it > sequentially. I wonder where I can find examples/tutorials for this ? You can run sequentially with just a single MPI process (-n 1). If you need to
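A tiny petsc4py illustration of purely sequential usage, creating objects on COMM_SELF so the same code runs under a single MPI process; the vector size is illustrative:

from petsc4py import PETSc

comm = PETSc.COMM_SELF
v = PETSc.Vec().createSeq(5, comm=comm)   # sequential vector, no parallel layout
v.set(1.0)
print(v.norm())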

Re: [petsc-users] HYPRE requires C++ compiler

2023-02-20 Thread Pierre Jolivet
> On 21 Feb 2023, at 4:07 AM, Park, Heeho via petsc-users > wrote: > >  > Hi PETSc developers, > > I’m using the same configure script on my system to compile petsc-main branch > as petsc-v3.17.2, but now I am receiving this message. > I’ve tried it several different ways but HYPRE

Re: [petsc-users] Question about rank of matrix

2023-02-17 Thread Pierre Jolivet
> On 17 Feb 2023, at 8:56 AM, Stefano Zampini wrote: > > > On Fri, Feb 17, 2023, 10:43 user_gong Kim > wrote: >> Hello, >> >> I have a question about rank of matrix. >> At the problem >> Au = b, >> >> In my case, sometimes global matrix A is not full rank. >> In

Re: [petsc-users] Question about preconditioner

2023-02-16 Thread Pierre Jolivet
> On 16 Feb 2023, at 8:43 AM, user_gong Kim wrote: > >  > > > Hello, > > > > There are some questions about some preconditioners. > > The questions are from problem Au=b. The global matrix A has zero value > diagonal terms. > > 1. Which preconditioner is preferred for matrix A

Re: [petsc-users] MatMatMul inefficient

2023-02-15 Thread Pierre Jolivet
ing the matrices I > used. > I have trouble understanding what is different in my case from the one you > referenced me to. > > Thank you so much, > Margherita > >> On 13 Feb 2023, at 3:51 PM, Pierre Jolivet >> wrote: >> >&

Re: [petsc-users] MatMatMul inefficient

2023-02-13 Thread Pierre Jolivet
Could you please share a reproducer? What you are seeing is not typical of the performance of such a kernel, both from a theoretical or a practical (see fig. 2 of https://joliv.et/article.pdf) point of view. Thanks, Pierre > On 13 Feb 2023, at 3:38 PM, Guido Margherita via petsc-users >

Re: [petsc-users] MatConvert changes distribution of local rows

2023-01-13 Thread Pierre Jolivet
one? Thanks, Pierre > > Sent: Friday, 13 January 2023 at 16:58 > From: "Pierre Jolivet" > To: "Marius Buerkle" > Cc: petsc-users@mcs.anl.gov > Subject: Re: [petsc-users] MatConvert changes distribution of local rows > > On 13 Jan 2023, a

Re: [petsc-users] MatConvert changes distribution of local rows

2023-01-12 Thread Pierre Jolivet
> On 13 Jan 2023, at 8:49 AM, Marius Buerkle wrote: > > Hi, > > I have a matrix A for which I defined the number of local rows per process > manually using MatSetSizes. When I use MatConvert to change the matrix type > it changes the number of local rows (to what one would get if MatSetSize

Re: [petsc-users] Error running configure on HDF5 in PETSc-3.18.3

2023-01-06 Thread Pierre Jolivet
> On 6 Jan 2023, at 4:49 PM, Danyang Su wrote: > > Hi All, > > I get ‘Error running configure on HDF5’ in PETSc-3.18.3 on MacOS, but no > problem on Ubuntu. Attached is the configuration log file. > > ./configure --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mumps >

Re: [petsc-users] error when trying to compile with HPDDM

2023-01-05 Thread Pierre Jolivet
> On 5 Jan 2023, at 7:06 PM, Matthew Knepley wrote: > > On Thu, Jan 5, 2023 at 11:36 AM Alfredo Jaramillo > wrote: >> Dear developers, >> I'm trying to compile petsc together with the HPDDM library. A series of >> errors appeared: >> >>

Re: [petsc-users] Installation With MSYS2 and MinGW Compilers

2022-12-06 Thread Pierre Jolivet
d of mpiexe Thanks, Pierre > Then I tried command > > mpif90.exe -n 2 ./ex5f90.exe > > I received many errors. > > Your helps for fixing these issue would be appreciated. > > Thanks, > Rajesh > > > > > > From: Pierre Jolivet >

Re: [petsc-users] Installation With MSYS2 and MinGW Compilers

2022-12-06 Thread Pierre Jolivet
iles. I am newb in this. >> Please let me know if you need further information. I will be glad to >> provide. >> >> Regards, >> Rajesh >> >> -Original Message- >> From: Satish Balay >> Sent: Monday, December 5, 2022 6:53 PM >> To

  1   2   >