Re: [petsc-users] Matrix-free generalised eigenvalue problem

2023-07-17 Thread Quentin Chevalier
Thank you for that pointer! I have on hand a partial SVD of R, so I used that to build the approximate matrix instead. It's great that so many nice features of PETSc like STSetPreconditionerMat are accessible through petsc4py! Good day, Quentin On Mon, 17 Jul 2023 at 17:29, Jose E. Roman

Re: [petsc-users] Bug Report TaoALMM class

2023-07-17 Thread Matthew Knepley
Toby and Hansol, Has anyone looked at this? Thanks, Matt On Mon, Jun 12, 2023 at 8:24 AM Stephan Köhler < stephan.koeh...@math.tu-freiberg.de> wrote: > Dear PETSc/Tao team, > > I think there might be a bug in the Tao ALMM class: In the function >

Re: [petsc-users] Inquiry about reading the P2 tetrahedron mesh from GMSH

2023-07-17 Thread Matthew Knepley
On Fri, Jun 30, 2023 at 4:40 PM neil liu wrote: > Dear PETSc developers, > > I am reading a P2 mesh from GMSH and used DMFieldGetClosure_Internal to > check the coordinates for each tetrahedron. It seems reasonable. > But when I tried DMGetCoordinates (dm, ), it seems the vector > global is not

Re: [petsc-users] [EXTERNAL] PETSc Installation Assistance

2023-07-17 Thread Matthew Knepley
Hi Jesus, I think you are on main. Did everything you have get rebuilt? Toby just rewrote large sections of logging and this is right where it fails for you. It should be easy to see what is wrong by running in the debugger. Thanks, Matt On Mon, Jul 17, 2023 at 3:11 PM Pierre Jolivet

Re: [petsc-users] Parallel matrix multiplication

2023-07-17 Thread Matthew Knepley
On Mon, Jul 17, 2023 at 2:22 PM Barry Smith wrote: > > >https://petsc.org/release/manualpages/Mat/MatPtAP/ also note that > PETSc has a large infrastructure for efficient ways to compute various > matrix-matrix operations with a variety of algorithms that can be all > accessed by starting

Re: [petsc-users] Structured (DMDA) vs Unstructured (DMPlex) meshes

2023-07-17 Thread Matthew Knepley
On Mon, Jul 17, 2023 at 12:48 PM Barry Smith wrote: > >The largest potential advantage of DMDA is likely the possibility of > easily using geometric multigrid if it is appropriate for the problem (or > subproblem of the problem) you are solving. The second advantage is, this > depends on

Re: [petsc-users] [EXTERNAL] PETSc Installation Assistance

2023-07-17 Thread Pierre Jolivet
> On 17 Jul 2023, at 9:00 PM, Ferrand, Jesus A. wrote: > > Pierre: > Setting the environment variable allows make check to complete without errors. > It only seems to run three checks. > I recall that in the past, make check would run 100+ tests. You are probably thinking of make test, which

Re: [petsc-users] PETSc Installation Assistance

2023-07-17 Thread Satish Balay via petsc-users
On Mon, 17 Jul 2023, Pierre Jolivet wrote: > https://petsc.org/release/faq/#what-does-the-message-hwloc-linux-ignoring-pci-device-with-non-16bit-domain-mean > > On 17 Jul 2023, at 7:51 PM, Ferrand, Jesus A. wrote: > > hwloc/linux: Ignoring PCI device with non-16bit domain. > > Pass

Re: [petsc-users] petscconf.h missing building cpp file with dolfinx?

2023-07-17 Thread Satish Balay via petsc-users
PETSc supports both --prefix and in-place installs. Suggest using an in-place install - in your $HOME. For ex: >>> balay@p1 /home/balay $ tar -xzf petsc-3.19.3.tar.gz balay@p1 /home/balay $ cd petsc-3.19.3

Re: [petsc-users] petscconf.h missing building cpp file with dolfinx?

2023-07-17 Thread Barry Smith
When configuring and making PETSc, PETSC_DIR can be empty or point to the directory with the PETSc source. If you used --prefix to configure and install PETSc, then when using PETSc to compile other source code with a makefile that utilizes PETSC_DIR, PETSC_DIR needs to
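Barry's rule of thumb can be written down as two alternative environment setups (the paths below are placeholders, not from the thread):

```shell
# In-place build (no --prefix): PETSC_DIR points at the source tree,
# and PETSC_ARCH selects the build directory inside it.
export PETSC_DIR=$HOME/petsc
export PETSC_ARCH=linux-gnu-c-debug

# --prefix install: PETSC_DIR points at the install prefix instead,
# and PETSC_ARCH is left empty (uncomment to use this variant).
# export PETSC_DIR=/opt/petsc
# export PETSC_ARCH=
```

Makefiles that include `$PETSC_DIR/lib/petsc/conf/variables` pick these up; the two variants must not be mixed.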

Re: [petsc-users] Parallel matrix multiplication

2023-07-17 Thread Barry Smith
https://petsc.org/release/manualpages/Mat/MatPtAP/ also note that PETSc has a large infrastructure for efficient ways to compute various matrix-matrix operations with a variety of algorithms that can be all accessed by starting with

Re: [petsc-users] petscconf.h missing building cpp file with dolfinx?

2023-07-17 Thread philliprusso via petsc-users
So I have this to go by: export PETSC_DIR=/absolute/path/to/petsc export PETSC_ARCH=linux-gnu-c-debug. Is PETSC_DIR the installation/destination path or the path where the source code is? I feel confused. So there is a destination folder that the dolfinx project wants for petsc. I have an

Re: [petsc-users] PETSc Installation Assistance

2023-07-17 Thread Pierre Jolivet
https://petsc.org/release/faq/#what-does-the-message-hwloc-linux-ignoring-pci-device-with-non-16bit-domain-mean Thanks, Pierre > On 17 Jul 2023, at 7:51 PM, Ferrand, Jesus A. wrote: > > Greetings. > > I recently changed operating systems (Ubuntu 20.04 -> Debian 12 "Bookworm") > and tried to

[petsc-users] PETSc Installation Assistance

2023-07-17 Thread Ferrand, Jesus A.
Greetings. I recently changed operating systems (Ubuntu 20.04 -> Debian 12 "Bookworm") and tried to reinstall PETSc. I tried doing the usual as described in (https://petsc.org/release/install/download/#recommended-obtain-release-version-with-git): * git clone/pull * ./configure -- ...

[petsc-users] Parallel matrix multiplication

2023-07-17 Thread Karthikeyan Chockalingam - STFC UKRI via petsc-users
Hello, I would like to perform the following operation [P^T A P] x' = P^T f where P is a rectangular matrix and A is a square matrix. All the matrices are constructed using MPIAIJ. Should I be concerned about the parallel partitioning of the matrices P and A? Or can I just go ahead and use
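The reduced system in the question (which MatPtAP computes for the operator part) can be sketched with plain Python lists, just to make the shapes concrete; the small matrices below are made up for illustration, not from the thread:

```python
# Stdlib-only sketch of the reduced (Galerkin) system [P^T A P] x' = P^T f.
# A is square (n x n), P is rectangular (n x m); the triple product
# P^T A P is m x m. In PETSc this product is formed with MatPtAP.

def matmul(X, Y):
    # Dense matrix product of nested lists.
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

A = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]          # square operator, 3 x 3
P = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]               # rectangular, 3 x 2
f = [[1.0], [2.0], [3.0]]      # right-hand side, 3 x 1

Pt = transpose(P)
PtAP = matmul(Pt, matmul(A, P))   # reduced operator, 2 x 2
Ptf = matmul(Pt, f)               # reduced right-hand side, 2 x 1

print(PtAP)   # [[6.0, 4.0], [4.0, 7.0]]
print(Ptf)    # [[4.0], [5.0]]
```

In actual PETSc usage the partitioning concern in the question is handled for you: MatPtAP takes the distributed MPIAIJ matrices directly and produces a consistently partitioned result.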

Re: [petsc-users] Structured (DMDA) vs Unstructured (DMPlex) meshes

2023-07-17 Thread Barry Smith
The largest potential advantage of DMDA is likely the possibility of easily using geometric multigrid if it is appropriate for the problem (or subproblem of the problem) you are solving. The second advantage is, this depends on your PDE and discretization, the simplicity of your code, and

[petsc-users] Structured (DMDA) vs Unstructured (DMPlex) meshes

2023-07-17 Thread Miguel Angel Salazar de Troya
Hello, I am trying to understand if I should make the effort to make my code use structured meshes instead of unstructured ones. My domain is Cartesian, so that is the first check for structured meshes. However, the problem size I am looking at is ~20 million degrees of freedom. My understanding

Re: [petsc-users] Matrix-free generalised eigenvalue problem

2023-07-17 Thread Jose E. Roman
It is possible to pass a different matrix to build the preconditioner. That is, the shell matrix for B (EPSSetOperators) and an explicit matrix (that approximates B) for the preconditioner. For instance, you can try passing M for building the preconditioner. Since M is an explicit matrix, you

Re: [petsc-users] petscconf.h missing building cpp file with dolfinx?

2023-07-17 Thread Satish Balay via petsc-users
We do not recommend installing PETSc in /usr/include [aka --prefix=/usr]. Now you have some petsc includes there that will potentially conflict with any other install of PETSc you might attempt. You might have to manually check for - and delete - the PETSc-installed files from there. For most uses - one

Re: [petsc-users] petscconf.h missing building cpp file with dolfinx?

2023-07-17 Thread Matthew Knepley
On Mon, Jul 17, 2023 at 10:55 AM philliprusso via petsc-users < petsc-users@mcs.anl.gov> wrote: > I cloned petsc source for use with the dolfinx project. So after ./configure, > make, sudo make install I found there was some type of difficulty with the > destination directory, so I copied the files

[petsc-users] petscconf.h missing building cpp file with dolfinx?

2023-07-17 Thread philliprusso via petsc-users
I cloned petsc source for use with the dolfinx project. So after ./configure, make, sudo make install I found there was some type of difficulty with the destination directory, so I copied the files manually into /usr/include on Ubuntu 22.04 jammy. So some petsc header files are now found for compiling

Re: [petsc-users] Using PETSc GPU backend

2023-07-17 Thread Barry Smith
The examples that use DM, in particular DMDA, all trivially support using the GPU with -dm_mat_type aijcusparse -dm_vec_type cuda > On Jul 17, 2023, at 1:45 AM, Ng, Cho-Kuen wrote: > > Barry, > > Thank you so much for the clarification. > > I see that ex104.c and ex300.c use

Re: [petsc-users] Matrix-free generalised eigenvalue problem

2023-07-17 Thread Quentin Chevalier
Thank you for this suggestion. I tried to implement that, but it's proven pretty hard to implement MATOP_GET_DIAGONAL without completely tanking performance. After all, B is a shell matrix for a reason: it looks like M+R^H P M P R with R itself a shell matrix. Allow me to point out that I have no

Re: [petsc-users] windows build

2023-07-17 Thread Matthew Knepley
On Mon, Jul 17, 2023 at 6:08 AM Константин via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hello. I have such a problem while I'm making petsc: > $ make > C:/cygwin64/home/itugr/asd/petsc//lib/petsc/conf/petscvariables:140: *** > Too many open files. Stop. > Is there any advice on how to fix

Re: [petsc-users] windows build

2023-07-17 Thread hamid badi
For me, I had no problems compiling PETSc with MUMPS under mingw/clang or gcc; everything works fine. Maybe I could write a tutorial when I have time; right now I'm on holiday. On Mon, 17 Jul 2023 at 12:08, Константин via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hello. I have

Re: [petsc-users] windows build

2023-07-17 Thread Константин via petsc-users
Hello. I have such a problem while I'm making petsc: $ make C:/cygwin64/home/itugr/asd/petsc//lib/petsc/conf/petscvariables:140: *** Too many open files. Stop. Is there any advice on how to fix it? -- Константин > Tuesday, 11 July 2023, 23:09 +03:00 from Satish Balay: > > On Tue, 11 Jul

Re: [petsc-users] Matrix-free generalised eigenvalue problem

2023-07-17 Thread Jose E. Roman
The B-inner product is independent of the ST operator. See Table 3.2. In generalized eigenproblems you always have an inverse. If your matrix is diagonally dominant, try implementing the MATOP_GET_DIAGONAL operation and using PCJACOBI. Apart from this, you have to build your own
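Jose's suggestion boils down to this: if the shell matrix can report its diagonal (MATOP_GET_DIAGONAL), PCJACOBI can precondition with that diagonal's inverse. A stdlib-Python sketch of the idea follows; the function names and the tiny example operator are illustrative, not the petsc4py API:

```python
# Sketch of what PCJACOBI does with the diagonal supplied by
# MATOP_GET_DIAGONAL: applying the preconditioner is an element-wise
# division by the matrix diagonal, z = D^{-1} r.

def get_diagonal(n, matvec):
    """Recover diag(B) of an n x n matrix-free operator by applying it
    to unit vectors. This costs n matvecs - a real shell matrix would
    implement MATOP_GET_DIAGONAL directly and far more cheaply."""
    diag = []
    for i in range(n):
        e = [0.0] * n
        e[i] = 1.0
        diag.append(matvec(e)[i])
    return diag

def jacobi_apply(diag, r):
    """PCJACOBI action on a residual vector r."""
    return [ri / di for ri, di in zip(r, diag)]

# Example matrix-free operator: y = B x with B = diag(2, 4, 5),
# never stored explicitly.
def matvec(x):
    return [2.0 * x[0], 4.0 * x[1], 5.0 * x[2]]

d = get_diagonal(3, matvec)
z = jacobi_apply(d, [2.0, 8.0, 10.0])
print(d, z)   # [2.0, 4.0, 5.0] [1.0, 2.0, 2.0]
```

This is why the advice is conditioned on diagonal dominance: Jacobi is only an effective preconditioner when the diagonal captures most of the operator.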

Re: [petsc-users] Matrix-free generalised eigenvalue problem

2023-07-17 Thread Quentin Chevalier
Hello Jose, I guess I expected B not to be inverted but instead used as a mass matrix for a problem-specific inner product, since I specified GHEP as the problem type. p50 of the same user manual seems to imply that that would indeed be the case. I don't see