Re: [petsc-users] Using BDDC preconditioner for assembled matrices

2018-10-23 Thread Abdullah Ali Sivas
Hi Jed, Thanks for your reply. The assembled matrix I have corresponds to the full problem on the full mesh. There are no "Neumann" problems (or any sort of domain decomposition) defined in the code that generates the matrix. However, I think assembling the full problem is equivalent to implicitly

Re: [petsc-users] Using BDDC preconditioner for assembled matrices

2018-10-23 Thread Jed Brown
Did you assemble "Neumann" problems that are compatible with your definition of interior/interface degrees of freedom? Abdullah Ali Sivas writes: > Dear all, > > I have a series of linear systems coming from a PDE for which BDDC is an > optimal preconditioner. These linear systems are assembled

[petsc-users] Using BDDC preconditioner for assembled matrices

2018-10-23 Thread Abdullah Ali Sivas
Dear all, I have a series of linear systems coming from a PDE for which BDDC is an optimal preconditioner. These linear systems are assembled and I read them from a file, then convert them into MATIS as required (as in
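
A minimal sketch of the workflow described in this thread, with error checking omitted: load an assembled matrix from a binary file (the name "system.dat" is purely illustrative), convert it to MATIS via MatConvert, and solve with PCBDDC, which requires a MATIS operator:

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      Mat         A, Ais;
      Vec         b, x;
      KSP         ksp;
      PC          pc;
      PetscViewer viewer;

      PetscInitialize(&argc, &argv, NULL, NULL);

      /* Load the assembled matrix from a binary file (illustrative name) */
      PetscViewerBinaryOpen(PETSC_COMM_WORLD, "system.dat", FILE_MODE_READ, &viewer);
      MatCreate(PETSC_COMM_WORLD, &A);
      MatSetType(A, MATAIJ);
      MatLoad(A, viewer);
      PetscViewerDestroy(&viewer);

      /* PCBDDC needs a MATIS operator, so convert the assembled matrix */
      MatConvert(A, MATIS, MAT_INITIAL_MATRIX, &Ais);

      MatCreateVecs(Ais, &x, &b);
      VecSet(b, 1.0);  /* placeholder right-hand side */

      KSPCreate(PETSC_COMM_WORLD, &ksp);
      KSPSetOperators(ksp, Ais, Ais);
      KSPGetPC(ksp, &pc);
      PCSetType(pc, PCBDDC);
      KSPSetFromOptions(ksp);
      KSPSolve(ksp, b, x);

      KSPDestroy(&ksp);
      MatDestroy(&A); MatDestroy(&Ais);
      VecDestroy(&b); VecDestroy(&x);
      PetscFinalize();
      return 0;
    }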

[petsc-users] Assistant Professorship in Computational Earth-Surface Process Modeling at CU Boulder

2018-10-23 Thread Jed Brown
The Institute of Arctic and Alpine Research (INSTAAR) and Community Surface Dynamics Modeling System (CSDMS) at the University of Colorado invite applications for a tenure-track assistant professor position in Computational Earth-Surface Process Modeling, with an August 2019 start. CSDMS is an

Re: [petsc-users] Problems about Compiling Multifile Program

2018-10-23 Thread Matthew Overholt
Correction: OBJFILES = \ class1.o \ class2.o \ myprogram.o Matt Overholt On Tue, Oct 23, 2018 at 2:29 PM Matthew Overholt wrote: > Here is a sample makefile, like what I use with the Intel MPI compilers > (used during PETSc configuration) and MKL library. > > Matt Overholt > >

Re: [petsc-users] Problems about Compiling Multifile Program

2018-10-23 Thread Matthew Overholt
Here is a sample makefile, like what I use with the Intel MPI compilers (used during PETSc configuration) and MKL library. Matt Overholt - # # makefile for Linux using the Intel C++ Compiler, MKL & MPI libraries + OpenMP # usage: make # or: make
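
For the multi-file layout discussed in this thread, a minimal alternative sketch is to lean on PETSc's own makefile machinery instead of hard-coded compiler and library paths; this assumes PETSC_DIR and PETSC_ARCH are set in the environment, and lets conf/rules supply the implicit compile rules (recipe lines must start with a tab):

    # minimal sketch of a PETSc-style makefile
    OBJFILES = class1.o class2.o myprogram.o

    include ${PETSC_DIR}/lib/petsc/conf/variables
    include ${PETSC_DIR}/lib/petsc/conf/rules

    myprogram: ${OBJFILES}
        ${CLINKER} -o myprogram ${OBJFILES} ${PETSC_LIB}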

Re: [petsc-users] Problems about Compiling Multifile Program

2018-10-23 Thread Matthew Knepley
On Tue, Oct 23, 2018 at 11:37 AM Yingjie Wu wrote: > Dear PETSc developers: > Hi, > Thank you very much for your continuous help. I recently encountered some > difficulties in developing programs on PETSc. > > 1. I want to use my class definition (class1.h) and class functions > (class1.cpp)

[petsc-users] Problems about Compiling Multifile Program

2018-10-23 Thread Yingjie Wu
Dear PETSc developers: Hi, Thank you very much for your continuous help. I recently encountered some difficulties in developing programs on PETSc. 1. I want to use my class definition (class1.h) and class functions (class1.cpp) files in my PETSc program (myprogram.c) and compile my program. I

Re: [petsc-users] Slepc: Nonlinear eigenvalue problem

2018-10-23 Thread Matthew Knepley
On Tue, Oct 23, 2018 at 10:53 AM Manav Bhatia wrote: > Really interesting! > > So this is a limitation of the algorithm and not the implementation. > > The challenge is that the eigenvalue solution in my workflow is a small > component of a large computation done with real numbers in an

Re: [petsc-users] Slepc: Nonlinear eigenvalue problem

2018-10-23 Thread Manav Bhatia
Really interesting! So this is a limitation of the algorithm and not the implementation. The challenge is that the eigenvalue solution in my workflow is a small component of a large computation done with real numbers in an optimization problem. I could do the whole thing with complex

Re: [petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos

2018-10-23 Thread Jose E. Roman
> On Oct 23, 2018, at 15:46, Ale Foggia wrote: > > > > On Tue, Oct 23, 2018 at 15:33, Jose E. Roman () > wrote: > > > > On Oct 23, 2018, at 15:17, Ale Foggia wrote: > > > > Hello Jose, thanks for your answer. > > > > On Tue, Oct 23, 2018 at 12:59, Jose E. Roman ()

Re: [petsc-users] Slepc: Nonlinear eigenvalue problem

2018-10-23 Thread Jose E. Roman
> On Oct 23, 2018, at 16:10, Manav Bhatia wrote: > > Thanks for the clarification. > > Does this also apply to the standard non-Hermitian eigenvalue problem? Do I > need to compile with complex numbers if I want to capture the complex > eigenvalues? Or does it work with real number

Re: [petsc-users] DMPlex, output value on gauss point to vertex

2018-10-23 Thread Matthew Knepley
On Tue, Oct 23, 2018 at 3:12 AM Josh L wrote: > Hi, > > My FEM codes calculate the stress field at the Gauss points, but when I output > the stress field, I want it on the vertices. My old codes find the Gauss > points that surround a vertex and calculate the average. > Is there any better way to do it

Re: [petsc-users] Slepc: Nonlinear eigenvalue problem

2018-10-23 Thread Manav Bhatia
Thanks for the clarification. Does this also apply to the standard non-Hermitian eigenvalue problem? Do I need to compile with complex numbers if I want to capture the complex eigenvalues? Or does it work with real number support? Thanks Manav Sent from my iPhone > On Oct 23, 2018, at 3:43

Re: [petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos

2018-10-23 Thread Ale Foggia
On Tue, Oct 23, 2018 at 15:33, Jose E. Roman () wrote: > > > > On Oct 23, 2018, at 15:17, Ale Foggia wrote: > > > > Hello Jose, thanks for your answer. > > > > On Tue, Oct 23, 2018 at 12:59, Jose E. Roman () > wrote: > > There is an undocumented option: > > > >

Re: [petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos

2018-10-23 Thread Jose E. Roman
> On Oct 23, 2018, at 15:17, Ale Foggia wrote: > > Hello Jose, thanks for your answer. > > On Tue, Oct 23, 2018 at 12:59, Jose E. Roman () > wrote: > There is an undocumented option: > > -bv_reproducible_random > > It will force the initial vector of the Krylov subspace to

Re: [petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos

2018-10-23 Thread Ale Foggia
Hello Jose, thanks for your answer. On Tue, Oct 23, 2018 at 12:59, Jose E. Roman () wrote: > There is an undocumented option: > > -bv_reproducible_random > > It will force the initial vector of the Krylov subspace to be the same > irrespective of the number of MPI processes. This should

Re: [petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos

2018-10-23 Thread Matthew Knepley
On Tue, Oct 23, 2018 at 6:24 AM Ale Foggia wrote: > Hello, > > I'm currently using Lanczos solver (EPSLANCZOS) to get the smallest real > eigenvalue (EPS_SMALLEST_REAL) of a Hermitian problem (EPS_HEP). Those are > the only options I set for the solver. My aim is to be able to > predict/estimate

Re: [petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos

2018-10-23 Thread Jose E. Roman
There is an undocumented option: -bv_reproducible_random It will force the initial vector of the Krylov subspace to be the same irrespective of the number of MPI processes. This should be used for scaling analyses such as the one you are trying to do. An additional comment is that we strongly
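
For example, with a hypothetical executable ./solver, each point of the scaling study would then be launched as:

    mpiexec -n 1 ./solver -eps_type lanczos -eps_hermitian -eps_smallest_real -bv_reproducible_random
    mpiexec -n 2 ./solver -eps_type lanczos -eps_hermitian -eps_smallest_real -bv_reproducible_random

With the same initial vector on every run, the iteration count should no longer vary with the process count, so timings become comparable.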

[petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos

2018-10-23 Thread Ale Foggia
Hello, I'm currently using Lanczos solver (EPSLANCZOS) to get the smallest real eigenvalue (EPS_SMALLEST_REAL) of a Hermitian problem (EPS_HEP). Those are the only options I set for the solver. My aim is to be able to predict/estimate the time-to-solution. To do so, I was doing a scaling of the
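
A minimal sketch of this solver setup (names illustrative, error checking omitted), assuming H is an already-assembled Hermitian Mat:

    #include <slepceps.h>

    /* Sketch: smallest real eigenvalue of a Hermitian operator H */
    static PetscErrorCode SolveSmallestReal(Mat H, PetscScalar *eig)
    {
      EPS         eps;
      PetscScalar kr, ki;

      EPSCreate(PETSC_COMM_WORLD, &eps);
      EPSSetOperators(eps, H, NULL);            /* standard (not generalized) problem */
      EPSSetProblemType(eps, EPS_HEP);          /* Hermitian */
      EPSSetType(eps, EPSLANCZOS);
      EPSSetWhichEigenpairs(eps, EPS_SMALLEST_REAL);
      EPSSetFromOptions(eps);                   /* picks up -bv_reproducible_random etc. */
      EPSSolve(eps);
      EPSGetEigenvalue(eps, 0, &kr, &ki);       /* first converged eigenvalue */
      *eig = kr;
      EPSDestroy(&eps);
      return 0;
    }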

Re: [petsc-users] Slepc: Nonlinear eigenvalue problem

2018-10-23 Thread Jose E. Roman
If the eigenvalues are complex, then NLEIGS also needs to work in complex arithmetic, because it needs a region of the complex plane containing the wanted eigenvalues. It seems that complex arithmetic is the only change your problem needs. Jose > On Oct 22, 2018, at 22:01, Manav Bhatia wrote: >
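
For reference, the scalar type is a build-time choice of the PETSc installation (which SLEPc inherits), so capturing complex eigenvalues with NLEIGS would mean configuring along the lines of:

    ./configure --with-scalar-type=complex [other options]

A real-scalar build has no representation for a complex target region or complex eigenvectors, which is why this cannot be toggled at run time.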

Re: [petsc-users] DMPlex, output value on gauss point to vertex

2018-10-23 Thread Yann Jobic
Hi, You may want to look at: https://www.mcs.anl.gov/petsc/petsc-current/include/petsc/private/petscfeimpl.h.html#PetscFEInterpolate_Static It only gives the structure of what you are looking for. In order to use it, you have to use the FE implementation of PETSc; see the examples.

Re: [petsc-users] Shell Matrix Operations required for KSP solvers?

2018-10-23 Thread Dave May
On Tue, 23 Oct 2018 at 02:24, Matthew Knepley wrote: > On Mon, Oct 22, 2018 at 7:44 PM Andrew Ho wrote: > >> I have a specialized matrix structure I'm trying to take advantage of for >> solving large scale (non)linear systems. I think for this purpose using a >> Shell matrix is sufficient for
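
For context, a minimal shell-matrix sketch (MyMult and its identity placeholder are illustrative). With -pc_type none a Krylov method only needs MATOP_MULT; simple preconditioners such as Jacobi additionally need MATOP_GET_DIAGONAL:

    #include <petscksp.h>

    /* User-defined mat-vec y = A*x; a real application would exploit its
       special matrix structure here instead of this identity placeholder. */
    static PetscErrorCode MyMult(Mat A, Vec x, Vec y)
    {
      void *ctx;
      MatShellGetContext(A, &ctx);  /* user data attached at creation */
      VecCopy(x, y);                /* placeholder: identity operator */
      return 0;
    }

    /* Sketch: build a shell Mat and solve with an unpreconditioned KSP */
    static PetscErrorCode SolveWithShell(PetscInt n, Vec b, Vec x)
    {
      Mat A;
      KSP ksp;
      PC  pc;

      MatCreateShell(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, NULL, &A);
      MatShellSetOperation(A, MATOP_MULT, (void (*)(void))MyMult);

      KSPCreate(PETSC_COMM_WORLD, &ksp);
      KSPSetOperators(ksp, A, A);
      KSPGetPC(ksp, &pc);
      PCSetType(pc, PCNONE);  /* no preconditioner: only MATOP_MULT required */
      KSPSetFromOptions(ksp);
      KSPSolve(ksp, b, x);

      KSPDestroy(&ksp);
      MatDestroy(&A);
      return 0;
    }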

[petsc-users] DMPlex, output value on gauss point to vertex

2018-10-23 Thread Josh L
Hi, My FEM codes calculate the stress field at the Gauss points, but when I output the stress field, I want it on the vertices. My old codes find the Gauss points that surround a vertex and calculate the average. Is there any better way to do it with DMPlex? The PetscSF given by DMPlexDistribute has
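
For reference, a minimal serial sketch of the averaging the old codes do, independent of DMPlex (all names illustrative): it assumes a flat cell-to-vertex connectivity array and one Gauss-point-averaged stress value per cell:

    #include <stdlib.h>

    /* Average per-cell stress onto vertices.
       cell2vert[c*nvpc + j] is the j-th vertex of cell c. */
    void CellToVertexAverage(int ncells, int nverts, int nvpc,
                             const int *cell2vert, const double *cellStress,
                             double *vertStress)
    {
      int *count = calloc(nverts, sizeof(int));
      for (int v = 0; v < nverts; v++) vertStress[v] = 0.0;
      for (int c = 0; c < ncells; c++) {
        for (int j = 0; j < nvpc; j++) {
          int v = cell2vert[c*nvpc + j];
          vertStress[v] += cellStress[c];  /* accumulate cell contribution */
          count[v]++;
        }
      }
      for (int v = 0; v < nverts; v++)
        if (count[v]) vertStress[v] /= count[v];  /* arithmetic mean */
      free(count);
    }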