Re: [petsc-users] reuse a real matrix for a second linear system with complex numbers

2021-05-14 Thread Matthew Knepley via petsc-users
On Fri, May 14, 2021 at 4:23 AM feng wang wrote: > Dear All, > > I am solving a coupled system. One system is AX=B. A, X and B are all real > numbers and it is solved with GMRES in petsc. Now I need to solve a second > linear system, it can be represented as (A+i*w)*Z=C. i is the imaginary >
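
The archived reply is cut off above. For context, a minimal sketch of one way to form and solve the shifted system (A + i*w*I) Z = C, assuming a complex-scalar PETSc build (--with-scalar-type=complex); the function name SolveShifted and the choice to duplicate A are illustrative assumptions, not necessarily the approach recommended in the truncated answer.

  #include <petscksp.h>

  /* Sketch: build Az = A + i*w*I and solve Az*Z = C with a second KSP.
     Assumes a complex-scalar PETSc build; A and C are assembled elsewhere. */
  PetscErrorCode SolveShifted(Mat A, Vec C, PetscReal w, Vec Z)
  {
    Mat            Az;
    KSP            ksp;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = MatDuplicate(A, MAT_COPY_VALUES, &Az);CHKERRQ(ierr); /* keep A intact for the real solve */
    ierr = MatShift(Az, w * PETSC_i);CHKERRQ(ierr);             /* Az <- Az + i*w*I */
    ierr = KSPCreate(PetscObjectComm((PetscObject)A), &ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, Az, Az);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, C, Z);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    ierr = MatDestroy(&Az);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }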

Re: [petsc-users] solve problem with pastix

2019-11-05 Thread Matthew Knepley via petsc-users
I have no idea. That is a good question for the PasTix list. Thanks, Matt On Tue, Nov 5, 2019 at 5:32 PM hg wrote: > Should thread affinity be invoked? I set -mat_pastix_threadnbr 1 and also > OMP_NUM_THREADS to 1 > > Giang > > > On Tue, Nov 5, 2019 at 10:50 PM Matthew Knepley wrote:

Re: [petsc-users] PETSc 3.12 with .f90 files

2019-10-29 Thread Matthew Knepley via petsc-users
On Tue, Oct 29, 2019 at 10:54 AM Randall Mackie via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear PETSc users: > > In our code, we have one or two small .f90 files that are part of the > software, and they have always compiled without any issues with previous > versions of PETSc, using

Re: [petsc-users] calculating eigenvalues for the Stokes equations in 3D

2019-10-29 Thread Matthew Knepley via petsc-users
On Tue, Oct 29, 2019 at 5:06 AM Dave May via petsc-users < petsc-users@mcs.anl.gov> wrote: > > > On Tue, 29 Oct 2019 at 08:59, wrote: > >> Hi, Dave! Thank you for your assistance. The problem is that I don't have >> the matrix representation for my problem. >> > > You will have to explain in

Re: [petsc-users] R: AMG Variable block preconditioner

2019-10-28 Thread Matthew Knepley via petsc-users
On Mon, Oct 28, 2019 at 12:37 PM Marco Cisternino via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi Stefano, > thanks for your interest. > Mainly, elliptic operators both in the context of methods in NS solvers > and in the context of mesh deformation. > I didn't think about AMG on each

Re: [petsc-users] 'Inserting a new nonzero' issue on a reassembled matrix in parallel

2019-10-24 Thread Matthew Knepley via petsc-users
On Thu, Oct 24, 2019 at 6:09 AM Thibaut Appel wrote: > Hi Matthew, > > Thanks for having a look, your example runs just like mine in Fortran. > > In serial, the value (0.0,0.0) was inserted whereas it shouldn't have. > I do not see that in serial. > In parallel, you'll see that an error

Re: [petsc-users] 'Inserting a new nonzero' issue on a reassembled matrix in parallel

2019-10-23 Thread Matthew Knepley via petsc-users
On Tue, Oct 22, 2019 at 1:37 PM Thibaut Appel wrote: > Hi both, > > Please find attached a tiny example (in Fortran, sorry Matthew) that - I > think - reproduces the problem we mentioned. > > Let me know. > > Okay, I converted to C so I could understand, and it runs fine for me: master

Re: [petsc-users] Using petsc with an existing domain decomposition.

2019-10-13 Thread Matthew Knepley via petsc-users
Without having seen your code, it sounds to me like the best strategy here is to: 1) produce a mirror of your mesh using DMStag; 2) use that DM to construct the linear block problems, which can then be solved by PETSc. Since the PETSc grid matches your own, you can share the solution

Re: [petsc-users] DMPlex memory problem in scaling test

2019-10-10 Thread Matthew Knepley via petsc-users
On Thu, Oct 10, 2019 at 9:00 PM Danyang Su wrote: > Labels should be destroyed with the DM. Just make a small code that does > nothing but distribute the mesh and end. If you > run with -malloc_test you should see if everything is destroyed properly. > > Thanks, > > Matt > > Attached is
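
A minimal sketch of the leak test Matt describes: create a DMPlex, distribute it, destroy everything, and run under -malloc_test. The file name "mesh.msh" is a placeholder, and note that recent PETSc releases add an extra mesh-name argument to DMPlexCreateFromFile.

  #include <petscdmplex.h>

  /* Minimal leak test: create a DMPlex, distribute it, destroy it, exit.
     Run with:  mpiexec -n 2 ./ex -malloc_test
     "mesh.msh" is a placeholder mesh file. */
  int main(int argc, char **argv)
  {
    DM             dm, dmDist = NULL;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.msh", PETSC_TRUE, &dm);CHKERRQ(ierr);
    ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
    if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }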

Re: [petsc-users] DMPlex memory problem in scaling test

2019-10-10 Thread Matthew Knepley via petsc-users
a local graph on each processor to give to, say, Parmetis, then completes the distribution with this reasonable partitioning? (this is what our current code does) Thanks, Mark On Thu, Oct 10, 2019

Re: [petsc-users] DMPlex memory problem in scaling test

2019-10-10 Thread Matthew Knepley via petsc-users
On Thu, Oct 10, 2019 at 8:31 AM Dave May wrote: > On Thu, 10 Oct 2019 at 13:21, Matthew Knepley via petsc-users < > petsc-users@mcs.anl.gov> wrote: > >> On Wed, Oct 9, 2019 at 5:10 PM Danyang Su via petsc-users < >> petsc-users@mcs.anl.gov> wrote: >>

Re: [petsc-users] Stokes-Brinkmann equation preconditioner

2019-10-04 Thread Matthew Knepley via petsc-users
On Fri, Oct 4, 2019 at 6:04 AM Lawrence Mitchell wrote: > > > > On 4 Oct 2019, at 10:46, Matthew Knepley via petsc-users < > petsc-users@mcs.anl.gov> wrote: > > > > On Thu, Oct 3, 2019 at 6:34 PM Salazar De Troya, Miguel via petsc-users < > petsc-us

Re: [petsc-users] Makefile change for PETSc3.12.0???

2019-10-02 Thread Matthew Knepley via petsc-users
On Wed, Oct 2, 2019 at 1:54 PM Danyang Su via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear All, > > I installed PETSc 3.12.0 and got a problem compiling my code > (Fortran and C++). The code and makefile are the same as I used for the > previous PETSc version. > > The error

Re: [petsc-users] TS scheme with different DAs

2019-09-18 Thread Matthew Knepley via petsc-users
On Tue, Sep 17, 2019 at 8:27 PM Smith, Barry F. wrote: > > Don't be too quick to dismiss switching to the DMStag you may find that > it actually takes little time to convert and then you have a much less > cumbersome process to manage the staggered grid. Take a look at >

Re: [petsc-users] DMPlex cell number containing a point in space

2019-09-16 Thread Matthew Knepley via petsc-users
On Fri, Sep 6, 2019 at 6:07 PM Swarnava Ghosh via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear Petsc developers and users, > > I have a DMPlex mesh in 3D. Given a point with (x,y,z) coordinates, I am > trying to find the cell number in which this point lies, and the vertices > of the
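
The archived reply is truncated; the routine usually used for point location in a DMPlex is DMLocatePoints(). A hedged sketch (the helper name FindCell and the single-point setup are illustrative, and it only locates the cell, not its vertices):

  #include <petscdmplex.h>

  /* Sketch: find which local cell contains the point (x,y,z) in a 3D DMPlex.
     found[0].index is the containing cell, or -1 if not located on this rank. */
  PetscErrorCode FindCell(DM dm, PetscReal x, PetscReal y, PetscReal z, PetscInt *cell)
  {
    Vec                coords;
    PetscSF            cellSF = NULL;
    const PetscSFNode *found;
    PetscInt           nFound;
    PetscScalar       *c;
    PetscErrorCode     ierr;

    PetscFunctionBeginUser;
    ierr = VecCreateSeq(PETSC_COMM_SELF, 3, &coords);CHKERRQ(ierr);
    ierr = VecSetBlockSize(coords, 3);CHKERRQ(ierr);
    ierr = VecGetArray(coords, &c);CHKERRQ(ierr);
    c[0] = x; c[1] = y; c[2] = z;
    ierr = VecRestoreArray(coords, &c);CHKERRQ(ierr);
    ierr = DMLocatePoints(dm, coords, DM_POINTLOCATION_NONE, &cellSF);CHKERRQ(ierr);
    ierr = PetscSFGetGraph(cellSF, NULL, &nFound, NULL, &found);CHKERRQ(ierr);
    *cell = (nFound > 0) ? found[0].index : -1;
    ierr = PetscSFDestroy(&cellSF);CHKERRQ(ierr);
    ierr = VecDestroy(&coords);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }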

Re: [petsc-users] DMPlex for cell centred finite volume

2019-08-27 Thread Matthew Knepley via petsc-users
On Tue, Aug 27, 2019 at 4:07 AM Edoardo alinovi via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hello PETSc users and developers, > I hope you are doing well! Today I have a general question about DMplex to > see if it can be useful to me or not. > > I have my fancy finite volume solver

Re: [petsc-users] Getting the connectivity from DMPlex

2019-08-21 Thread Matthew Knepley via petsc-users
On Wed, Aug 21, 2019 at 8:35 AM Jian Zhang - 3ME wrote: > Hi Matthew, > > That is ok. May I ask you one more question? > > Sorry. I am just starting to learn how to use DMPlex. I have an msh file > including triangular elements (t3). After I use DMPlex to read this > mesh, how can I know the

Re: [petsc-users] Creating a 3D dmplex mesh with cell list and distributing it

2019-08-14 Thread Matthew Knepley via petsc-users
On Wed, Aug 14, 2019 at 10:23 PM Swarnava Ghosh wrote: > Hi Matthew, > > I added DMView(pCgdft->dmplex,PETSC_VIEWER_STDOUT_WORLD); before and after > distribution, and I get the following: > It looks like you are running things with the wrong 'mpirun' Thanks, Matt > dmplex before

Re: [petsc-users] confused by the converged reason output

2019-07-19 Thread Matthew Knepley via petsc-users
On Fri, Jul 19, 2019 at 11:21 AM Michael Wick wrote: > Yes, it returns: > > Linear m_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 3 > 0 KSP preconditioned resid norm 5.402205955230e-11 true resid norm 9.999870838355e-01 ||r(i)||/||b|| 1.e+00 1 KSP preconditioned

Re: [petsc-users] Various Questions Regarding PETSC

2019-07-13 Thread Matthew Knepley via petsc-users
On Sat, Jul 13, 2019 at 11:20 AM Mohammed Mostafa wrote: > I am sorry, but I don’t see what you mean by small times. > Although mat assembly is relatively smaller, > the cost of mat set values is still significant. > The same can be said for vec assembly. > Combined vec/mat assembly and matsetvalues

Re: [petsc-users] DMPlexDistributeField

2019-07-10 Thread Matthew Knepley via petsc-users
Crap! The wrong thing got pushed. Thanks, Matt On Wed, Jul 10, 2019 at 9:49 PM Adrian Croucher wrote: > Probably don't want the two extra PetscSFView() calls (lines 1687, 1688) > though- presumably they were just for temporary debugging? > > - Adrian > On 11/07/19 2:18 PM, Adrian

Re: [petsc-users] createASCII cleans up the content of a gmsh file.

2019-06-21 Thread Matthew Knepley via petsc-users
On Fri, Jun 21, 2019 at 4:56 AM Dongyu Liu - CITG via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi, > > We are using the Viewer class in petsc4py to read a gmsh file, but after we > use the function createASCII with the mode "READ", the gmsh file is > emptied. Do you have any clue why this

Re: [petsc-users] Memory growth issue

2019-06-03 Thread Matthew Knepley via petsc-users
On Mon, Jun 3, 2019 at 6:56 PM Zhang, Junchao via petsc-users < petsc-users@mcs.anl.gov> wrote: > On Mon, Jun 3, 2019 at 5:23 PM Stefano Zampini > wrote: > >> >> >> On Jun 4, 2019, at 1:17 AM, Zhang, Junchao via petsc-users < >> petsc-users@mcs.anl.gov> wrote: >> >> Sanjay & Barry, >> Sorry, I

Re: [petsc-users] parallel dual porosity

2019-05-29 Thread Matthew Knepley via petsc-users
On Wed, May 29, 2019 at 10:54 PM Adrian Croucher wrote: > On 30/05/19 2:45 PM, Matthew Knepley wrote: > > > Hmm, I had not thought about that. It will not do that at all. We have > never rebalanced a simulation > using overlap cells. I would have to write the code that strips them out. > Not

Re: [petsc-users] parallel dual porosity

2019-05-29 Thread Matthew Knepley via petsc-users
On Wed, May 29, 2019 at 10:38 PM Adrian Croucher wrote: > hi > On 28/05/19 11:32 AM, Matthew Knepley wrote: > > > I would not do that. It should be much easier, and better from a workflow > standpoint, > to just redistribute in parallel. We now have several test examples that > redistribute > in

Re: [petsc-users] Problem coupling Petsc into OpenFOAM

2019-05-24 Thread Matthew Knepley via petsc-users
On Thu, May 23, 2019 at 10:41 PM Vu Q. Do via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi all, > > Thanks for your previous suggestion, I have been able to successfully link > Petsc to OpenFOAM. I have written a simple interface and it works quite > well in serial mode, but cannot run in

Re: [petsc-users] Singular values of the GMRES Hessenberg matrix

2019-05-24 Thread Matthew Knepley via petsc-users
On Fri, May 24, 2019 at 8:38 AM Dave Lee wrote: > Thanks Matt, great suggestion. > > I did indeed find a transpose error this way. The SVD as reconstructed via > U S V^T now matches the input Hessenberg matrix as derived via the > *HES(row,col) macro, and all the singular values are non-zero.

Re: [petsc-users] problem with generating simplices mesh

2019-05-20 Thread Matthew Knepley via petsc-users
On Sun, May 19, 2019 at 9:22 AM 陳鳴諭 via petsc-users wrote: > I have a problem generating a simplicial mesh. > I do as the description in DMPlexCreateBoxMesh says, but still get an error. > Stefano is right that you will need a mesh generator for a simplex mesh. However, you are asking for a 1D

Re: [petsc-users] Argument out of range error in MatPermute

2019-04-24 Thread Matthew Knepley via petsc-users
On Wed, Apr 24, 2019 at 6:35 AM Stefano Zampini via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dump the index sets and the matrix in binary and send them > First, reduce your problem size to about 10. Matt > Il giorno mer 24 apr 2019 alle ore 13:21 Eda Oktay > ha scritto: > >> Since I
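
A hedged sketch of the binary dump being requested above; the output file name, the helper name DumpProblem, and the assumption of two index sets (a row and a column permutation) are mine.

  #include <petscmat.h>

  /* Write two index sets and the matrix in PETSc binary format so they can be
     attached to a mail and reloaded later with ISLoad()/MatLoad(). */
  PetscErrorCode DumpProblem(Mat A, IS rowperm, IS colperm)
  {
    PetscViewer    viewer;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = PetscViewerBinaryOpen(PetscObjectComm((PetscObject)A), "permute.bin", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
    ierr = ISView(rowperm, viewer);CHKERRQ(ierr);
    ierr = ISView(colperm, viewer);CHKERRQ(ierr);
    ierr = MatView(A, viewer);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }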

Re: [petsc-users] VecView to hdf5 broken for large (complex) vectors

2019-04-16 Thread Matthew Knepley via petsc-users
On Tue, Apr 16, 2019 at 3:44 PM Sajid Ali wrote: > So, I tried running the debug version with valgrind to see if I can find > the chunk size that's being set but I don't see it. Is there a better way > to do it ? > > `$ mpirun -np 32 valgrind ./ex_ms -prop_steps 1 -info &> out`. [The out > file

Re: [petsc-users] Error with VecDestroy_MPIFFTW+0x61

2019-04-14 Thread Matthew Knepley via petsc-users
On Sun, Apr 14, 2019 at 9:12 PM Sajid Ali wrote: > Just to confirm, there's no error when running with one rank. The error > occurs only with mpirun -np x (x>1). > This is completely broken. I attached a version that will work in parallel, but it's ugly. PETSc People: The MatCreateVecsFFT()

Re: [petsc-users] Error with KSPSetUp and MatNest

2019-04-10 Thread Matthew Knepley via petsc-users
On Wed, Apr 10, 2019 at 12:49 PM Manuel Colera Rico wrote: > Thank you for your answer, Matt. In the MWE example attached before, both > Nest vectors (the r.h.s. of the system and the vector of unknowns) are > composed of the same number of blocks (2). Indeed, PETSc is able to solve > the system

Re: [petsc-users] Strange compiling error in DMPlexDistribute after updating PETSc to V3.11.0

2019-04-05 Thread Matthew Knepley via petsc-users
On Fri, Apr 5, 2019 at 4:44 PM Danyang Su via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi All, > > I got a strange error calling DMPlexDistribute after updating PETSc > to V3.11.0. There seems to be no change in the interface of DMPlexDistribute > as documented in > > >

Re: [petsc-users] ASCIIRead error for multiple processors

2019-04-05 Thread Matthew Knepley via petsc-users
On Fri, Apr 5, 2019 at 10:27 AM Yuyun Yang wrote: > Hmm ok. Then should I use this function or not when I'm reading the input? > It's probably still going to give me the same error and I'll be unable to proceed? > > I'd like to know if I should use something else to work around this > problem. > No,

Re: [petsc-users] error: Petsc has generated inconsistent data, MPI_Allreduce() called in different locations (code lines) on different processors

2019-04-05 Thread Matthew Knepley via petsc-users
On Fri, Apr 5, 2019 at 3:20 AM Eda Oktay via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hello, > > I am trying to calculate the unweighted Laplacian of a matrix using 2 > cores. If the size of the matrix is an even number, then my program works. > However, when I try to use a matrix having an odd

Re: [petsc-users] testing for and removing a null space using JFNK

2019-04-04 Thread Matthew Knepley via petsc-users
On Thu, Apr 4, 2019 at 7:36 AM Dave Lee via petsc-users < petsc-users@mcs.anl.gov> wrote: > Thanks Mark, > > I already have the Navier Stokes solver. My issue is wrapping it in a JFNK > solver to find the periodic solutions. I will keep reading up on SVD > approaches, there may be some capability

Re: [petsc-users] Consistent domain decomposition between DMDA and DMPLEX

2019-03-28 Thread Matthew Knepley via petsc-users
4:08 PM Mark Adams wrote: > Matt, > I think they want a vertex partitioning. They may have elements on > the unstructured mesh that intersect with any number

Re: [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-26 Thread Matthew Knepley via petsc-users
On Tue, Mar 26, 2019 at 9:27 AM Myriam Peyrounette < myriam.peyroune...@idris.fr> wrote: > I checked with -ksp_view (attached) but no prefix is associated with the > matrix. Some are associated to the KSP and PC, but none to the Mat > Another thing that could prevent options being used is that

Re: [petsc-users] BJACOBI with FIELDSPLIT

2019-03-18 Thread Matthew Knepley via petsc-users
On Mon, Mar 18, 2019 at 3:56 PM Rossi, Simone via petsc-users < petsc-users@mcs.anl.gov> wrote: > To follow up on that: when would you want to use gmres instead of fgmres > in the outer ksp? > The difference here is just that FGMRES is right-preconditioned by default, so you do not get the extra
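
For reference, a hedged sketch of making plain GMRES use the same (right) preconditioning side that FGMRES uses by default; this only makes sense when the preconditioner does not vary between iterations, and the helper name is an assumption.

  #include <petscksp.h>

  /* Right-preconditioned GMRES, the same side FGMRES uses by default.
     Command-line equivalent: -ksp_type gmres -ksp_pc_side right */
  PetscErrorCode UseRightPreconditionedGMRES(KSP ksp)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPSetType(ksp, KSPGMRES);CHKERRQ(ierr);
    ierr = KSPSetPCSide(ksp, PC_RIGHT);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }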

Re: [petsc-users] BJACOBI with FIELDSPLIT

2019-03-18 Thread Matthew Knepley via petsc-users
On Mon, Mar 18, 2019 at 3:18 PM Rossi, Simone via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear all, > > I'm debugging my application in which I'm trying to use the FIELDSPLIT > preconditioner for solving a 2x2 block matrix. > > > Currently I'm testing the preconditioner on a decoupled

Re: [petsc-users] Cross-compilation cluster

2019-03-14 Thread Matthew Knepley via petsc-users
It is very dangerous to use different compilers. I would make sure that all the compilers are the MPI compilers. Thanks, Matt On Thu, Mar 14, 2019 at 8:46 AM Amneet Bhalla wrote: > Ah, Ok. Do serial compilers look OK to you? > > Can lib-32 and lib-64 (say -lm) operate simultaneously

Re: [petsc-users] Problems about SNES

2019-03-12 Thread Matthew Knepley via petsc-users
On Wed, Jan 16, 2019 at 10:59 PM Yingjie Wu via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear PETSc developers: > Hi, > During the process of testing the program, I found some questions about > SNES. These are some basic questions that I have overlooked. Please help me > to answer them. >

Re: [petsc-users] Problem in MatSetValues

2019-03-11 Thread Matthew Knepley via petsc-users
On Mon, Mar 11, 2019 at 8:27 AM Eda Oktay via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hello, > > I have the following part of a code which tries to change the nonzero values > of matrix L to -1. However, in the MatSetValues line, something happens and > some of the values in the matrix turn into

Re: [petsc-users] About DMDA (and extracting its ordering)

2019-02-25 Thread Matthew Knepley via petsc-users
On Mon, Feb 25, 2019 at 3:19 AM Appel, Thibaut wrote: > Hi Matthew, > > Yes I need F90 and the syntax in the file / yours > > PetscInt, pointer :: id_ltog(:) > > > Is the exact same as > > PetscInt, dimension(:), pointer :: id_ltog > > > They’re both modern fortran ”correct” > > Anyways I tried

Re: [petsc-users] Question with filedsplit in PETSc

2019-02-23 Thread Matthew Knepley via petsc-users
On Thu, Feb 21, 2019 at 3:45 PM Zhu, Qiming via petsc-users < petsc-users@mcs.anl.gov> wrote: > > Dear all, > > > Sorry to disturb you. I am a user of PETSc. I am trying to use FieldSplit > in PETSc to do preconditioning for a Navier-Stokes problem. I have some > problems when I try to use

Re: [petsc-users] Using PETSc in Cray systems

2019-02-21 Thread Matthew Knepley via petsc-users
On Thu, Feb 21, 2019 at 10:46 AM Najib Alia via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear all, > > we are trying to compile our Finite Element code on a Cray system and > have a problem with PETSc and available packages: "unable to find > scotch64", the variable PETSC_SINGLE_LIBRARY is

Re: [petsc-users] saving results

2019-02-20 Thread Matthew Knepley via petsc-users
On Wed, Feb 20, 2019 at 4:43 AM Sal Am wrote: > Hi Matthew, you were right, > > The matrix I have is very ill-conditioned and my supervisor gave it to me for > testing purposes. Having said that, I was able to solve it previously; > however, for some reason it said convergence was reached at e-3 even though

Re: [petsc-users] Identifying matching points in differently distributed DMPlexs

2019-02-19 Thread Matthew Knepley via petsc-users
On Tue, Feb 19, 2019 at 11:00 AM Lawrence Mitchell via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear petsc-users, > > I have two (different) distributions of the (topologically) same DMPlex > object (DM_a and DM_b). > > I would like to identify the map from points(DM_a) to points(DM_b)

Re: [petsc-users] MatCompositeMerge + MatCreateRedundantMatrix

2019-02-19 Thread Matthew Knepley via petsc-users
You basically need the inverse of MatCreateSubmatrices(). I do not think we have that right now, but it could probably be done without too much trouble by looking at that code. Thanks, Matt On Tue, Feb 19, 2019 at 6:15 AM Marius Buerkle via petsc-users < petsc-users@mcs.anl.gov> wrote:

Re: [petsc-users] Missing Fortran interface for PetscFree?

2019-02-15 Thread Matthew Knepley via petsc-users
On Fri, Feb 15, 2019 at 6:26 AM Marco Tiberga via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear PETSc developers, > > > > Unfortunately, I need to continue the previous conversation (which I > attached), because I am getting another compiler error. > > I need to call PetscFree() after

Re: [petsc-users] MWE for DMPlexCreateCGNS

2019-02-05 Thread Matthew Knepley via petsc-users
On Tue, Feb 5, 2019 at 11:13 AM Andrew Parker wrote: > On Tue, 5 Feb 2019 at 15:27, Matthew Knepley wrote: > >> On Tue, Feb 5, 2019 at 9:47 AM Andrew Parker via petsc-users < >> petsc-users@mcs.anl.gov> wrote: >> >>> Does anyone have a MWE for DMPlexCreateCGNS to use in parallel? Ideally, >>>

Re: [petsc-users] Ksp Initial residual norm

2019-02-03 Thread Matthew Knepley via petsc-users
On Sun, Feb 3, 2019 at 2:24 PM Smith, Barry F. via petsc-users < petsc-users@mcs.anl.gov> wrote: > > > > On Feb 3, 2019, at 1:16 PM, Edoardo alinovi > wrote: > > > > Thank you very much Barry for the suggestion. > > > > Unfortunately, I am using Fortran and not C++ . Do you have an > equivalent

Re: [petsc-users] Preconditioning systems of equations with complex numbers

2019-01-31 Thread Matthew Knepley via petsc-users
On Thu, Jan 31, 2019 at 6:22 PM Justin Chang wrote: > Here's IMHO the simplest explanation of the equations I'm trying to solve: > > http://home.eng.iastate.edu/~jdm/ee458_2011/PowerFlowEquations.pdf > > Right now we're just trying to solve eq(5) (in section 1), inverting the > linear Y-bus

Re: [petsc-users] Problem in MPI execution

2019-01-30 Thread Matthew Knepley via petsc-users
On Wed, Jan 30, 2019 at 8:57 AM Fazlul Huq wrote: > Thanks Matt. > > Is there any way to get around this problem? > I need to run the program with parallel Cholesky and ILU. > From the link I sent you, you can try MUMPS and PaStiX for Cholesky. Matt > Thanks. > Sincerely, > Huq > > On Wed, Jan
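
A hedged sketch of selecting MUMPS for a parallel Cholesky factorization, as suggested above; it assumes PETSc was configured with MUMPS support (e.g. --download-mumps), and the helper name is an assumption.

  #include <petscksp.h>

  /* Parallel Cholesky through MUMPS.
     Command-line equivalent: -pc_type cholesky -pc_factor_mat_solver_type mumps */
  PetscErrorCode UseMumpsCholesky(KSP ksp)
  {
    PC             pc;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCCHOLESKY);CHKERRQ(ierr);
    ierr = PCFactorSetMatSolverType(pc, MATSOLVERMUMPS);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }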

Re: [petsc-users] Problem in MPI execution

2019-01-30 Thread Matthew Knepley via petsc-users
On Wed, Jan 30, 2019 at 8:25 AM Fazlul Huq via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hello PETSc Developers, > > I am trying to run my code with the following commands: > $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 6 ./poisson_m -n $x -pc_type hypre > -pc_hypre_type boomeramg >

Re: [petsc-users] Integrate PETSC with existing Fortran

2019-01-23 Thread Matthew Knepley via petsc-users
On Wed, Jan 23, 2019 at 1:27 PM Yaxiong Chen wrote: > Hi Matt, > > > I tried to modify the structure of the present code and use KSP in the main > program (optimal_mechanical_part). Now the makefile is as follows: > > >

Re: [petsc-users] MPI Iterative solver crash on HPC

2019-01-16 Thread Matthew Knepley via petsc-users
On Wed, Jan 16, 2019 at 3:52 AM Sal Am via petsc-users < petsc-users@mcs.anl.gov> wrote: > The memory requested is an insane number. You may need to use 64 bit >> integers. > > Thanks Mark, I reconfigured it to use 64bit, however in the process it > says I can no longer use MUMPS and SuperLU as

Re: [petsc-users] Is there easy to update the ghost value in the vector created using DMCreatLocalVector?

2019-01-16 Thread Matthew Knepley via petsc-users
On Tue, Jan 15, 2019 at 8:34 PM leejearl via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi all PETSc users, > > I have a question about how to update the ghost values in the vector. > The problem is as follows. > > 1. A dmplex dm object is created using DMPlexCreateFromFile > 2. The dm is
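
The reply is truncated in the archive; the usual way to refresh ghost values in a DM local vector is a global-to-local scatter. A hedged sketch (the helper name is an assumption):

  #include <petscdm.h>

  /* Copy owned values from the global vector into the local vector,
     filling the ghost (overlap) entries in the process. */
  PetscErrorCode UpdateGhosts(DM dm, Vec global, Vec local)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMGlobalToLocalBegin(dm, global, INSERT_VALUES, local);CHKERRQ(ierr);
    ierr = DMGlobalToLocalEnd(dm, global, INSERT_VALUES, local);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }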

Re: [petsc-users] PETSC for singular system

2019-01-14 Thread Matthew Knepley via petsc-users
On Mon, Jan 14, 2019 at 7:55 AM Yaxiong Chen wrote: > So must I figure out the index of the zero columns and rows to get the > null space first, and then remove it to generate a Cholesky or LU > preconditioner? In this case, should the nontrivial null space be (1,0,0,0)? > No 1) If you have a
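
Matt's reply begins with "No" and is cut off, so the following is for reference only, not the thread's recommendation: a hedged sketch of attaching a user-supplied null-space vector to a singular operator so Krylov solvers project it out. The helper name and the single-vector assumption are mine.

  #include <petscmat.h>

  /* Attach a one-vector null space to A. MatNullSpaceCreate() expects
     orthonormal vectors, hence the normalization. */
  PetscErrorCode AttachNullSpace(Mat A, Vec nullvec)
  {
    MatNullSpace   sp;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = VecNormalize(nullvec, NULL);CHKERRQ(ierr);
    ierr = MatNullSpaceCreate(PetscObjectComm((PetscObject)A), PETSC_FALSE, 1, &nullvec, &sp);CHKERRQ(ierr);
    ierr = MatSetNullSpace(A, sp);CHKERRQ(ierr);
    ierr = MatNullSpaceDestroy(&sp);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }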

Re: [petsc-users] C++ compilation error

2019-01-13 Thread Matthew Knepley via petsc-users
On Sun, Jan 13, 2019 at 2:58 PM Choudhary, Devyani D via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi, > > > I am trying to make a simple hello world script using a makefile that > includes petsc, and am getting the error > > "g++: error: unrecognized command line option ‘-wd1572’" > > I am

Re: [petsc-users] Any reason for API change: DMGetWorkArray()

2019-01-10 Thread Matthew Knepley via petsc-users
On Thu, Jan 10, 2019 at 5:31 PM Fande Kong wrote: > Thanks, Matt, > > And then what is the reason to remove PetscDataType? I am out of curiosity. > Occam's Razor: "one should not increase, beyond what is necessary, the number of entities required to explain anything" Matt > DMGetWorkArray
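
For context on the API change being discussed: DMGetWorkArray() now takes an MPI datatype where it used to take a PetscDataType. A minimal usage sketch (the helper name is an assumption):

  #include <petscdm.h>

  /* Borrow a scratch array of n PetscScalars from the DM and return it.
     MPIU_SCALAR replaces the old PETSC_SCALAR PetscDataType argument. */
  PetscErrorCode UseWorkArray(DM dm, PetscInt n)
  {
    PetscScalar   *work;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMGetWorkArray(dm, n, MPIU_SCALAR, &work);CHKERRQ(ierr);
    /* ... use work[0..n-1] as temporary storage ... */
    ierr = DMRestoreWorkArray(dm, n, MPIU_SCALAR, &work);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }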

Re: [petsc-users] Data types for local ID's and global ID's for large problems

2019-01-09 Thread Matthew Knepley via petsc-users
On Wed, Jan 9, 2019 at 5:53 PM Weston, Brian Thomas via petsc-users < petsc-users@mcs.anl.gov> wrote: > Does PETSc have different data types for local IDs and global IDs? For > example, if PETSc is configured for 32-bit indices on very large problems > with, say, 10 billion degrees of freedom

Re: [petsc-users] DMPlexSetRefinementFunction

2019-01-08 Thread Matthew Knepley via petsc-users
On Mon, Jan 7, 2019 at 4:27 PM David Fuentes wrote: > ha! thanks for you time on this Matt. I'm trying to generate a mesh from > image segmentation data. > I would like to use an image segmentation to guide the refinement. Figure > 25 of this paper -

Re: [petsc-users] DMPlex H27 elements

2019-01-05 Thread Matthew Knepley via petsc-users
On Sat, Jan 5, 2019 at 4:04 AM Yann Jobic wrote: > > On 05/01/2019 02:36, Matthew Knepley wrote: > > On Fri, Jan 4, 2019 at 10:04 AM Yann Jobic via petsc-users < > petsc-users@mcs.anl.gov> wrote: > >> Dear Petsc Users, >> >> I'm using DMPlexCreateFromCellList to create my DM. I would like to

Re: [petsc-users] Problems about Picard and NolinearGS

2019-01-03 Thread Matthew Knepley via petsc-users
On Thu, Jan 3, 2019 at 7:36 AM Yingjie Wu via petsc-users < petsc-users@mcs.anl.gov> wrote: > Thanks for your reply. > I read the article you provided. This is my first contact with the > quasi-Newton method. > I have some problems: > 1. From the point of view of the algorithm, the quasi-Newton method

Re: [petsc-users] Create a DM given sets of IS's

2018-12-31 Thread Matthew Knepley via petsc-users
On Mon, Dec 31, 2018 at 2:40 AM Justin Chang via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi all, > > I am solving a six-field battery problem (concentration and potential for > each of the two solid and one electrolyte domains) and I want to experiment > with nested/recursive
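
The reply is cut off above; as background, PCFIELDSPLIT can also be fed splits directly from index sets. A hedged sketch with two of the six fields, where the IS objects, split names, and helper name are placeholders, not taken from the thread:

  #include <petscksp.h>

  /* Define two named splits of a PCFIELDSPLIT from user-built index sets.
     isConc/isPot hold the global equation numbers of each field group. */
  PetscErrorCode SetupSplits(KSP ksp, IS isConc, IS isPot)
  {
    PC             pc;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
    ierr = PCFieldSplitSetIS(pc, "concentration", isConc);CHKERRQ(ierr);
    ierr = PCFieldSplitSetIS(pc, "potential", isPot);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }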

Re: [petsc-users] GAMG scaling

2018-12-21 Thread Matthew Knepley via petsc-users
On Fri, Dec 21, 2018 at 12:55 PM Zhang, Hong wrote: > Matt: > >> Does anyone know how to profile memory usage? >>> >> >> The best serial way is to use Massif, which is part of valgrind. I think >> it might work in parallel if you >> only look at one process at a time. >> > > Can you give an

Re: [petsc-users] GAMG scaling

2018-12-21 Thread Matthew Knepley via petsc-users
On Fri, Dec 21, 2018 at 11:36 AM Zhang, Hong via petsc-users < petsc-users@mcs.anl.gov> wrote: > Fande: > I will explore it and get back to you. > Does anyone know how to profile memory usage? > The best serial way is to use Massif, which is part of valgrind. I think it might work in parallel if

Re: [petsc-users] Problems about Preconditioner in SNES

2018-12-19 Thread Matthew Knepley via petsc-users
On Wed, Dec 19, 2018 at 10:08 AM Yingjie Wu via petsc-users < petsc-users@mcs.anl.gov> wrote: > Respected Petsc developers: > Hi, > Recently, I developed a two-dimensional non-linear equations solver (SNES) > using Petsc to solve temperature, velocity and pressure fields. Since the > coupling

Re: [petsc-users] Dynamically resize the existing PetscVector

2018-12-17 Thread Matthew Knepley via petsc-users
On Mon, Dec 17, 2018 at 10:14 AM Shidi Yan via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hello, > > I am working on adaptive moving mesh problems. Therefore, the PETSc > vector size is constantly changing. > The way I am currently dealing with this change is to destroy the PETSc > vector >

Re: [petsc-users] Periodic domains in DMPlex / petsc4py

2018-12-12 Thread Matthew Knepley via petsc-users
On Wed, Dec 12, 2018 at 9:12 AM Artur Palha Da Silva Clérigo - LR < a.palhadasilvacler...@tudelft.nl> wrote: > Dear Matt, > > Thank you for your quick reply. As for your question: "why would there be > a difference? They should be numbered in the same way.” > > I see the degrees of freedom living

Re: [petsc-users] Periodic domains in DMPlex / petsc4py

2018-12-12 Thread Matthew Knepley via petsc-users
On Wed, Dec 12, 2018 at 9:01 AM Artur Palha Da Silva Clérigo - LR via petsc-users wrote: > Dear All, > > I have been trying to add periodic domain functionality to my code. I am > able to generate a periodic mesh with gmsh and load it using dmplex. The > problem is that I am unable to have a

Re: [petsc-users] How to output VTK file with Vec associating with DMPLEX

2018-12-09 Thread Matthew Knepley via petsc-users
On Sun, Dec 9, 2018 at 7:31 PM Tsung-Hsing Chen wrote: > Vertex-based is what I want. > Here I don't think I have created any PetscSection yet. > Do I need to create the PetscSection, > Yes. There is a manual chapter with this in it. > if I already have a mesh file from gmsh? > That is just
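
A hedged sketch of the vertex-based PetscSection the reply points to, followed by a VTK write of the resulting vector. The file name and helper name are assumptions, and the call to attach the section is named DMSetLocalSection in recent releases (older ones used DMSetSection/DMSetDefaultSection).

  #include <petscdmplex.h>

  /* One scalar dof on every vertex of a DMPlex, then write the field as VTK. */
  PetscErrorCode WriteVertexField(DM dm, const char *fname)
  {
    PetscSection   s;
    Vec            u;
    PetscViewer    viewer;
    PetscInt       vStart, vEnd, v;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); /* vertices */
    ierr = PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s);CHKERRQ(ierr);
    ierr = PetscSectionSetChart(s, vStart, vEnd);CHKERRQ(ierr);
    for (v = vStart; v < vEnd; ++v) {ierr = PetscSectionSetDof(s, v, 1);CHKERRQ(ierr);}
    ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
    ierr = DMSetLocalSection(dm, s);CHKERRQ(ierr);   /* DMSetSection() in older PETSc */
    ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);
    ierr = DMCreateGlobalVector(dm, &u);CHKERRQ(ierr);
    /* ... fill u with the vertex-based field ... */
    ierr = PetscViewerVTKOpen(PetscObjectComm((PetscObject)dm), fname, FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
    ierr = VecView(u, viewer);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
    ierr = VecDestroy(&u);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }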

Re: [petsc-users] VecLoad hdf5

2018-12-07 Thread Matthew Knepley via petsc-users
On Fri, Dec 7, 2018 at 4:17 PM Josh L wrote: > I call VecSetBlockSizes before VecSetSizes instead of after, and it is > working fine now. > It can run with any number of processors. > Is there any reason for this? I am using petsc/3.10 and Fortran. > Yes. VecSetSizes() calculates the division
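
For reference, a sketch of the ordering that worked for the poster, consistent with the explanation above (the C name of the routine is VecSetBlockSize; the helper name is an assumption):

  #include <petscvec.h>

  /* Set the block size before the sizes, so the parallel layout computed by
     VecSetSizes() can respect whole blocks. */
  PetscErrorCode CreateBlockedVec(MPI_Comm comm, PetscInt bs, PetscInt N, Vec *v)
  {
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = VecCreate(comm, v);CHKERRQ(ierr);
    ierr = VecSetBlockSize(*v, bs);CHKERRQ(ierr);
    ierr = VecSetSizes(*v, PETSC_DECIDE, N);CHKERRQ(ierr);
    ierr = VecSetFromOptions(*v);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }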

Re: [petsc-users] Fortran: undefined reference to petscsfdistributesection

2018-12-03 Thread Matthew Knepley via petsc-users
On Mon, Dec 3, 2018 at 3:40 PM Danyang Su wrote: > > On 2018-12-03 12:03 p.m., Matthew Knepley wrote: > > On Mon, Dec 3, 2018 at 2:27 PM Danyang Su wrote: > >> Hi Matt, >> >> Thanks. >> >> BTW: DMPlexGetVertexNumbering now works using the latest develop >> version. But the index is not in

Re: [petsc-users] create a block matrix from existing petsc matrices and/or vectors

2018-12-01 Thread Matthew Knepley via petsc-users
On Fri, Nov 30, 2018 at 10:19 AM NENNIG Benoit via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear petsc users, > > I have a parallel matrix A (mpiaij) and I would like to create a matrix B > like > B = [A, v; wT, 0], > where wT and v are vectors. > A is involved in eigenvalue computation

Re: [petsc-users] RAW binary write

2018-11-30 Thread Matthew Knepley via petsc-users
On Fri, Nov 30, 2018 at 4:03 AM Sal Am wrote: > Hi Matthew, > > by raw I mean something equivalent to pure C++ like > > std::fstream fout("Vector_b.bin", std::ios::out | std::ios::binary); > fout.write((char*)&b[i], sizeof(std::complex<double>)); > fout.close(); // std::vector<std::complex<double>> b > > i.e.
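
For comparison with the raw C++ stream above, a hedged sketch of writing a Vec in PETSc's own binary format, which carries the header that VecLoad() expects; this is an illustration, not necessarily what the truncated reply recommends, and the file name is a placeholder.

  #include <petscvec.h>

  /* Write Vec b in PETSc binary format; it can be read back with VecLoad()
     on any number of processes. */
  PetscErrorCode WriteVecBinary(Vec b)
  {
    PetscViewer    viewer;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = PetscViewerBinaryOpen(PetscObjectComm((PetscObject)b), "Vector_b.petsc", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
    ierr = VecView(b, viewer);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }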

Re: [petsc-users] Error: DM global to natural SF was not created when DMSetUseNatural has already been called

2018-11-28 Thread Matthew Knepley via petsc-users
On Wed, Nov 28, 2018 at 8:58 PM Danyang Su via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear All, > > I got the following error when using the DMPlexGlobalToNatural function > with 1 processor. > We do not create that mapping on 1 proc because the orderings are the same. Reordering happens

Re: [petsc-users] Solving complex linear sparse matrix in parallel + external library

2018-11-28 Thread Matthew Knepley via petsc-users
On Wed, Nov 28, 2018 at 10:26 AM Sal Am wrote: > Thank you indeed, --download-mpich and using PETSC_ARCH/bin/mpiexec seems > to work. > > Now I am wondering about the other problem, namely getting the residual: is > the residual only computed when using iterative solvers? Because using > richardson

Re: [petsc-users] Configure Mumps with 64-bit integers

2018-11-22 Thread Matthew Knepley via petsc-users
This should be sent to the MUMPS list. So far we did not see a way to tell them about the integer type. They might say they rely on Fortran to do it, but we can't since that is unreliable from configure. Thanks, Matt On Thu, Nov 22, 2018 at 3:41 AM Najib Alia via petsc-users <

Re: [petsc-users] GAMG Parallel Performance

2018-11-15 Thread Matthew Knepley via petsc-users
On Thu, Nov 15, 2018 at 11:52 AM Karin via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear PETSc team, > > I am solving a linear transient dynamic problem, based on a discretization > with finite elements. To do that, I am using FGMRES with GAMG as a > preconditioner. I consider here 10 time

Re: [petsc-users] petsc4py help with parallel execution

2018-11-15 Thread Matthew Knepley via petsc-users
On Thu, Nov 15, 2018 at 11:59 AM Ivan Voznyuk wrote: > Hi Matthew, > > Does it mean that by using just the command python3 simple_code.py (without > mpiexec) you *cannot* obtain parallel execution? > As I wrote before, it's not impossible. You could be directly calling PMI, but I do not think you

Re: [petsc-users] PETSc (3.9.0) GAMG weak scaling test issue

2018-11-08 Thread Matthew Knepley via petsc-users
On Thu, Nov 8, 2018 at 6:41 AM "Alberto F. Martín" via petsc-users < petsc-users@mcs.anl.gov> wrote: > Dear Mark, > > thanks for your quick and comprehensive reply. > > Before moving to the results of the experiments that you suggested, let me > clarify two points > in my original e-mail and your

Re: [petsc-users] TAOIPM for AC optimal power flow

2018-11-05 Thread Matthew Knepley via petsc-users
On Mon, Nov 5, 2018 at 3:23 PM Justin Chang via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi everyone, > > I am working on a generic AC optimal power flow solver, and I hope to use > DMNetwork's data structure and TAO's optimization solvers for this purpose. > Last time I inquired about IPM

Re: [petsc-users] Force SNES diverge

2018-11-05 Thread Matthew Knepley via petsc-users
On Mon, Nov 5, 2018 at 8:41 AM Karol Lewandowski via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi, > > I am solving a highly nonlinear problem using SNES solver. Under certain > conditions during the iterations I already know that the step will diverge > (in the next few iterations). Is
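
The reply is cut off in the archive. One mechanism PETSc provides for aborting a Newton solve from inside the residual evaluation is SNESSetFunctionDomainError(); whether that is what was recommended here is not visible in the preview, and the 'hopeless' test below is a placeholder for the user's own criterion.

  #include <petscsnes.h>

  /* Inside the residual callback: flag the current point as unacceptable so
     the SNES stops instead of grinding through iterations known to fail. */
  PetscErrorCode FormFunction(SNES snes, Vec x, Vec f, void *ctx)
  {
    PetscBool      hopeless = PETSC_FALSE;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    /* ... evaluate f(x) and set 'hopeless' from a problem-specific test ... */
    if (hopeless) {ierr = SNESSetFunctionDomainError(snes);CHKERRQ(ierr);}
    PetscFunctionReturn(0);
  }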

Re: [petsc-users] Segmentation violation

2018-10-30 Thread Matthew Knepley via petsc-users
On Tue, Oct 30, 2018 at 1:18 PM Santiago Andres Triana via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi petsc-users, > > I am solving a generalized eigenvalue problem using ex7 in > $SLEPC_DIR/src/eps/examples/tutorials/. I provide the A and B matrices. > The program runs fine, with correct