Re: [petsc-users] Question about SuperLU

2022-06-10 Thread Matthew Knepley
On Thu, Jun 9, 2022 at 5:20 PM Jorti, Zakariae via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi, > > I am solving a non-linear problem that has 5 unknowns {ni, T, E, B, V}, and > for the preconditioning part, I am using a FieldSplit preconditioner. At > the last fieldsplit/level, we are left w

Re: [petsc-users] Writing VTK output

2022-06-08 Thread Matthew Knepley
On Wed, Jun 8, 2022 at 11:24 AM Sami BEN ELHAJ SALAH < sami.ben-elhaj-sa...@ensma.fr> wrote: > Yes, the file "sami.vtu" is loaded correctly in paraview and I have the > correct output, like you. > > In my code, I tried with the same command given in your last answer and I > still have the wrong .vtu f

Re: [petsc-users] PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range

2022-06-07 Thread Matthew Knepley
On Tue, Jun 7, 2022 at 9:51 AM wang yuqi wrote: > Hi, Dear developer: > > I encountered the following problems when I run my code with PETSC-3.5.2: > > > > [46]PETSC ERROR: > > > [46]PETSC ERROR: Caught signal number 11 SEGV

Re: [petsc-users] MatSchurComplementGetPmat woes

2022-06-03 Thread Matthew Knepley
On Fri, Jun 3, 2022 at 9:09 AM Arne Morten Kvarving via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi! > > I have a Chorin pressure correction solver with consistent pressure > update, i.e. > pressure solve is based on the Schur complement > > E = -A10*ainv(A00)*A01 > > with A10 = divergence,

Re: [petsc-users] Sparse linear system solving

2022-06-03 Thread Matthew Knepley
e being performed. Thanks, Matt > The ksp_monitor out for this running (included 15 iterations) using 36 MPI > processes and a file with the memory bandwidth information (testSpeed) are > also attached. We can provide our C++ script if it is needed. > > Thanks a lot! > Best, >

Re: [petsc-users] Mat created by DMStag cannot access ghost points

2022-06-02 Thread Matthew Knepley
On Thu, Jun 2, 2022 at 8:59 AM Patrick Sanan wrote: > Thanks, Barry and Changqing! That seems reasonable to me, so I'll make an > MR with that change. > Hi Patrick, In the MR, could you add that option to all places we internally use Preallocator? I think we mean it for those. Thanks,

Re: [petsc-users] Petsc with mingw64

2022-06-02 Thread Matthew Knepley
For any configure error, you need to send configure.log Thanks, Matt On Thu, Jun 2, 2022 at 5:38 AM hamid badi wrote: > Hi, > > I want to compile petsc with openblas & mumps (sequential) under mingw64. > To do so, I compiled openblas and mumps without any problem. But when it > comes to

Re: [petsc-users] Sparse linear system solving

2022-06-01 Thread Matthew Knepley
openMP threads or many MPI > processes) are attached. > > > Thank you! > Best, > Lidia > > On 31.05.2022 15:21, Matthew Knepley wrote: > > I have looked at the local logs. First, you have run problems of size 12 > and 24. As a rule of thumb, you need 10,000 > v

Re: [petsc-users] Question about DMPlexDistribute & distribute mesh over processes

2022-05-31 Thread Matthew Knepley
h/index.html> > > > > On May 29, 2022 at 18:02, Sami BEN ELHAJ SALAH < > sami.ben-elhaj-sa...@ensma.fr> wrote: > > Hi Matthew, > Thank you for this example. It seems exactly what I am looking for. > Thank you again for your help and have a good day. > Sami

Re: [petsc-users] Mat created by DMStag cannot access ghost points

2022-05-31 Thread Matthew Knepley
On Tue, May 31, 2022 at 10:28 AM Ye Changqing wrote: > Dear developers of PETSc, > > I encountered a problem when using the DMStag module. The program could be > executed perfectly in serial, while errors are thrown out in parallel > (using mpiexec). Some rows in Mat cannot be accessed in local p

Re: [petsc-users] PetscSF Object on Distributed DMPlex for Halo Data Exchange

2022-05-31 Thread Matthew Knepley
led internally. >> >> Thanks, >> >> Matt >> >> >>> Thanks, >>> Mike >>> >>> >>> I will also point out that Toby has created a nice example showing how >>>> to create an SF for halo exchange between lo

Re: [petsc-users] PetscSF Object on Distributed DMPlex for Halo Data Exchange

2022-05-31 Thread Matthew Knepley
ode currently cannot find PetscSFCreateRemoteOffsets(). > I believe if you pass in NULL for remoteOffsets, that function will be called internally. Thanks, Matt > Thanks, > Mike > > On Tue, May 24, 2022 at 8:46 PM, Matthew Knepley wrote: > >> I will also point out that Toby has crea

Re: [petsc-users] How to read and write HDF5 in parallel with PETSc

2022-05-31 Thread Matthew Knepley
On Tue, May 31, 2022 at 1:02 AM 冯宏磊 <12132...@mail.sustech.edu.cn> wrote: > my code is below: > ierr = PetscViewerCreate(PETSC_COMM_WORLD,&h5);CHKERRQ(ierr); > ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD,"explicit.h5", > FILE_MODE_WRITE, &h5);CHKERRQ(ierr); > ierr = PetscObjectSetName((PetscObject
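
A minimal sketch of the parallel pattern being asked about (error checking with CHKERRQ omitted for brevity; "explicit.h5" comes from the snippet above, the dataset name is illustrative). VecView/VecLoad are collective, so the same code does parallel HDF5 I/O with no extra work:

    PetscViewer h5;
    Vec         x; /* assumed to already exist on PETSC_COMM_WORLD */

    /* write: every rank participates; the viewer handles parallel HDF5 I/O */
    PetscViewerHDF5Open(PETSC_COMM_WORLD, "explicit.h5", FILE_MODE_WRITE, &h5);
    PetscObjectSetName((PetscObject)x, "solution"); /* becomes the HDF5 dataset name */
    VecView(x, h5);
    PetscViewerDestroy(&h5);

    /* read back: the Vec must carry the same name as the dataset */
    PetscViewerHDF5Open(PETSC_COMM_WORLD, "explicit.h5", FILE_MODE_READ, &h5);
    VecLoad(x, h5);
    PetscViewerDestroy(&h5);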

Re: [petsc-users] Sparse linear system solving

2022-05-31 Thread Matthew Knepley
I have looked at the local logs. First, you have run problems of size 12 and 24. As a rule of thumb, you need 10,000 variables per process in order to see good speedup. Thanks, Matt On Tue, May 31, 2022 at 8:19 AM Matthew Knepley wrote: > On Tue, May 31, 2022 at 7:39 AM Lidia wr

Re: [petsc-users] Sparse linear system solving

2022-05-31 Thread Matthew Knepley
so threads should not change > anything). > > As Matt said, it is best to start with a PETSc example that does something > like what you want (parallel linear solve, see src/ksp/ksp/tutorials for > examples), and then add your code to it. > That way you get the basic infrastruct

Re: [petsc-users] How to read and write HDF5 in parallel with PETSc

2022-05-30 Thread Matthew Knepley
On Mon, May 30, 2022 at 10:12 PM 冯宏磊 <12132...@mail.sustech.edu.cn> wrote: > Hey there > I'm a new user of PETSc. In use, I want to read and write an HDF5 file in > parallel, but I only found an example of serial reading and writing. How > can I read and write HDF5 in parallel? Can you give me a c

Re: [petsc-users] Sparse linear system solving

2022-05-30 Thread Matthew Knepley
On Mon, May 30, 2022 at 10:12 PM Lidia wrote: > Dear colleagues, > > Is there anyone here who has solved big sparse linear matrices using PETSc? > There are lots of publications with this kind of data. Here is one recent one: https://arxiv.org/abs/2204.01722 > We have found NO performance improveme

Re: [petsc-users] Question about DMPlexDistribute & distribute mesh over processes

2022-05-28 Thread Matthew Knepley
On Sat, May 28, 2022 at 2:19 PM Matthew Knepley wrote: > On Sat, May 28, 2022 at 1:35 PM Sami BEN ELHAJ SALAH < > sami.ben-elhaj-sa...@ensma.fr> wrote: > >> Hi Matthew, >> >> Thank you for your response. >> >> I don't have that. My DM object i

Re: [petsc-users] Question about DMPlexDistribute & distribute mesh over processes

2022-05-28 Thread Matthew Knepley
S) > Institut Pprime - ISAE - ENSMA > Mobile: 06.62.51.26.74 > Email: sami.ben-elhaj-sa...@ensma.fr > www.samibenelhajsalah.com > <https://samiben91.github.io/samibenelhajsalah/index.html> > > > > On May 27, 2022 at 20:45, Matthew Knepley wrote: > > On F

Re: [petsc-users] Question about DMPlexDistribute & distribute mesh over processes

2022-05-27 Thread Matthew Knepley
On Fri, May 27, 2022 at 9:42 AM Sami BEN ELHAJ SALAH < sami.ben-elhaj-sa...@ensma.fr> wrote: > Hello Isaac, > > Thank you for your reply! > > Let me confirm that when I use DMCreateMatrix() with the orig_dm, I got my > jacobian_matrix. Also, I have succeeded to solve my system and my solution > wa

Re: [petsc-users] PetscSF Object on Distributed DMPlex for Halo Data Exchange

2022-05-24 Thread Matthew Knepley
I will also point out that Toby has created a nice example showing how to create an SF for halo exchange between local vectors. https://gitlab.com/petsc/petsc/-/merge_requests/5267 Thanks, Matt On Sun, May 22, 2022 at 9:47 PM Matthew Knepley wrote: > On Sun, May 22, 2022 at 4:28
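
A rough sketch of the halo-exchange pattern under discussion, assuming an already-built PetscSF sf whose roots are owned points and whose leaves are ghosts (rootdata/leafdata are illustrative names; error checking omitted):

    /* push owned (root) values out to the ghost (leaf) copies */
    PetscSFBcastBegin(sf, MPIU_SCALAR, rootdata, leafdata, MPI_REPLACE);
    PetscSFBcastEnd(sf, MPIU_SCALAR, rootdata, leafdata, MPI_REPLACE);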

Re: [petsc-users] PetscSF Object on Distributed DMPlex for Halo Data Exchange

2022-05-22 Thread Matthew Knepley
On Sun, May 22, 2022 at 4:28 PM Mike Michell wrote: > Thanks for the reply. The diagram makes sense and is helpful for > understanding 1D representation. > > However, something is still unclear. From your diagram, the number of > roots per process seems to vary according to run arguments, such as

Re: [petsc-users] PetscSF Object on Distributed DMPlex for Halo Data Exchange

2022-05-21 Thread Matthew Knepley
On Fri, May 20, 2022 at 4:45 PM Mike Michell wrote: > Thanks for the reply. > > > "What I want to do is to exchange data (probably just MPI_Reduce)" which > confuses me, because halo exchange is a point-to-point exchange and not a > reduction. Can you clarify? > PetscSFReduceBegin/End seems to b
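
The reduction mentioned here goes the opposite way from a halo broadcast: it accumulates leaf (ghost) contributions back onto the owning roots. A sketch with the same assumed sf and illustrative array names:

    /* sum ghost (leaf) contributions into the owned (root) values */
    PetscSFReduceBegin(sf, MPIU_SCALAR, leafdata, rootdata, MPIU_SUM);
    PetscSFReduceEnd(sf, MPIU_SCALAR, leafdata, rootdata, MPIU_SUM);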

Re: [petsc-users] Solver/Preconditioner suggestions

2022-05-19 Thread Matthew Knepley
you, > > -Alfredo > > On Thu, May 19, 2022 at 12:31 PM Matthew Knepley > wrote: > >> On Thu, May 19, 2022 at 7:27 AM Alfredo J Duarte Gomez < >> aduar...@utexas.edu> wrote: >> >>> Good afternoon PETSC users, >>> >>> I am looking for s

Re: [petsc-users] Solver/Preconditioner suggestions

2022-05-19 Thread Matthew Knepley
On Thu, May 19, 2022 at 7:27 AM Alfredo J Duarte Gomez wrote: > Good afternoon PETSC users, > > I am looking for some suggestions on preconditioners/solvers. > > Currently, I have a custom preconditioner that solves 4 independent > systems, let's call them A,B,C, and D. > > A is an advective, dif

Re: [petsc-users] DMPlex/PetscSF How to determine if local topology is other rank's ghost?

2022-05-17 Thread Matthew Knepley
On Tue, May 17, 2022 at 6:47 PM Toby Isaac wrote: > A leaf point is attached to a root point (in a star forest there are only > leaves and roots), so that means that a root point would be the point that > owns a degree of freedom and a leaf point would have a ghost value. > > For a "point SF" of

Re: [petsc-users] TS Time Derivative after solved step

2022-05-16 Thread Matthew Knepley
On Mon, May 16, 2022 at 6:48 AM Mark Adams wrote: > You generally want to use > https://petsc.org/main/docs/manualpages/TS/TSMonitorSet/ for > something like this. > TSSetPostStep is for diagnostics. > There are differences between the two but I don't recall them. > Yes, I think this belongs in
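
A minimal sketch of the TSMonitorSet route (the monitor body is hypothetical): a monitor is called after every accepted step with the step number, time, and solution, which is a natural place for this kind of diagnostic:

    static PetscErrorCode MyMonitor(TS ts, PetscInt step, PetscReal t, Vec u, void *ctx)
    {
      PetscFunctionBeginUser;
      /* inspect u here, e.g. difference it against the stored previous step to form du/dt */
      PetscFunctionReturn(0);
    }

    /* in setup code */
    TSMonitorSet(ts, MyMonitor, NULL, NULL);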

Re: [petsc-users] An internal abort when performing LU decomposition.

2022-05-16 Thread Matthew Knepley
On Mon, May 16, 2022 at 5:03 AM Yang Zongze wrote: > Hi, > > > > I am solving a Linear system with LU factorization. But failed with the > following error. > > Are there any suggestions on debugging this error? Thanks! > This appears to be inside MUMPS. I would recommend two things: 1) Get a st
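
For reference, a MUMPS LU solve of this kind is typically wired up as in the sketch below (A, b, x assumed to exist; equivalent to -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps on the command line):

    KSP ksp;
    PC  pc;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A);
    KSPSetType(ksp, KSPPREONLY); /* direct solve only, no Krylov iterations */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCLU);
    PCFactorSetMatSolverType(pc, MATSOLVERMUMPS);
    KSPSetFromOptions(ksp);
    KSPSolve(ksp, b, x);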

Re: [petsc-users] Convergence issues for SNES NASM

2022-05-12 Thread Matthew Knepley
ing I-node routines > maximum iterations=50, maximum function evaluations=1 > tolerances: relative=1e-08, absolute=1e-50, solution=1e-08 > total number of function evaluations=20 > norm schedule ALWAYS > Jacobian is built using a DMDA local Jacobian > problem ex10 on 2

Re: [petsc-users] Convergence issues for SNES NASM

2022-05-12 Thread Matthew Knepley
Your subdomain solves do not appear to be producing descent whatsoever. Possible reasons: 1) Your subdomain Jacobians are wrong (this is usually the problem) 2) You have some global coupling field for which local solves give no descent. (For this you want nonlinear elimination I think) Tha

Re: [petsc-users] [MEMORY LEAK, INTERFACE BUG] Petsc + Boomeramg + Fieldsplit on ARCHER2

2022-05-12 Thread Matthew Knepley
On Thu, May 12, 2022 at 9:09 AM Karabelas, Elias ( elias.karabe...@uni-graz.at) wrote: > Dear Team, > > I ran into some issues using Petsc with Boomeramg and FieldSplit as PC on > the ARCHER2 cluster. > > These are my options for solving a Navier-Stokes-like system and it ran > fine on other clus

Re: [petsc-users] Mysterious error code 77

2022-05-06 Thread Matthew Knepley
On Fri, May 6, 2022 at 9:28 AM Quentin Chevalier < quentin.cheval...@polytechnique.edu> wrote: > Sorry for forgetting the list. Making two matrices was more of a > precaution than a carefully thought-out strategy. > > It would seem the MWE as I provided it above (with a setDimensions to > reduce calc

Re: [petsc-users] Preconditioning Diagnostics

2022-05-03 Thread Matthew Knepley
On Tue, May 3, 2022 at 3:28 PM Barry Smith wrote: > > A difficult question with no easy answers. > > First, do you have a restart system so you can save your state just > before your "bad behavior" and run experiments easily at the bad point? > > You could try to use SLEPc to compute the fi

Re: [petsc-users] Quasi newton

2022-05-03 Thread Matthew Knepley
obust {J}acobian lagging in {N}ewton-type methods}, year = {2013}, booktitle = {International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering}, pages = {2554--2565}, petsc_uses={KSP}, } Thanks, Matt > Qi > > > O

Re: [petsc-users] Quasi newton

2022-05-03 Thread Matthew Knepley
On Tue, May 3, 2022 at 2:58 AM Pierre Seize wrote: > Hi, > > If I may, is this what you want ? > > https://petsc.org/main/docs/manualpages/SNES/SNESSetLagJacobian.html Yes, this is a good suggestion. Also, you could implement an approximation to the Jacobian. You could then improve it at each
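
A sketch of the suggestion, assuming an existing SNES snes: lagging rebuilds the Jacobian only every lag-th Newton iteration, which is often a cheap quasi-Newton-style compromise (also settable with -snes_lag_jacobian):

    /* rebuild the Jacobian every 2nd Newton iteration (1 = every iteration) */
    SNESSetLagJacobian(snes, 2);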

Re: [petsc-users] GMRES for outer solver

2022-05-02 Thread Matthew Knepley
to initialize all variables to 0. Uninitialized vars can be NaN. That is the first place I would look. You can usually find that with compiler warnings. Thanks, Matt > Interrogating the optimized version of the code now… > > > On Mon, May 2, 2022 at 11:11 AM Matthew Knepley

Re: [petsc-users] GMRES for outer solver

2022-05-02 Thread Matthew Knepley
2.html > That turned out to be a bug in their code. Thanks, Matt > On Mon, May 2, 2022 at 7:56 AM Barry Smith wrote: > >> >> >> On May 2, 2022, at 8:12 AM, Matthew Knepley wrote: >> >> On Mon, May 2, 2022 at 12:23 AM Ramakrishnan Thirumalaisamy <

Re: [petsc-users] different periodicity per dof in DMDA?

2022-05-02 Thread Matthew Knepley
On Mon, May 2, 2022 at 12:23 PM Matteo Semplice < matteo.sempl...@uninsubria.it> wrote: > Thanks! > > On 02/05/2022 18:07, Matthew Knepley wrote: > > On Mon, May 2, 2022 at 11:25 AM Matteo Semplice < > matteo.sempl...@uninsubria.it> wrote: > >> Hi. >>

Re: [petsc-users] different periodicity per dof in DMDA?

2022-05-02 Thread Matthew Knepley
On Mon, May 2, 2022 at 11:25 AM Matteo Semplice < matteo.sempl...@uninsubria.it> wrote: > Hi. > > I know that when I create a DMDA I can select periodic b.c. per grid > direction. > > I am facing a PDE with 2 dofs per node in which one dof has periodic > b.c. in the x direction and the other one p
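
For reference, periodicity in a DMDA is selected per grid direction and applies to all dofs at once, as in this sketch (sizes are illustrative), which is why a per-dof mix needs a different approach:

    DM da;
    DMDACreate2d(PETSC_COMM_WORLD,
                 DM_BOUNDARY_PERIODIC, DM_BOUNDARY_NONE, /* periodic in x only */
                 DMDA_STENCIL_STAR,
                 64, 64,                     /* global grid size */
                 PETSC_DECIDE, PETSC_DECIDE, /* process grid */
                 2,                          /* dofs per node: both get the same b.c. */
                 1, NULL, NULL, &da);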

Re: [petsc-users] GMRES for outer solver

2022-05-02 Thread Matthew Knepley
On Mon, May 2, 2022 at 12:23 AM Ramakrishnan Thirumalaisamy < rthirumalaisam1...@sdsu.edu> wrote: > Thank you. I have a couple of questions. I am solving the low Mach > Navier-Stokes system using a projection preconditioner (pc_shell type) with > GMRES being the outer solver and Richardson being t
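
The pc_shell arrangement described here generally looks like the following sketch (the apply routine and names are illustrative, not the poster's actual projection method):

    static PetscErrorCode ApplyProjection(PC pc, Vec x, Vec y)
    {
      PetscFunctionBeginUser;
      /* apply the shell preconditioner: y = P^{-1} x */
      PetscFunctionReturn(0);
    }

    /* in setup code */
    KSPSetType(ksp, KSPGMRES); /* outer solver */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCSHELL);
    PCShellSetApply(pc, ApplyProjection);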

Re: [petsc-users] DMPlex Parallel Output and DMPlexCreateSection Crashes when DMPlexCreateGmshFromFile used

2022-04-29 Thread Matthew Knepley
, we would have to be careful that nothing looked directly at the data. 2) We could reverse the points storage in-place. This is a little more intrusive, but everything would work seamlessly. It would take more work to do this in parallel, but not all that much. Thanks,

Re: [petsc-users] DMPlex Parallel Output and DMPlexCreateSection Crashes when DMPlexCreateGmshFromFile used

2022-04-29 Thread Matthew Knepley
On Fri, Apr 29, 2022 at 8:27 AM Mike Michell wrote: > > Thanks for the answers and I agree. Creating dual mesh when the code > starts and uses that dm will be the easiest way. > But it is confusing how to achieve that. The entire DAG of the original > mesh should change, except for vertices. How

Re: [petsc-users] DMPlex Parallel Output and DMPlexCreateSection Crashes when DMPlexCreateGmshFromFile used

2022-04-28 Thread Matthew Knepley
to dm object. Basically, > my vector objects (x, y, vel) are not seen from dm viewer & relevant output > file. Do you have any recommendations? > > Thanks, > Mike > > On Tue, Apr 26, 2022 at 6:33 PM, Matthew Knepley wrote: > >> On Tue, Apr 26, 2022 at 7:27 PM Mike Michell

Re: [petsc-users] DMPlex Parallel Output and DMPlexCreateSection Crashes when DMPlexCreateGmshFromFile used

2022-04-26 Thread Matthew Knepley
lv); will give you a Vec with the local data in it that can be addressed by mesh point (global vectors too). Also you would be able to communicate this data if the mesh is redistributed, or replicate this data if you overlap cells in parallel. Thanks, Matt > Thanks, > Mike > &g

Re: [petsc-users] DMPlex Parallel Output and DMPlexCreateSection Crashes when DMPlexCreateGmshFromFile used

2022-04-26 Thread Matthew Knepley
ver, the file is okay if I print to "sol.vtu". > From "sol.vtu" I can see the entire field with rank. Is using .vtu format > preferred by petsc? > VTK is generally for debugging, but it should work. I will take a look. VTU and HDF5 are the preferred formats. Thanks,

Re: [petsc-users] DMPlex Parallel Output and DMPlexCreateSection Crashes when DMPlexCreateGmshFromFile used

2022-04-26 Thread Matthew Knepley
parallel, since those checks will only work in serial. I have fixed the code, and added a parallel test. I have attached the new file, but it is also in this MR: https://gitlab.com/petsc/petsc/-/merge_requests/5173 Thanks, Matt > Thanks, > Mike > > On Tue, Apr 26, 2022 at

Re: [petsc-users] VTK format

2022-04-26 Thread Matthew Knepley
On Tue, Apr 26, 2022 at 8:52 AM Kirill Volyanskiy wrote: > Hello, > There is no VTK format part in the code of the VecView > (src/vec/vec/interface/vector.c) function. Therefore TSMonitorSolutionVTK > doesn't work either. > Yes, VTK requires a mesh, so viewing a generic vector will not produce a
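
Since VTK needs a mesh, the Vec has to come from a DM so the viewer knows the layout. A sketch (assuming u was obtained from DMCreateGlobalVector on some dm):

    PetscViewer vtk;
    PetscViewerVTKOpen(PETSC_COMM_WORLD, "sol.vtu", FILE_MODE_WRITE, &vtk);
    VecView(u, vtk); /* works because u carries its DM; a bare Vec has no mesh */
    PetscViewerDestroy(&vtk);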

Re: [petsc-users] DMPlex Parallel Output and DMPlexCreateSection Crashes when DMPlexCreateGmshFromFile used

2022-04-26 Thread Matthew Knepley
On Mon, Apr 25, 2022 at 9:41 PM Mike Michell wrote: > Dear PETSc developer team, > > I'm trying to learn DMPlex to build a parallel finite volume code in 2D & > 3D. More specifically, I want to read a grid from .msh file by Gmsh. > For practice, I modified /dm/impls/plex/ex1f90.F90 case to read &

Re: [petsc-users] Error: Invalid MIT-MAGIC-COOKIE-1

2022-04-24 Thread Matthew Knepley
On Sun, Apr 24, 2022 at 7:36 AM Flavio Riche wrote: > Hi, > > I am new to petsc. When I configure PETSC with > > ./configure complex --with-scalar-type=complex --with-cc=gcc > --with-cxx=g++ --with-fc=gfortran --download-scalapack --download-mumps > --download-fftw --download-mpich --download-fbl

Re: [petsc-users] TSAdapt minimum step and exact final time

2022-04-20 Thread Matthew Knepley
On Wed, Apr 20, 2022 at 5:13 PM Phlipot, Greg wrote: > Hello, > > When using TS with the option TS_EXACT_FINALTIME_MATCHSTEP to force TS > to stop at the final time, I'm seeing the adaptive step controller > choose smaller time steps than the minimum time step that is set with > TSAdaptGetStepLim
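
For reference, a sketch of the two settings involved (assuming an existing TS ts; the hmin/hmax values are illustrative):

    TSAdapt adapt;
    TSSetExactFinalTime(ts, TS_EXACTFINALTIME_MATCHSTEP); /* truncate the last step to land on the final time */
    TSGetAdapt(ts, &adapt);
    TSAdaptSetStepLimits(adapt, 1e-6, 1e-1); /* hmin, hmax for the step controller */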

Re: [petsc-users] CHKERRQ in PETSc 3.17 Fortran

2022-04-18 Thread Matthew Knepley
How did any Fortran tests compile in the CI? Thanks, Matt On Mon, Apr 18, 2022 at 4:09 PM Satish Balay via petsc-users < petsc-users@mcs.anl.gov> wrote: > It's deprecated - but the removal in the Fortran interface was not intentional. > So it's added in > https://gitlab.com/petsc/petsc/-/commi

Re: [petsc-users] Starting in debugger

2022-04-17 Thread Matthew Knepley
e it is functioning. I think no one bothered to really check it out. Matt > For example > > $ brew install valgrind > valgrind: Linux is required for this software. > Error: valgrind: An unsatisfied requirement failed this build. > > > On Apr 17, 2022, at 9:25 PM,

Re: [petsc-users] Starting in debugger

2022-04-17 Thread Matthew Knepley
On Sun, Apr 17, 2022 at 2:59 PM Sanjay Govindjee wrote: > Codesigning is not the issue. My gdb is properly codesigned (here are my > synopsized instructions based off the page you reference but without the > extraneous details http://feap.berkeley.edu/wiki/index.php?title=GDB). > > I think thi

Re: [petsc-users] Input argument out of range with MatZeroRows

2022-04-17 Thread Matthew Knepley
On Fri, Apr 15, 2022 at 7:07 PM Jennifer Ellen Fromm wrote: > Thank you for your reply, with petsc4py the only error message I get is: > > Traceback (most recent call last): > File "../../../exhume-fenics-prototype/demos/exhume_poisson.py", line > 228, in > solveKSP(dR_b,R_b,u_p, method=LI
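
At the C level underneath petsc4py, an "out of range" error from MatZeroRows typically means a row index is negative or at least the global row count. A sketch that builds the list from the ownership range (the choice of row is illustrative):

    PetscInt rStart, rEnd, rows[1];
    MatGetOwnershipRange(A, &rStart, &rEnd);
    rows[0] = rStart; /* zero the first locally owned row, e.g. for a boundary condition */
    MatZeroRows(A, 1, rows, 1.0, NULL, NULL); /* put 1.0 on the diagonal of zeroed rows */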

Re: [petsc-users] What does PCASMSetOverlap do?

2022-04-14 Thread Matthew Knepley
ains >> -pc_gasm_overlap 4 >> Inner subdomain: >> 0 1 2 3 4 >> Outer subdomain: >> 0 1 2 3 4 5 6 7 8 >> Inner subdomain: >> 5 6 7 8 >> Outer subdomain: >> 0 1 2 3 4 5 6 7 8 >> >> Thanks, >> Pierre >> >> Thank you very m

Re: [petsc-users] What does PCASMSetOverlap do?

2022-04-13 Thread Matthew Knepley
nerate an >> overlap algebraically which is equivalent to the overlap you would have >> gotten geometrically. >> If you know that “geometric” overlap (or want to use a custom definition >> of overlap), you could use >> https://petsc.org/release/docs/manualpages/PC/PCA

Re: [petsc-users] What does PCASMSetOverlap do?

2022-04-13 Thread Matthew Knepley
On Wed, Apr 13, 2022 at 9:11 AM Mark Adams wrote: > > > On Wed, Apr 13, 2022 at 8:56 AM Matthew Knepley wrote: > >> On Wed, Apr 13, 2022 at 6:42 AM Mark Adams wrote: >> >>> No, without overlap you have, let say: >>> core 1: 1:32, 1:32 >>> co

Re: [petsc-users] What does PCASMSetOverlap do?

2022-04-13 Thread Matthew Knepley
On Wed, Apr 13, 2022 at 6:42 AM Mark Adams wrote: > No, without overlap you have, let say: > core 1: 1:32, 1:32 > core 2: 33:64, 33:64 > > Overlap will increase the size of each domain so you get: > core 1: 1:33, 1:33 > core 2: 32:65, 32:65 > I do not think this is correct. Here is the
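
A sketch of how the overlap in question is requested in code (equivalent to -pc_asm_overlap 1 on the command line):

    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCASM);
    PCASMSetOverlap(pc, 1); /* grow each subdomain by one layer, computed algebraically */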

Re: [petsc-users] Local refinements of tetrahedron elements

2022-04-12 Thread Matthew Knepley
l. > Even here you do not get edge-nested meshes. Matt > Best regards, > Ce > > > > Matthew Knepley wrote on Tue, Apr 12, 2022 at 18:47: > >> On Tue, Apr 12, 2022 at 2:10 AM Ce Qin wrote: >> >>> Thanks for your reply, Matthew. >>> >>> One more

Re: [petsc-users] DMCloning from a DMPlex has changed in Petsc-3.17.0?

2022-04-12 Thread Matthew Knepley
, for custom things, turning off the automatic stuff might be the best option. Thanks, Matt > Thanks, best, Berend. > > > > On 4/12/22 12:49, Matthew Knepley wrote: > > On Tue, Apr 12, 2022 at 2:50 AM Berend van Wachem > > mailto:berend.vanwac...@ovgu.de>&

Re: [petsc-users] DMCloning from a DMPlex has changed in Petsc-3.17.0?

2022-04-12 Thread Matthew Knepley
bute_overlap - The size of the overlap halo from https://petsc.org/main/docs/manualpages/DM/DMSetFromOptions.html Thanks, Matt > Many thanks, best regards, > > Berend. > > > > > On 4/11/22 16:23, Matthew Knepley wrote: > > On Wed, Apr 6, 2022 at 9:41 AM Bere

Re: [petsc-users] [KSP] solveTranspose fails with Strumpack and SuperLU_dist

2022-04-11 Thread Matthew Knepley
On Mon, Apr 11, 2022 at 2:33 PM Jean Marques wrote: > Thank you very much for your inputs. > > Matthew, this LS is a part of a rSVD algorithm (Halko et al, SIAM Review, > 2009), hence I need to compute direct and adjoints system solutions. > The reason I asked was to understand whether direct so

Re: [petsc-users] DMCloning from a DMPlex has changed in Petsc-3.17.0?

2022-04-11 Thread Matthew Knepley
out you have. Also, the call to DMPlexDistribute() here (and the Partitioner calls) are now superfluous. Thanks, Matt > Many thanks for looking into this, best regards, > Berend. > > > > On 4/4/22 23:05, Matthew Knepley wrote: > > On Mon, Apr 4, 2022 at 3:36

Re: [petsc-users] Local refinements of tetrahedron elements

2022-04-11 Thread Matthew Knepley
On Fri, Apr 1, 2022 at 10:14 AM Ce Qin wrote: > Dear all, > > I want to implement the adaptive finite element method using the DMPlex > interface. So I would like to know whether DMPlex supports local (also > hierarchical) refinements of tetrahedron elements. I found that there is an > adaptation
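
The label-driven adaptation interface referenced in this thread looks roughly like the sketch below (the marked cell is illustrative; whether a given cell type and adaptor combination is supported depends on the build):

    DMLabel adaptLabel;
    DM      dmAdapt;
    DMLabelCreate(PETSC_COMM_SELF, "adapt", &adaptLabel);
    DMLabelSetValue(adaptLabel, c, DM_ADAPT_REFINE); /* mark cell c for refinement */
    DMAdaptLabel(dm, adaptLabel, &dmAdapt);
    DMLabelDestroy(&adaptLabel);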

Re: [petsc-users] [KSP] solveTranspose fails with Strumpack and SuperLU_dist

2022-04-09 Thread Matthew Knepley
On Sat, Apr 9, 2022 at 7:41 PM Jean Marques wrote: > Hi all, > > This may be a naive question, and I hope this is the right place to ask > about it. > I need to solve a direct linear system with a sparse matrix R, then an > adjoint system the hermitian of R. > > I use a petsc4py, so what I do is
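
The two solves being asked about, sketched at the C level (petsc4py exposes the same pair; for complex scalars the Hermitian case additionally needs the conjugation handled, which is part of what the thread is about):

    KSPSolve(ksp, b, x);          /* solves R x   = b */
    KSPSolveTranspose(ksp, c, y); /* solves R^T y = c */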

Re: [petsc-users] error 2 in QAMD : Schur size expected ...

2022-04-08 Thread Matthew Knepley
On Fri, Apr 8, 2022 at 12:57 PM Aleksandra Grudskaia < agru...@mpa-garching.mpg.de> wrote: > Dear PETSC team, > > Sometimes I get the error > > Internal error 2 in QAMD : Schur size expected: 0 Real: 1 > This is an internal error in MUMPS, so we cannot control it. I would submit this to the MUMP

Re: [petsc-users] question

2022-04-07 Thread Matthew Knepley
On Thu, Apr 7, 2022 at 8:16 AM 高亚贺 via petsc-users wrote: > Dear Mr./Ms., > > > I have used ‘DMCreateMatrix’ to create a matrix *K*, and also the > ‘DMCreateGlobalVector’ to create two vectors *U* (to be solved) and *F > *(right-hand > side), i.e. *KU*=*F*. Now, I want to add some complex constr
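
The basic KU = F workflow the question starts from, as a sketch (assembly of K and F elided):

    Mat K;
    Vec U, F;
    KSP ksp;
    DMCreateMatrix(dm, &K);
    DMCreateGlobalVector(dm, &U);
    DMCreateGlobalVector(dm, &F);
    /* ... fill K and F ... */
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, K, K);
    KSPSetFromOptions(ksp);
    KSPSolve(ksp, F, U);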

Re: [petsc-users] Matrix preallocation - d_nz and o_nz

2022-04-07 Thread Matthew Knepley
On Thu, Apr 7, 2022 at 6:12 AM Gabriela Nečasová wrote: > Dear PETSc team, > > I would like to ask you a question about the matrix preallocation. > I am using the routine MatMPIAIJSetPreallocation(). > > Example: The matrix A has the size 18 x 18 with 168 nonzeros: > A = > 106.21 -91.667

Re: [petsc-users] DMCloning from a DMPlex has changed in Petsc-3.17.0?

2022-04-04 Thread Matthew Knepley
On Mon, Apr 4, 2022 at 3:36 PM Berend van Wachem wrote: > Dear Petsc team, > > Since about 2 years we have been using Petsc with DMPlex, but since > upgrading our code to Petsc-3.17.0 something has broken. > > First we generate a DM from a DMPlex with DMPlexCreateFromFile or > creating one with

Re: [petsc-users] MatSetNullSpace results in nan with MUMPS

2022-04-03 Thread Matthew Knepley
On Sat, Apr 2, 2022 at 8:59 PM Bhargav Subramanya < bhargav.subrama...@kaust.edu.sa> wrote: > Dear All, > > I am trying to solve Ax = b in parallel using MUMPS, where x is composed > of velocity, pressure, and temperature. There is a null space due to the > homogeneous Neumann pressure boundary co

Re: [petsc-users] Allocating the diagonal for MatMPIAIJSetPreallocation

2022-04-01 Thread Matthew Knepley
you consider it in the first way it makes > sense that it would be nxn. > The idea here is that the internal structure of P does not matter. It has the same interface as the matrix A, so from your point of view they are identical. Thanks, Matt > On Fri, Apr 1, 2022 at 12:00 PM Ma

Re: [petsc-users] Allocating the diagonal for MatMPIAIJSetPreallocation

2022-04-01 Thread Matthew Knepley
lues again > a second time to actually set the values of the parallel Mat you actually > use to solve the system? > Yes. Thanks, Matt > On Fri, Apr 1, 2022 at 11:50 AM Matthew Knepley wrote: > >> On Fri, Apr 1, 2022 at 12:45 PM Samuel Estes >> wrote: >>

Re: [petsc-users] Allocating the diagonal for MatMPIAIJSetPreallocation

2022-04-01 Thread Matthew Knepley
< rEnd. So if you know (r, c) for each nonzero, you know whether it is in the diagonal block. Thanks, Matt > On Fri, Apr 1, 2022 at 11:34 AM Matthew Knepley wrote: > >> On Fri, Apr 1, 2022 at 12:27 PM Samuel Estes >> wrote: >> >>> Hi, >>> &g

Re: [petsc-users] Allocating the diagonal for MatMPIAIJSetPreallocation

2022-04-01 Thread Matthew Knepley
On Fri, Apr 1, 2022 at 12:27 PM Samuel Estes wrote: > Hi, > > I have a problem in which I know (roughly) the number of non-zero entries > for each row of a matrix but I don't have a convenient way of determining > whether they belong to the diagonal or off-diagonal part of the parallel > matrix.
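
A sketch of the classification that resolves this (the counting loop is illustrative): a nonzero (r, c) in an owned row lands in the diagonal block exactly when c falls inside the owned column range:

    PetscInt rStart, rEnd, cStart, cEnd;
    PetscInt *d_nnz, *o_nnz; /* assumed allocated and zeroed, one entry per local row */
    MatGetOwnershipRange(A, &rStart, &rEnd);
    MatGetOwnershipRangeColumn(A, &cStart, &cEnd);
    /* for each known nonzero (r, c) with rStart <= r < rEnd: */
    if (c >= cStart && c < cEnd) d_nnz[r - rStart]++;
    else                         o_nnz[r - rStart]++;
    /* then preallocate with the per-row counts */
    MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);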

Re: [petsc-users] Memory leak when combining PETSc-based vectors and boost::odeint

2022-04-01 Thread Matthew Knepley
> *std::cerr << vecSize << '\t' << ierr << '\n';* > *local_vec = PETSC_NULL;* > * }* > > which should set *local_vec* to *PETSC_NULL* as soon as it is no longer > in use. > You must be
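
For reference, the leak-free pattern in plain PETSc C (setting the handle to PETSC_NULL only drops your pointer, it does not free the object; VecDestroy does both):

    Vec local_vec;
    VecCreateSeq(PETSC_COMM_SELF, n, &local_vec); /* n assumed defined */
    /* ... use local_vec ... */
    VecDestroy(&local_vec); /* frees the Vec and sets local_vec to NULL */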

Re: [petsc-users] Memory leak when combining PETSc-based vectors and boost::odeint

2022-04-01 Thread Matthew Knepley
/home/roland/Downloads/git-files/petsc/src/vec/vec/interface/rvector.c:1780 > > I do not understand why it tries to access the vector, even though it has > been set to PETSC_NULL in the previous free-call. > > What code is setting that pointer to NULL? Thanks, Matt > Regar

Re: [petsc-users] MatMult method

2022-03-31 Thread Matthew Knepley
On Thu, Mar 31, 2022 at 10:11 AM Medane TCHAKOROM < medane.tchako...@univ-fcomte.fr> wrote: > Hello, > > I got one issue with MatMult method that I do not understand. > > Whenever I multiply Matrix A by vector b (as shown below), the printed > result > > shows a value with an exponent that is far a

Re: [petsc-users] Memory leak when combining PETSc-based vectors and boost::odeint

2022-03-31 Thread Matthew Knepley
fferent PETSC_ARCH configures, and switch at runtime with that variable. Thanks, Matt > Regards, > Roland Richter > > Am 31.03.22 um 15:35 schrieb Matthew Knepley: > > On Thu, Mar 31, 2022 at 9:01 AM Roland Richter > wrote: > >> Hei, >> >> Thank

Re: [petsc-users] Memory leak when combining PETSc-based vectors and boost::odeint

2022-03-31 Thread Matthew Knepley
anks, Matt > Regards, > > Roland Richter > Am 31.03.22 um 12:14 schrieb Matthew Knepley: > > On Thu, Mar 31, 2022 at 5:58 AM Roland Richter > wrote: > >> Hei, >> >> For a project I wanted to combine boost::odeint for timestepping and >> PETSc-based

Re: [petsc-users] Memory leak when combining PETSc-based vectors and boost::odeint

2022-03-31 Thread Matthew Knepley
On Thu, Mar 31, 2022 at 5:58 AM Roland Richter wrote: > Hei, > > For a project I wanted to combine boost::odeint for timestepping and > PETSc-based vectors and matrices for calculating the right hand side. As > comparison for both timing and correctness I set up an armadillo-based > right hand si

Re: [petsc-users] DMSwarm

2022-03-23 Thread Matthew Knepley
On Wed, Mar 23, 2022 at 11:09 AM Joauma Marichal < joauma.maric...@uclouvain.be> wrote: > Hello, > > I sent an email last week about an issue I had with DMSwarm but did not > get an answer yet. If there is any other information needed or anything I > could try to solve it, I would be happy to do t

Re: [petsc-users] MatCreateSBAIJ

2022-03-22 Thread Matthew Knepley
> On Tue, Mar 22, 2022 at 1:21 PM Matthew Knepley wrote: > >> On Tue, Mar 22, 2022 at 4:16 PM Sam Guo wrote: >> >>> Here is one memory comparison (memory in MB) >>> np=1 np=2 np=4 np=8 np=16 >>> shell 1614 1720 1874 1673 1248 >>> PETSc(using full

Re: [petsc-users] MatCreateSBAIJ

2022-03-22 Thread Matthew Knepley
On Tue, Mar 22, 2022 at 4:16 PM Sam Guo wrote: > Here is one memory comparison (memory in MB):
>                                 np=1  np=2  np=4  np=8  np=16
> shell                           1614  1720  1874  1673  1248
> PETSc (using full matrix)       2108  2260  2364  2215  1734
> PETSc (using symmetric matrix)  1750  2100  2189  2094  1727
> Those are the total water mark memo

Re: [petsc-users] Null space and preconditioners

2022-03-22 Thread Matthew Knepley
> > > > Thanks again! > Great! I am happy everything is working. Matt > Marco Cisternino > > > > > > *From:* Matthew Knepley > *Sent:* martedì 22 marzo 2022 15:22 > *To:* Marco Cisternino > *Cc:* Barry Smith ; petsc-users@mcs.anl.gov > *Subject:* Re

Re: [petsc-users] Null space and preconditioners

2022-03-22 Thread Matthew Knepley
On Tue, Mar 22, 2022 at 9:55 AM Marco Cisternino < marco.cistern...@optimad.it> wrote: > Thank you Barry! > No, no reason for FGMRES (some old tests showed shorter wall-times > relative to GMRES), I’m going to use GMRES. > I tried GMRES with GAMG using PCSVD on the coarser level on real cases, > l

Re: [petsc-users] PetscSection and DMPlexVTKWriteAll in parallel

2022-03-21 Thread Matthew Knepley
On Mon, Mar 21, 2022 at 11:22 AM Ferrand, Jesus A. wrote: > Greetings. > > I am having trouble exporting a vertex-based solution field to ParaView > when I run my PETSc script in parallel (see screenshots). The smoothly > changing field is produced by my serial runs whereas the "messed up" one is

Re: [petsc-users] Null space and preconditioners

2022-03-21 Thread Matthew Knepley
ull space components can be introduced by the rest of the preconditioner, but when I use range-space smoothers and local interpolation it tends to be much better for me. Maybe it is just my problems. Thanks, Matt > Thank you all. > > > > Marco Cisternino > > > >

Re: [petsc-users] Null space and preconditioners

2022-03-21 Thread Matthew Knepley
On Mon, Mar 21, 2022 at 12:06 PM Mark Adams wrote: > The solution for Neumann problems can "float away" if the constant is not > controlled in some way because floating point errors can introduce it even > if your RHS is exactly orthogonal to it. > > You should use a special coarse grid solver fo
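
A sketch of attaching the constant null space for such a Neumann problem (assuming the assembled operator A):

    MatNullSpace nsp;
    MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp); /* PETSC_TRUE: contains the constant */
    MatSetNullSpace(A, nsp);
    MatNullSpaceDestroy(&nsp); /* the Mat keeps its own reference */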

Re: [petsc-users] Regarding the status of VecSetValues(Blocked) for GPU vectors

2022-03-18 Thread Matthew Knepley
id that you can add to for off processor values and then > you could use the CPU communication in DM. > > > It would be GPU communication, not CPU. > >Matt > > > On Thu, Mar 17, 2022 at 7:19 PM Matthew Knepley wrote: > On Thu, Mar 17, 2022 at 4:46 PM Sajid

Re: [petsc-users] Regarding the status of VecSetValues(Blocked) for GPU vectors

2022-03-17 Thread Matthew Knepley
unication in DM. > It would be GPU communication, not CPU. Matt > On Thu, Mar 17, 2022 at 7:19 PM Matthew Knepley wrote: > >> On Thu, Mar 17, 2022 at 4:46 PM Sajid Ali Syed wrote: >> >>> Hi PETSc-developers, >>> >>> Is it possible to use VecSetVa

Re: [petsc-users] Regarding the status of VecSetValues(Blocked) for GPU vectors

2022-03-17 Thread Matthew Knepley
On Thu, Mar 17, 2022 at 4:46 PM Sajid Ali Syed wrote: > Hi PETSc-developers, > > Is it possible to use VecSetValues with distributed-memory CUDA & Kokkos > vectors from the device, i.e. can I call VecSetValues with GPU memory > pointers and expect PETSc to figure out how to stash on the device it

Re: [petsc-users] [Ext] Re: Two simple questions on building

2022-03-16 Thread Matthew Knepley
ou would install it anywhere else. Then install PETSc in the container. I have done that for another project and got it to work. Thanks, Matt > Cheers, > > > > Ernesto. > > > > *From:* Matthew Knepley > *Sent:* Wednesday, March 16, 2022 5:45 AM > *To:*

Re: [petsc-users] Two simple questions on building

2022-03-16 Thread Matthew Knepley
On Wed, Mar 16, 2022 at 1:04 AM Ernesto Prudencio via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hi. > > > > I have an application that uses MKL for some convolution operations. Such > MKL functionality uses, I suppose, BLAS/LAPACK underneath. > > > > This same application of mine also uses P

Re: [petsc-users] Arbitrary ownership IS for a matrix

2022-03-16 Thread Matthew Knepley
error at that time. Thanks, Matt > Thanks in advance, > Nicolas > > On Thu, Mar 10, 2022 at 2:50 AM Nicolás Barnafi wrote: > > > > Thank you both very much, it is exactly what I needed. > > > > Best regards > > > > On Wed, Mar 9, 2022, 2

Re: [petsc-users] DMSwarm

2022-03-15 Thread Matthew Knepley
On Tue, Mar 15, 2022 at 9:33 AM Joauma Marichal < joauma.maric...@uclouvain.be> wrote: > Hello, > > I am writing to you as I am trying to implement a Lagrangian Particle > Tracking method to my eulerian solver that relies on a 3D DMDA. To that > end, I want to use the DMSwarm library but cannot fi
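
The usual DMSwarm-on-DMDA setup being asked about starts roughly like this sketch (field name and sizes are illustrative; da is the existing background DMDA):

    DM swarm;
    DMCreate(PETSC_COMM_WORLD, &swarm);
    DMSetType(swarm, DMSWARM);
    DMSetDimension(swarm, 3);
    DMSwarmSetType(swarm, DMSWARM_PIC); /* particle-in-cell mode */
    DMSwarmSetCellDM(swarm, da);        /* attach the background mesh */
    DMSwarmRegisterPetscDatatypeField(swarm, "mass", 1, PETSC_REAL);
    DMSwarmFinalizeFieldRegister(swarm);
    DMSwarmSetLocalSizes(swarm, 100, 10); /* initial particles per rank, buffer */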

Re: [petsc-users] How to obtain the local matrix from a global matrix in DMDA?

2022-03-14 Thread Matthew Knepley
On Mon, Mar 14, 2022 at 11:05 AM liluo wrote: > Dear developers, > > > I defined subdomain problems that have some layers of ghost points, which > are exactly of the "local" size of a DA. > > And I want to generate the corresponding submatrix in each subdomain which > contains those layers of gho

Re: [petsc-users] How to obtain the local matrix from a global matrix in DMDA?

2022-03-14 Thread Matthew Knepley
On Mon, Mar 14, 2022 at 10:35 AM liluo wrote: > Dear developers, > > I created a DMDA object and obtain the local vector from a global > vector by using DMGlobalToLocalBegin/End. > > How can I obtain the corresponding local matrix from the global matrix? > > In my case, the global matrix is of MP
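
For the vector analogue this builds on, the ghost layers are filled as in the sketch below (the matrix counterpart is the open question in the thread):

    Vec gx, lx;
    DMCreateGlobalVector(da, &gx);
    DMCreateLocalVector(da, &lx); /* local vector includes the ghost layers */
    DMGlobalToLocalBegin(da, gx, INSERT_VALUES, lx);
    DMGlobalToLocalEnd(da, gx, INSERT_VALUES, lx);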

Re: [petsc-users] Fieldsplit for 4 fields

2022-03-12 Thread Matthew Knepley
On Fri, Mar 11, 2022 at 3:32 PM Tang, Qi wrote: > Hi, > I am trying to solve a four field system with three nested fieldsplit (I > am in petsc/dmstag directly). I think I have all the IS info in the > original system. I am wondering how to set up IS for the split system. > Some questions first:
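
A sketch of wiring existing index sets into a fieldsplit PC (split names and IS variables are illustrative; nesting repeats the same pattern on the inner KSPs):

    PC pc;
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCFIELDSPLIT);
    PCFieldSplitSetIS(pc, "u", isU); /* one call per field */
    PCFieldSplitSetIS(pc, "p", isP);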

Re: [petsc-users] Reuse MUMPS factorization

2022-03-12 Thread Matthew Knepley
On Sat, Mar 12, 2022 at 1:01 PM Bhargav Subramanya < bhargav.subrama...@kaust.edu.sa> wrote: > Dear All, > > I have the following two queries: > > 1. I am running simulations using MUMPS through a job submission system. > Since the job run time is limited, I need to restart the simulations > perio
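
Within a single run the factorization is already reused automatically, as sketched below; persisting MUMPS factors across separate job submissions is the part with no ready-made support:

    KSPSetOperators(ksp, A, A);
    KSPSolve(ksp, b1, x1); /* first solve triggers the (expensive) factorization */
    KSPSolve(ksp, b2, x2); /* same operator: the stored factors are reused */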

Re: [petsc-users] Arbitrary ownership IS for a matrix

2022-03-09 Thread Matthew Knepley
On Wed, Mar 9, 2022 at 5:13 PM Barry Smith wrote: > > You need to do a mapping of your global numbering to the standard PETSc > numbering and use the PETSc numbering for all access to vectors and > matrices. > >https://petsc.org/release/docs/manualpages/AO/AOCreate.html provides > one appro
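
A sketch of the AO mapping Barry points to (array contents are illustrative): create the correspondence once, then translate application indices to PETSc indices before any Vec/Mat access:

    AO       ao;
    PetscInt napp = 3, myapp[3] = {42, 7, 19}; /* this rank's IDs in the application ordering */
    PetscInt idx[2] = {42, 19};
    AOCreateBasic(PETSC_COMM_WORLD, napp, myapp, NULL, &ao); /* NULL: PETSc side is the natural ordering */
    AOApplicationToPetsc(ao, 2, idx); /* idx now holds PETSc numbering, converted in place */
    AODestroy(&ao);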
