Re: [petsc-users] DMPlex periodic face coordinates

2024-05-15 Thread Matteo Semplice
On 14/05/24 15:02, Matthew Knepley wrote: On Tue, May 14, 2024 at 12:14 AM Matteo Semplice wrote: Dear petsc-users, I am playing with DMPlexGetCellCoordinates and observing that it returns correct periodic coordinates for cells, but not for faces. More precisely, adding

[petsc-users] DMPlex periodic face coordinates

2024-05-13 Thread Matteo Semplice
Dear petsc-users,     I am playing with DMPlexGetCellCoordinates and observing that it returns correct periodic coordinates for cells, but not for faces. More precisely, adding PetscCall(DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd));     for (f = fStart; f < fEnd; ++f) {   const PetscScalar *array;
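The snippet above can be fleshed out into a face loop that queries per-face coordinates; a minimal sketch, assuming a recent PETSc in which DMPlexGetCellCoordinates also accepts face points (the printing is just for illustration):

```c
#include <petscdmplex.h>

/* Sketch: visit each face of a DMPlex and fetch its (possibly periodic)
   coordinates. Assumes dm already carries a coordinate DM. */
static PetscErrorCode PrintFaceCoordinates(DM dm)
{
  PetscInt fStart, fEnd, f;

  PetscFunctionBeginUser;
  PetscCall(DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd)); /* height 1 = faces */
  for (f = fStart; f < fEnd; ++f) {
    const PetscScalar *array;
    PetscScalar       *coords = NULL;
    PetscInt           numCoords;
    PetscBool          isDG;

    PetscCall(DMPlexGetCellCoordinates(dm, f, &isDG, &numCoords, &array, &coords));
    PetscCall(PetscPrintf(PETSC_COMM_SELF, "face %" PetscInt_FMT ": %" PetscInt_FMT " coordinate values\n", f, numCoords));
    PetscCall(DMPlexRestoreCellCoordinates(dm, f, &isDG, &numCoords, &array, &coords));
  }
  PetscFunctionReturn(PETSC_SUCCESS);
}
```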

Re: [petsc-users] PETSc error only in debug build

2023-04-17 Thread Matteo Semplice
On 17 Apr 2023, at 6:22 PM, Matteo Semplice wrote: Dear PETSc users,     I am investigating a strange error occurring when using my code on a cluster; I managed to reproduce it on my machine a

[petsc-users] PETSc error only in debug build

2023-04-17 Thread Matteo Semplice
Dear PETSc users,     I am investigating a strange error occurring when using my code on a cluster; I managed to reproduce it on my machine as well and it's weird: - on petsc 3.19, optimized build, the code runs fine, serial and parallel - on petsc 3.19, --with-debugging=1, the code crashes

Re: [petsc-users] interpreting data from SNESSolve profiling

2023-02-09 Thread Matteo Semplice
ch time).    Barry On Feb 8, 2023, at 7:56 AM, Matteo Semplice wrote: Dear all,     I am trying to optimize the nonlinear solvers in a code of mine, but I am having a hard time at interpreting the profiling data from the SNES. In particular, if I run with -snesCorr

[petsc-users] interpreting data from SNESSolve profiling

2023-02-08 Thread Matteo Semplice
Dear all,     I am trying to optimize the nonlinear solvers in a code of mine, but I am having a hard time at interpreting the profiling data from the SNES. In particular, if I run with -snesCorr_snes_lag_jacobian 5 -snesCorr_snes_linesearch_monitor -snesCorr_snes_monitor
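The option prefix in the flags above suggests a SNES configured roughly like this; a sketch in which the `snesCorr_` prefix and the lag value 5 come from the options shown, while everything else is assumed:

```c
#include <petscsnes.h>

/* Sketch: a SNES controlled by the "snesCorr_" options prefix, so that
   -snesCorr_snes_lag_jacobian 5 and -snesCorr_snes_monitor apply to it. */
static PetscErrorCode SetupCorrectorSolver(MPI_Comm comm, SNES *snes)
{
  PetscFunctionBeginUser;
  PetscCall(SNESCreate(comm, snes));
  PetscCall(SNESSetOptionsPrefix(*snes, "snesCorr_"));
  /* Equivalent to -snesCorr_snes_lag_jacobian 5: rebuild the Jacobian
     only every 5 nonlinear iterations. */
  PetscCall(SNESSetLagJacobian(*snes, 5));
  PetscCall(SNESSetFromOptions(*snes));
  PetscFunctionReturn(0);
}
```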

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2023-01-12 Thread Matteo Semplice
On 23/12/22 17:14, Matthew Knepley wrote: On Thu, Dec 22, 2022 at 3:08 PM Matteo Semplice wrote: On 22/12/22 20:06, Dave May wrote: On Thu 22. Dec 2022 at 10:27, Matteo Semplice wrote: Dear Dave and Matt,     I am really dealing with two different

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2022-12-22 Thread Matteo Semplice
On 22/12/22 20:06, Dave May wrote: On Thu 22. Dec 2022 at 10:27, Matteo Semplice wrote: Dear Dave and Matt,     I am really dealing with two different use cases in a code that will compute a levelset function passing through a large set of points. If I had

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2022-12-22 Thread Matteo Semplice
them back). Thanks     Matteo On 22/12/22 18:40, Dave May wrote: Hey Matt, On Thu 22. Dec 2022 at 05:02, Matthew Knepley wrote: On Thu, Dec 22, 2022 at 6:28 AM Matteo Semplice wrote: Dear all     please ignore my previous email and read this one: I have

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2022-12-22 Thread Matteo Semplice
On 22/12/22 14:02, Matthew Knepley wrote: On Thu, Dec 22, 2022 at 6:28 AM Matteo Semplice wrote: Dear all     please ignore my previous email and read this one: I have better localized the problem. Maybe DMSwarmMigrate is designed to migrate particles only to first

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2022-12-22 Thread Matteo Semplice
Dear all     please ignore my previous email and read this one: I have better localized the problem. Maybe DMSwarmMigrate is designed to migrate particles only to first neighbouring ranks? On 22/12/22 11:44, Matteo Semplice wrote: Dear everybody,     I have dug a bit into the code

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2022-12-22 Thread Matteo Semplice
Dear everybody,     I have dug a bit into the code and I am able to add more information. On 02/12/22 12:48, Matteo Semplice wrote: Hi. I am sorry to take this up again, but further tests show that it's not right yet. On 04/11/22 12:48, Matthew Knepley wrote: On Fri, Nov 4, 2022

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2022-12-02 Thread Matteo Semplice
Hi. I am sorry to take this up again, but further tests show that it's not right yet. On 04/11/22 12:48, Matthew Knepley wrote: On Fri, Nov 4, 2022 at 7:46 AM Matteo Semplice wrote: On 04/11/2022 02:43, Matthew Knepley wrote: On Thu, Nov 3, 2022 at 8:36 PM Matthew Knepley

[petsc-users] localToGlobal with MIN_VALUES ?

2022-11-30 Thread Matteo Semplice
Hi. In DMLocalToGlobal only INSERT_VALUES or ADD_VALUES appear to be allowed. Is there a way to perform localToGlobal (or localtolocal) communications inserting the minimum value instead? Best     Matteo
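As a possible workaround (a sketch, not an endorsed API): VecScatterBegin/End accept MIN_VALUES in recent PETSc versions, so one can drive the DMDA's global-to-local scatter in reverse by hand; whether MIN_VALUES is supported depends on the PETSc version, so treat this as an assumption to verify:

```c
#include <petscdmda.h>

/* Sketch: emulate "localToGlobal with MIN_VALUES" on a DMDA by using the
   underlying global<->local scatter in reverse. Assumes a PETSc version in
   which VecScatterBegin accepts MIN_VALUES as InsertMode. */
static PetscErrorCode LocalToGlobalMin(DM da, Vec locv, Vec globv)
{
  VecScatter gtol; /* global -> local scatter owned by the DMDA */

  PetscFunctionBeginUser;
  PetscCall(DMDAGetScatter(da, &gtol, NULL));
  /* SCATTER_REVERSE turns it into local -> global; MIN_VALUES keeps the
     smallest contribution when several ranks write the same global entry. */
  PetscCall(VecScatterBegin(gtol, locv, globv, MIN_VALUES, SCATTER_REVERSE));
  PetscCall(VecScatterEnd(gtol, locv, globv, MIN_VALUES, SCATTER_REVERSE));
  PetscFunctionReturn(0);
}
```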

Re: [petsc-users] locate DMSwarm particles with respect to a background DMDA mesh

2022-11-04 Thread Matteo Semplice
-- Prof. Matteo Semplice Università degli Studi dell’Insubria

Re: [petsc-users] different periodicity per dof in DMDA?

2022-05-02 Thread Matteo Semplice
Thanks! On 02/05/2022 18:07, Matthew Knepley wrote: On Mon, May 2, 2022 at 11:25 AM Matteo Semplice wrote: Hi. I am facing a PDE with 2 dofs per node in which one dof has periodic b.c. in the x direction and the other one periodic b.c. in the y direction

[petsc-users] different periodicity per dof in DMDA?

2022-05-02 Thread Matteo Semplice
Hi. I know that when I create a DMDA I can select periodic b.c. per grid direction. I am facing a PDE with 2 dofs per node in which one dof has periodic b.c. in the x direction and the other one periodic b.c. in the y direction. Is there a (possibly quick-and-dirty) solution to represent
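Since a single DMDA has one boundary type per direction, shared by all dofs, one hedged workaround is two separate 1-dof DMDAs with different boundary types (grid sizes and names here are placeholders):

```c
#include <petscdmda.h>

/* Sketch: represent two dofs with different periodicity by two separate
   1-dof DMDAs on the same Mx x My grid. */
static PetscErrorCode CreateSplitDAs(MPI_Comm comm, PetscInt Mx, PetscInt My, DM *daU, DM *daV)
{
  PetscFunctionBeginUser;
  /* dof u: periodic in x only */
  PetscCall(DMDACreate2d(comm, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_STAR, Mx, My, PETSC_DECIDE, PETSC_DECIDE,
                         1, 1, NULL, NULL, daU));
  PetscCall(DMSetUp(*daU));
  /* dof v: periodic in y only */
  PetscCall(DMDACreate2d(comm, DM_BOUNDARY_NONE, DM_BOUNDARY_PERIODIC,
                         DMDA_STENCIL_STAR, Mx, My, PETSC_DECIDE, PETSC_DECIDE,
                         1, 1, NULL, NULL, daV));
  PetscCall(DMSetUp(*daV));
  PetscFunctionReturn(0);
}
```

Note that PETSc may pick different parallel layouts for the two DMs, so the process grid may need to be fixed explicitly if the two fields must be co-located.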

Re: [petsc-users] VecView DMDA and HDF5 - Unable to write out files properly

2021-10-14 Thread Matteo Semplice
On 14/10/21 14:37, Matthew Knepley wrote: On Wed, Oct 13, 2021 at 6:30 PM Abhishek G.S. <gsabhishek1...@gmail.com> wrote: Hi, I need some help with getting the file output working right. I am using a DMDACreate3D to initialize my DM. This is my write function

Re: [petsc-users] Mat preallocation for SNES jacobian [WAS Re: Mat preallocation in case of variable stencils]

2021-09-07 Thread Matteo Semplice
he incorrect default behavior after the fact will work.   Barry On Sep 6, 2021, at 7:34 PM, Matthew Knepley <knep...@gmail.com> wrote: On Mon, Sep 6, 2021 at 12:22 PM Matteo Semplice <matteo.sempl...@uninsubria.it> wrote: On 31/08/21 17:32, Jed B

[petsc-users] Mat preallocation for SNES jacobian [WAS Re: Mat preallocation in case of variable stencils]

2021-09-06 Thread Matteo Semplice
On 31/08/21 17:32, Jed Brown wrote: Matteo Semplice writes: Hi. We are writing a code for a FD scheme on an irregular domain and thus the local stencil is quite variable: we have inner nodes, boundary nodes and inactive nodes, each with their own stencil type and offset with respect

[petsc-users] Mat preallocation in case of variable stencils

2021-08-31 Thread Matteo Semplice
Hi. We are writing a code for a FD scheme on an irregular domain and thus the local stencil is quite variable: we have inner nodes, boundary nodes and inactive nodes, each with their own stencil type and offset with respect to the grid node. We currently create a matrix with DMCreateMatrix
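One way to handle a variable stencil without hand-counting nonzeros is MATPREALLOCATOR: run the insertion loop twice, first into a preallocator matrix and then into the real one. A sketch under those assumptions (the stencil loop itself is application-specific and only indicated by a comment; `nlocal`/`N` are placeholder sizes):

```c
#include <petscmat.h>

/* Sketch: preallocate an AIJ matrix for an irregular stencil using
   MATPREALLOCATOR. nlocal/N are the local/global row counts. */
static PetscErrorCode CreatePreallocatedMatrix(MPI_Comm comm, PetscInt nlocal, PetscInt N, Mat *A)
{
  Mat prealloc;

  PetscFunctionBeginUser;
  PetscCall(MatCreate(comm, &prealloc));
  PetscCall(MatSetSizes(prealloc, nlocal, nlocal, N, N));
  PetscCall(MatSetType(prealloc, MATPREALLOCATOR));
  PetscCall(MatSetUp(prealloc));
  /* ... run the same MatSetValues() loop used for assembly, inserting
     (zero) entries for each node's own stencil ... */
  PetscCall(MatAssemblyBegin(prealloc, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(prealloc, MAT_FINAL_ASSEMBLY));

  PetscCall(MatCreate(comm, A));
  PetscCall(MatSetSizes(*A, nlocal, nlocal, N, N));
  PetscCall(MatSetType(*A, MATAIJ));
  /* Copy the nonzero pattern; PETSC_TRUE also fills it with zeros so
     later insertions never trigger a malloc. */
  PetscCall(MatPreallocatorPreallocate(prealloc, PETSC_TRUE, *A));
  PetscCall(MatDestroy(&prealloc));
  PetscFunctionReturn(0);
}
```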

Re: [petsc-users] parallel HDF5 output of DMDA data with dof>1

2021-07-23 Thread Matteo Semplice
s will be fine for DMDA but I cannot say if it is appropriate for all types of DMs in all circumstances. Barry On Jul 15, 2021, at 10:44 AM, Matteo Semplice wrote: Hi. When I write (HDF5 viewer) a vector associated to a DMDA with 1 dof, the output is independent of the number of cpus used. Ho

Re: [petsc-users] parallel HDF5 output of DMDA data with dof>1

2021-07-21 Thread Matteo Semplice
ersion of VecView, at least when the Vec is associated with a DMDA. Of course it might just be that I didn't manage to write a correct xdmf, but I can't spot the mistake... I am of course available to run tests in order to find/fix this problem. Best     Matteo On 16/07/21 12:27, Matteo Semplice wrote

Re: [petsc-users] parallel HDF5 output of DMDA data with dof>1

2021-07-16 Thread Matteo Semplice
On 15/07/21 17:44, Matteo Semplice wrote: Hi. When I write (HDF5 viewer) a vector associated to a DMDA with 1 dof, the output is independent of the number of cpus used. However, for a DMDA with dof=2, the output seems to be correct when I run on 1 or 2 cpus, but is scrambled when I

[petsc-users] parallel HDF5 output of DMDA data with dof>1

2021-07-15 Thread Matteo Semplice
Hi. When I write (HDF5 viewer) a vector associated to a DMDA with 1 dof, the output is independent of the number of cpus used. However, for a DMDA with dof=2, the output seems to be correct when I run on 1 or 2 cpus, but is scrambled when I run with 4 cpus. Judging from the ranges of the
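For reference, the writing side under discussion is essentially the following sketch (file name and dataset name are placeholders; it assumes PETSc was configured with HDF5 support):

```c
#include <petscdmda.h>
#include <petscviewerhdf5.h>

/* Sketch: write a DMDA-attached global Vec to an HDF5 file. */
static PetscErrorCode WriteSolution(DM da, Vec u)
{
  PetscViewer viewer;

  PetscFunctionBeginUser;
  PetscCall(PetscObjectSetName((PetscObject)u, "u")); /* dataset name in the file */
  PetscCall(PetscViewerHDF5Open(PetscObjectComm((PetscObject)da), "sol.h5", FILE_MODE_WRITE, &viewer));
  PetscCall(VecView(u, viewer));
  PetscCall(PetscViewerDestroy(&viewer));
  PetscFunctionReturn(0);
}
```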

Re: [petsc-users] output DMDA to hdf5 file?

2021-07-15 Thread Matteo Semplice
On 15/07/21 14:26, Matthew Knepley wrote: On Thu, Jul 15, 2021 at 8:20 AM Matteo Semplice <matteo.sempl...@uninsubria.it> wrote: On 15/07/21 14:15, Matthew Knepley wrote: On Thu, Jul 15, 2021 at 6:39 AM Matteo Semplice <matteo.sempl...@uninsub

Re: [petsc-users] output DMDA to hdf5 file?

2021-07-15 Thread Matteo Semplice
On 15/07/21 14:15, Matthew Knepley wrote: On Thu, Jul 15, 2021 at 6:39 AM Matteo Semplice <matteo.sempl...@uninsubria.it> wrote: On 12/07/21 17:51, Matthew Knepley wrote: On Mon, Jul 12, 2021 at 11:40 AM Matteo Semplice <matteo.sempl...@uninsub

Re: [petsc-users] output DMDA to hdf5 file?

2021-07-15 Thread Matteo Semplice
On 12/07/21 17:51, Matthew Knepley wrote: On Mon, Jul 12, 2021 at 11:40 AM Matteo Semplice <matteo.sempl...@uninsubria.it> wrote: Dear all,     I am experimenting with hdf5+xdmf output. At https://www.xdmf.org/index.php/XDMF_Model_and_Format

[petsc-users] output DMDA to hdf5 file?

2021-07-12 Thread Matteo Semplice
Dear all,     I am experimenting with hdf5+xdmf output. At https://www.xdmf.org/index.php/XDMF_Model_and_Format I read that "XDMF uses XML to store Light data and to describe the data Model. Either HDF5[3] or binary files can be used to store Heavy data. The

[petsc-users] best way to output in parallel data from DMDA (levelset) finite difference simulation

2021-07-09 Thread Matteo Semplice
Dear all,     it seems it should be a fairly straightforward thing to do but I am struggling with the output of my finite difference simulation. I have tried adapting my XML ascii VTK output routines that work nicely for finite volumes (but I have issues at points where 4 CPU subdomains

Re: [petsc-users] MatNest with Shell blocks for multiphysics

2021-07-01 Thread Matteo Semplice
Thank you, Matthew and Barry! I can now see a way forward. On 01/07/21 21:42, Barry Smith wrote: I do not understand how creating a DMDA with n0+n1 dofs will let me easily reuse my shell preconditioner code on the top-left block. PCFIELDSPLIT (and friends) do not order the dof by

Re: [petsc-users] MatNest with Shell blocks for multiphysics

2021-07-01 Thread Matteo Semplice
On 01/07/21 17:52, Jed Brown wrote: I think ex28 is better organization of code. You can DMCreateMatrix() and then set types/preallocation for off-diagonal blocks of the MatNest. I think the comment is unclear and not quite what was intended and originally worked (which was to assemble

[petsc-users] MatNest with Shell blocks for multiphysics

2021-07-01 Thread Matteo Semplice
Hi. We are designing a PETSc application that will employ a SNES solver on a multiphysics problem whose Jacobian will have a 2x2 block form, say A=[A00,A01;A10,A11]. We already have code for the top-left block A00 (a MatShell and a related Shell preconditioner) that we wish to reuse. We
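The intended structure can be sketched like this (names, sizes, and the shell multiply routine are placeholders; the real blocks would come from the existing code):

```c
#include <petscmat.h>

/* Hypothetical shell MatMult for the existing A00 block. */
static PetscErrorCode MyA00Mult(Mat A, Vec x, Vec y)
{
  void *ctx;
  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(A, &ctx));
  /* ... apply the existing operator to x, storing the result in y ... */
  PetscFunctionReturn(0);
}

/* Sketch: assemble a 2x2 MatNest whose (0,0) block is a MatShell.
   n0 is the local size of the first block; A01/A10/A11 assembled elsewhere. */
static PetscErrorCode BuildJacobian(MPI_Comm comm, PetscInt n0, void *ctx,
                                    Mat A01, Mat A10, Mat A11, Mat *A)
{
  Mat A00, blocks[4];

  PetscFunctionBeginUser;
  PetscCall(MatCreateShell(comm, n0, n0, PETSC_DETERMINE, PETSC_DETERMINE, ctx, &A00));
  PetscCall(MatShellSetOperation(A00, MATOP_MULT, (void (*)(void))MyA00Mult));
  blocks[0] = A00; blocks[1] = A01; blocks[2] = A10; blocks[3] = A11;
  PetscCall(MatCreateNest(comm, 2, NULL, 2, NULL, blocks, A));
  PetscFunctionReturn(0);
}
```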

Re: [petsc-users] shell preconditioner for Schur complement

2021-02-11 Thread Matteo Semplice
are now set and we can concentrate on our routine. Thanks a lot Matthew and Barry! Matteo & Elena -- Prof. Matteo Semplice Università degli Studi dell’Insubria Dipartimento di Scienza e Alta Tecnologia – DiSAT Professore Associato Via Valleggio, 11 – 22100 Como (CO) – Italia tel.: +39 031 2386316

[petsc-users] shell preconditioner for Schur complement

2021-02-10 Thread Matteo Semplice
Dear PETSc users,     we are trying to program a preconditioner for the Schur complement of a Stokes system, but it seems that the r.h.s. for the Schur complement system differs from what we expect by a scale factor, which we don't understand. Our setup has a system matrix A divided in 2x2
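A shell preconditioner for the Schur block is typically attached to the PC of the Schur-complement KSP (obtainable from PCFieldSplitSchurGetSubKSP); a minimal sketch of the shell part, with the apply routine body left as a placeholder:

```c
#include <petscksp.h>

/* Hypothetical apply routine: y = M^{-1} x for our Schur approximation. */
static PetscErrorCode MySchurPCApply(PC pc, Vec x, Vec y)
{
  void *ctx;
  PetscFunctionBeginUser;
  PetscCall(PCShellGetContext(pc, &ctx));
  /* ... apply the approximate Schur inverse using data in ctx ... */
  PetscFunctionReturn(0);
}

/* Sketch: turn a given PC into a shell with our apply routine. */
static PetscErrorCode SetupSchurShell(PC pc, void *ctx)
{
  PetscFunctionBeginUser;
  PetscCall(PCSetType(pc, PCSHELL));
  PetscCall(PCShellSetContext(pc, ctx));
  PetscCall(PCShellSetApply(pc, MySchurPCApply));
  PetscFunctionReturn(0);
}
```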

Re: [petsc-users] removing cells from existing DMPLEX

2018-06-09 Thread Matteo Semplice
On 08/06/2018 20:43, Matthew Knepley wrote: On Fri, Jun 8, 2018 at 5:10 AM, Matteo Semplice <matteo.sempl...@unito.it> wrote: Hi. We are developing a ghost-fluid type scheme, so we start creating a (cartesian) grid and then "mark" some cells in which t

[petsc-users] removing cells from existing DMPLEX

2018-06-08 Thread Matteo Semplice
Hi. We are developing a ghost-fluid type scheme, so we start creating a (cartesian) grid and then "mark" some cells in which the evolution actually takes place and solve a PDE only there. We currently handle the grid via a DMPLEX; then, through a Label that marks the fluid cells, we define a
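The label-based marking described above looks roughly like the following sketch; the label name "fluid" and the marking predicate are assumptions:

```c
#include <petscdmplex.h>

/* Sketch: mark "fluid" cells with a label, then iterate over them only. */
static PetscErrorCode MarkAndLoopFluidCells(DM dm)
{
  PetscInt        cStart, cEnd, c, n;
  IS              fluidIS;
  const PetscInt *cells;

  PetscFunctionBeginUser;
  PetscCall(DMCreateLabel(dm, "fluid"));
  PetscCall(DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd)); /* height 0 = cells */
  for (c = cStart; c < cEnd; ++c) {
    if (c % 2 == 0) /* placeholder predicate for "cell is fluid" */
      PetscCall(DMSetLabelValue(dm, "fluid", c, 1));
  }
  /* Retrieve all cells marked 1 and loop over them only. */
  PetscCall(DMGetStratumIS(dm, "fluid", 1, &fluidIS));
  PetscCall(ISGetLocalSize(fluidIS, &n));
  PetscCall(ISGetIndices(fluidIS, &cells));
  /* ... assemble/solve only on cells[0..n) ... */
  PetscCall(ISRestoreIndices(fluidIS, &cells));
  PetscCall(ISDestroy(&fluidIS));
  PetscFunctionReturn(0);
}
```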

[petsc-users] petsc_gen_xdmf.py error

2018-03-14 Thread Matteo Semplice
ve me any insight on this? (Later I will need to output and visualize in Paraview more than one Vec and they are associated different DA's. Is creating an h5 file for each DA and output to that DM all the Vec associated to that DA the preferred way to go?) Best,     Matteo Semplice

[petsc-users] local/global Vec for SNES function/jacobian

2018-01-08 Thread Matteo Semplice
with the same x argument? Thanks,     Matteo Semplice
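The usual pattern inside a SNES residual callback is to take the global x that SNES provides, scatter it into a borrowed local vector to obtain ghost values, and compute from that; a sketch:

```c
#include <petscsnes.h>
#include <petscdm.h>

/* Sketch of a SNES residual: SNES hands us a GLOBAL x; ghost values are
   obtained by a global-to-local scatter into a borrowed local vector. */
static PetscErrorCode FormFunction(SNES snes, Vec x, Vec F, void *ctx)
{
  DM  dm;
  Vec xl;

  PetscFunctionBeginUser;
  PetscCall(SNESGetDM(snes, &dm));
  PetscCall(DMGetLocalVector(dm, &xl));
  PetscCall(DMGlobalToLocalBegin(dm, x, INSERT_VALUES, xl));
  PetscCall(DMGlobalToLocalEnd(dm, x, INSERT_VALUES, xl));
  /* ... read xl (including ghosts), write the GLOBAL residual F ... */
  PetscCall(DMRestoreLocalVector(dm, &xl));
  PetscFunctionReturn(0);
}
```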

Re: [petsc-users] preallocation after DMCreateMatrix?

2017-12-05 Thread Matteo Semplice
On 04/12/2017 17:01, Matthew Knepley wrote: On Fri, Dec 1, 2017 at 5:50 AM, Matteo Semplice <matteo.sempl...@unito.it> wrote: Thanks for the fix! (If you need a volunteer for testing the bug-fix, drop me a line) Cool. It's in next, a

Re: [petsc-users] preallocation after DMCreateMatrix?

2017-12-01 Thread Matteo Semplice
the code go for now.   Thanks,      Matt On Wed, Nov 29, 2017 at 6:36 AM, Matteo Semplice <matteo.sempl...@unito.it> wrote: On 29/11/2017 12:46, Matthew Knepley wrote: On Wed, Nov 29, 2017 at 2:26 AM, Matteo Semplice <matteo.se

Re: [petsc-users] preallocation after DMCreateMatrix?

2017-11-29 Thread Matteo Semplice
On 29/11/2017 12:46, Matthew Knepley wrote: On Wed, Nov 29, 2017 at 2:26 AM, Matteo Semplice <matteo.sempl...@unito.it> wrote: On 25/11/2017 02:05, Matthew Knepley wrote: On Fri, Nov 24, 2017 at 4:21 PM, Matteo Semplice <matteo.se

Re: [petsc-users] preallocation after DMCreateMatrix?

2017-11-29 Thread Matteo Semplice
On 25/11/2017 02:05, Matthew Knepley wrote: On Fri, Nov 24, 2017 at 4:21 PM, Matteo Semplice <matteo.sempl...@unito.it> wrote: Hi. The manual for DMCreateMatrix says "Notes: This properly preallocates the number of nonzeros in the

[petsc-users] preallocation after DMCreateMatrix?

2017-11-24 Thread Matteo Semplice
Hi. The manual for DMCreateMatrix says "Notes: This properly preallocates the number of nonzeros in the sparse matrix so you do not need to do it yourself", so I got the impression that one does not need to call the preallocation routine for the matrix and indeed in most examples listed in

Re: [petsc-users] indices into Vec/Mat associated to a DMPlex

2017-11-15 Thread Matteo Semplice
On 15/11/2017 11:39, Matthew Knepley wrote: On Wed, Nov 15, 2017 at 3:11 AM, Matteo Semplice <matteo.sempl...@unito.it> wrote: Hi. I am struggling with indices into matrices associated to a DMPlex mesh. I can explain my problem to t

[petsc-users] indices into Vec/Mat associated to a DMPlex

2017-11-15 Thread Matteo Semplice
Hi. I am struggling with indices into matrices associated to a DMPlex mesh. I can explain my problem with the following minimal example. Let's say I want to assemble the matrix to solve an equation (say Laplace) with data attached to cells and the finite volume method. In principle I - loop
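For cell-attached data the matrix row/column indices come from the DM's sections; a sketch of the lookup in current PETSc naming (older versions call this DMGetDefaultGlobalSection), assuming one scalar dof per cell:

```c
#include <petscdmplex.h>

/* Sketch: global matrix row for the dof stored on cell c. With the global
   section, a negative offset means the point is owned by another rank and
   encodes -(off+1). */
static PetscErrorCode CellGlobalRow(DM dm, PetscInt c, PetscInt *row)
{
  PetscSection gsec;
  PetscInt     off;

  PetscFunctionBeginUser;
  PetscCall(DMGetGlobalSection(dm, &gsec));
  PetscCall(PetscSectionGetOffset(gsec, c, &off));
  *row = (off >= 0) ? off : -(off + 1);
  PetscFunctionReturn(0);
}
```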

[petsc-users] DMPlex distribution and loops over cells/faces for finite volumes

2017-11-09 Thread Matteo Semplice
Hi. I am using a DMPlex to store a grid (at present 2d simplicial but will need to work more in general), but after distributing it, I am finding it hard to organize my loops over the cells/faces. First things first: the mesh distribution. I do   DMPlexSetAdjacencyUseCone(dm, PETSC_TRUE);
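After distribution, cell and face loops typically run over height strata, with each face reaching its neighbouring cells via the support; a sketch of the finite-volume-style traversal:

```c
#include <petscdmplex.h>

/* Sketch: FV-style traversal of a distributed DMPlex.
   Height 0 = cells, height 1 = faces. */
static PetscErrorCode LoopFaces(DM dm)
{
  PetscInt fStart, fEnd, f;

  PetscFunctionBeginUser;
  PetscCall(DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd));
  for (f = fStart; f < fEnd; ++f) {
    const PetscInt *cells;
    PetscInt        nsupp;

    PetscCall(DMPlexGetSupportSize(dm, f, &nsupp));
    PetscCall(DMPlexGetSupport(dm, f, &cells));
    if (nsupp == 2) {
      /* interior face: flux between cells[0] and cells[1] */
    } else {
      /* boundary (or partition-boundary) face */
    }
  }
  PetscFunctionReturn(0);
}
```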

saving parallel vectors

2006-11-24 Thread Matteo Semplice
? Matteo On Thu, 23 Nov 2006, Barry Smith wrote: NEVER use ascii for large data sets. Use the binary viewer to save them. See PetscViewerBinaryOpen(). Barry On Thu, 23 Nov 2006, Matteo Semplice wrote: This is really a newbie question but I am struggling to solve out of memory troubles

saving parallel vectors

2006-11-23 Thread Matteo Semplice
This is really a newbie question but I am struggling to solve out of memory troubles on my first processor. I have a parallel global vector, say v, that I obtain with a call to DAGetGlobalVector. I save it to disk opening an ASCII standard viewer on PETSC_COMM_WORLD and calling VecView. As I
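The binary save/load pattern is short; a sketch written with current PETSc idioms (the file name is a placeholder):

```c
#include <petscvec.h>

/* Sketch: save a parallel Vec in PETSc binary format and read it back.
   The binary viewer streams data through rank 0 in chunks, so no single
   rank ever needs to hold the whole vector. */
static PetscErrorCode SaveAndReload(MPI_Comm comm, Vec v, Vec w)
{
  PetscViewer viewer;

  PetscFunctionBeginUser;
  PetscCall(PetscViewerBinaryOpen(comm, "v.dat", FILE_MODE_WRITE, &viewer));
  PetscCall(VecView(v, viewer));
  PetscCall(PetscViewerDestroy(&viewer));

  PetscCall(PetscViewerBinaryOpen(comm, "v.dat", FILE_MODE_READ, &viewer));
  PetscCall(VecLoad(w, viewer)); /* w must already have its comm (and optionally layout) set */
  PetscCall(PetscViewerDestroy(&viewer));
  PetscFunctionReturn(0);
}
```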