Re: [petsc-users] Parallel DMPlex Local to Global Mapping

2022-01-13 Thread Matthew Knepley
On Wed, Jan 12, 2022 at 7:55 PM Ferrand, Jesus A. wrote:
> Dear PETSc Team:
>
> Hi! I'm working on a parallel version of a PETSc script that I wrote in
> serial using DMPlex. After calling DMPlexDistribute() each rank is assigned
> its own DAG where the points are numbered locally. For example,
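
For reference, a minimal sketch (assuming PETSc ~3.16 and its ierr/CHKERRQ conventions) of one way to recover a local-to-global point numbering after distribution; the options-driven mesh setup is illustrative, not taken from the thread:

  #include <petscdmplex.h>

  int main(int argc, char **argv)
  {
    DM             dm, dmDist = NULL;
    IS             globalNums;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);
    ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);
    ierr = DMSetFromOptions(dm);CHKERRQ(ierr); /* e.g. run with -dm_plex_shape box */
    ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);
    if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
    /* One global number per local DAG point; points this rank does not own
       are encoded as -(global+1). */
    ierr = DMPlexCreatePointNumbering(dm, &globalNums);CHKERRQ(ierr);
    ierr = ISView(globalNums, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
    ierr = ISDestroy(&globalNums);CHKERRQ(ierr);
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }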

[petsc-users] Help letter from PETSc3.6 user

2022-01-13 Thread 佟莹
Dear PETSc developers:

Recently the following problem appeared in my code:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Overflow in integer operation:

Re: [petsc-users] Help letter from PETSc3.6 user

2022-01-13 Thread Satish Balay via petsc-users
Try: https://petsc.org/release/faq/#when-should-can-i-use-the-configure-option-with-64-bit-indices

Also best to use the current release, 3.16.

Satish

On Thu, 13 Jan 2022, 佟莹 wrote:
> Dear PETSc developers:
> Recently the following problem appeared in my code:
>
> [0]PETSC ERROR:
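
The overflow typically means the default 32-bit PetscInt cannot hold the problem size. Per the FAQ above, the fix is to reconfigure with 64-bit indices (other configure options elided):

  ./configure --with-64-bit-indices=1
  make all

After this, PetscInt is a 64-bit integer throughout, so the application must pass PetscInt (not raw int) for all integer arguments to PETSc routines.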

Re: [petsc-users] PETSc configuration error on macOS Monterey with Intel oneAPI

2022-01-13 Thread Danyang Su
Hi Samar,

Yes, with mpich, there is no such error. I will just use this configuration for now.

Thanks,

Danyang

From: Samar Khatiwala
Date: Thursday, January 13, 2022 at 1:16 AM
To: Danyang Su
Cc: PETSc
Subject: Re: [petsc-users] PETSc configuration error on macOS Monterey with
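
For context, the workaround is to let PETSc build its own MPICH instead of using the Intel MPI wrappers; a typical configure line (the oneAPI compiler names here are illustrative, not from the thread):

  ./configure --with-cc=icc --with-cxx=icpc --with-fc=ifort --download-mpich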

Re: [petsc-users] PCSetCoordinates does not set coordinates of sub PC (fieldsplit) objects

2022-01-13 Thread Nicolás Barnafi
Dear all,

I have created a first implementation. For now it must be called after setting the fields; eventually I would like to move it to the setup phase. The implementation seems clean, but it is giving me some memory errors ("free(): corrupted unsorted chunks"). You may find the code below.
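
For orientation, a minimal sketch of the idea (not the code from the thread): pull the sub-KSPs out of the fieldsplit PC and pass coordinates to each sub-PC. One classic source of "free(): corrupted unsorted chunks" here is the array returned by PCFieldSplitGetSubKSP(), which the caller must release with PetscFree():

  #include <petscksp.h>

  static PetscErrorCode SetSubPCCoordinates(PC pc, PetscInt dim, PetscInt nloc, PetscReal coords[])
  {
    KSP           *subksp;  /* array allocated by PETSc */
    PetscInt       n, i;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = PCFieldSplitGetSubKSP(pc, &n, &subksp);CHKERRQ(ierr);
    for (i = 0; i < n; ++i) {
      PC subpc;
      ierr = KSPGetPC(subksp[i], &subpc);CHKERRQ(ierr);
      /* In a real code the coordinate array should be restricted to the
         unknowns belonging to this split. */
      ierr = PCSetCoordinates(subpc, dim, nloc, coords);CHKERRQ(ierr);
    }
    ierr = PetscFree(subksp);CHKERRQ(ierr); /* free the array, not the KSPs */
    PetscFunctionReturn(0);
  }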

Re: [petsc-users] PCSetCoordinates does not set coordinates of sub PC (fieldsplit) objects

2022-01-13 Thread Matthew Knepley
On Thu, Jan 13, 2022 at 1:15 PM Nicolás Barnafi wrote:
> Dear all,
>
> I have created a first implementation. For now it must be called after
> setting the fields; eventually I would like to move it to the setup phase.
> The implementation seems clean, but it is giving me some memory errors

[petsc-users] Strange CUDA failure with a second petscfinalize with PETSc 3.16

2022-01-13 Thread Hao DONG
Dear All,

I have encountered a peculiar problem when fiddling with a code with PETSc 3.16.3 (which worked fine with PETSc 3.15). It is a very straightforward PDE-based optimization code which repeatedly solves a linearized PDE problem with KSP in a subroutine (the rest of the code does not
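
While PETSc nominally supports repeated PetscInitialize()/PetscFinalize() cycles, a common structural workaround for problems like this is to initialize and finalize exactly once at the program level, guarding repeated entry. A sketch (not Hao's code):

  #include <petscsys.h>

  /* Safe to call more than once; only the first call initializes PETSc. */
  static PetscErrorCode EnsurePetscUp(int *argc, char ***argv)
  {
    PetscBool      initialized;
    PetscErrorCode ierr;

    ierr = PetscInitialized(&initialized);CHKERRQ(ierr);
    if (!initialized) {
      ierr = PetscInitialize(argc, argv, NULL, NULL);CHKERRQ(ierr);
    }
    return 0;
  }

PetscFinalize() is then called once, just before the program exits, rather than inside the solver subroutine.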

Re: [petsc-users] PETSc configuration error on macOS Monterey with Intel oneAPI

2022-01-13 Thread Samar Khatiwala
Hi Danyang,

Just to reiterate, the presence of -Wl,-flat_namespace *is* the problem. I got rid of it by configuring mpich with --enable-two-level-namespace. I reported this problem to the PETSc folks a few weeks ago and they were going to patch MPICH.py (under
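
For reference, the MPICH-side configure flag mentioned above (the prefix path is illustrative):

  ./configure --enable-two-level-namespace --prefix=$HOME/mpich
  make && make install

This builds MPICH with macOS's default two-level namespace linking, so the -Wl,-flat_namespace flag never ends up in the compiler wrappers that PETSc's configure then inspects.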