Got it. Thank you for your explanation!
Best wishes,
Zongze
On Mon, 15 May 2023 at 23:28, Matthew Knepley wrote:
> On Mon, May 15, 2023 at 9:55 AM Zongze Yang wrote:
>
>> On Mon, 15 May 2023 at 17:24, Matthew Knepley wrote:
>>
>>> On Sun, May 14, 2023 at 7:23 PM Zongze Yang
>>> wrote:
>>>
Thank you Matt and Samar,
It seems the segfaults I see are related to icc, which is no longer being
updated. The recommended Intel C compiler is icx, which has not been released
for Macs. I compiled the lib with gcc 13 + OpenMPI from Homebrew and the tests
are passing just fine in optimized mode. I
Hi Marcos,
Yes, I compiled with clang instead of icc (no particular reason for this; I
tend to use gcc/clang). I use MPICH 4.1.1, which I first built with clang and
ifort:
FC=ifort
./configure --prefix=/usr/local/mpich4 --enable-two-level-namespace
Samar
On May 15, 2023, at 6:07 PM,
On Mon, May 15, 2023 at 1:04 PM Vanella, Marcos (Fed) <
marcos.vane...@nist.gov> wrote:
> Hi Matt, attached is the file.
>
Okay, you are failing in this function
PetscErrorCode PetscGetArchType(char str[], size_t slen)
{
PetscFunctionBegin;
#if defined(PETSC_ARCH)
Hi Samar, what MPI library do you use? Did you compile it with clang instead of
icc?
Thanks,
Marcos
From: Samar Khatiwala
Sent: Monday, May 15, 2023 1:05 PM
To: Matthew Knepley
Cc: Vanella, Marcos (Fed) ; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users]
Hi, for what it’s worth, clang + ifort from OneAPI 2023 update 1 works fine for
me on both Intel and M2 Macs. So it might just be a matter of upgrading.
Samar
On May 15, 2023, at 5:53 PM, Matthew Knepley wrote:
Send us
$PETSC_ARCH/include/petscconf.h
Thanks,
Matt
On Mon, May 15,
Hi Matt, attached is the file.
Thanks!
Marcos
From: Matthew Knepley
Sent: Monday, May 15, 2023 12:53 PM
To: Vanella, Marcos (Fed)
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and
OpenMPI
Send us
Hi Satish, yes, the -m64 flag tells the compilers the target CPU is Intel 64.
The only reason I'm trying to get PETSc working with Intel is that the bundles
for the software we release use Intel compilers for Linux, Mac and Windows
(OneAPI Intel MPI for Linux and Windows, OpenMPI compiled with
Send us
$PETSC_ARCH/include/petscconf.h
Thanks,
Matt
On Mon, May 15, 2023 at 12:49 PM Vanella, Marcos (Fed) <
marcos.vane...@nist.gov> wrote:
> Hi Matt, I configured the lib like this:
>
> $ ./configure --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1
> --with-debugging=0
Hi Matt, I configured the lib like this:
$ ./configure --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1
--with-debugging=0 --with-shared-libraries=0 --download-make
and compiled. I still get some check segfault error. See below:
$ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
Oops - for some reason I assumed this build was on a Mac M1 [likely due to the
usage of '-m64' - that was strange].
But yeah - our general usage on Mac is with Xcode/clang and brew gfortran (on
both Intel and ARM CPUs) - and unless you need Intel compilers for specific
needs - clang/gfortran
Hi Satish, well, it turns out this is not an M1 Mac, it is an older Intel Mac
(2019).
I'm trying to get a local computer set up for development and tests, but I also
have access to Linux clusters with GPUs, which we plan to move to next.
Thanks for the suggestion, I might also try compiling a gcc/gfortran
Thank you Matt I'll try this and let you know.
Marcos
From: Matthew Knepley
Sent: Monday, May 15, 2023 12:08 PM
To: Vanella, Marcos (Fed)
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and
OpenMPI
On Mon, May
I see the Intel compilers here are building x86_64 binaries - that get run on
the Arm M1 CPU - perhaps there are issues with this mode of usage.
> I'm starting to work with PETSc. Our plan is to use the linear solver from
> PETSc for the Poisson equation on our numerical scheme and test this
On Mon, May 15, 2023 at 11:19 AM Vanella, Marcos (Fed) via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI
> 4.1.4 and the OneAPI 2022 Update 2 Intel Compiler suite on a Mac with OSX
> Ventura 13.3.1.
> I can compile PETSc
On Mon, May 15, 2023 at 9:55 AM Zongze Yang wrote:
> On Mon, 15 May 2023 at 17:24, Matthew Knepley wrote:
>
>> On Sun, May 14, 2023 at 7:23 PM Zongze Yang wrote:
>>
>>> Could you try to project the coordinates into the continuity space by
>>> enabling the option
>>>
Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI
4.1.4 and the OneAPI 2022 Update 2 Intel Compiler suite on a Mac with OSX
Ventura 13.3.1.
I can compile PETSc in debug mode with these configure and make lines. I can
run the PETSc tests, which seem fine.
When I compile
BDDC is a domain decomposition solver of the non-overlapping type and
cannot be used on assembled operators.
If you want to use it, you need to restructure your code a bit.
I presume from your message that your current approach is
1) generate_assembled_csr
2) decompose_csr? or decompose_mesh?
3)
Hello sir,
I am a PhD student trying to use the PCBDDC method in PETSc to solve my
matrix, but the final result is wrong, so I would like to ask you a few
questions.
First I will describe the flow of my code: I first used the finite element
method to build the total matrix in CSR format
On Mon, 15 May 2023 at 17:24, Matthew Knepley wrote:
> On Sun, May 14, 2023 at 7:23 PM Zongze Yang wrote:
>
>> Could you try to project the coordinates into the continuity space by
>> enabling the option
>> `-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true`?
>>
>
> There is a
On Mon, May 15, 2023 at 9:30 AM Jed Brown wrote:
> Matthew Knepley writes:
>
> > On Fri, May 5, 2023 at 10:55 AM Vilmer Dahlberg via petsc-users <
> > petsc-users@mcs.anl.gov> wrote:
> >
> >> Hi.
> >>
> >>
> >> I'm trying to read a mesh of higher element order, in this example a mesh
> >>
Matthew Knepley writes:
> On Fri, May 5, 2023 at 10:55 AM Vilmer Dahlberg via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Hi.
>>
>>
>> I'm trying to read a mesh of higher element order, in this example a mesh
>> consisting of 10-node tetrahedral elements, from gmsh, into PETSc. But it
On Fri, May 5, 2023 at 10:55 AM Vilmer Dahlberg via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Hi.
>
>
> I'm trying to read a mesh of higher element order, in this example a mesh
> consisting of 10-node tetrahedral elements, from gmsh, into PETSc. But it
> looks like the mesh is not properly
On Sun, May 14, 2023 at 7:23 PM Zongze Yang wrote:
> Could you try to project the coordinates into the continuity space by
> enabling the option
> `-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true`?
>
There is a comment in the code about that:
/* XXX FIXME Requires