And to get a cleaner list, specify the (appropriate) compiler libraries via the
LIBS option.
balay@pj01:~/petsc$ mpif90 -show
gfortran -I/software/mpich-4.1.1/include -I/software/mpich-4.1.1/include
-L/software/mpich-4.1.1/lib -lmpifort -Wl,-rpath -Wl,/software/mpich-4.1.1/lib
-Wl,--enable-new-dtags
You can try using the latest PETSc version [3.21] - the list should be a bit
cleaner with it.
balay@pj01:~/petsc$ ./configure --download-mpich --download-fblaslapack
--download-hdf5 --with-hdf5-fortran-bindings --download-metis
--download-parmetis COPTFLAGS=-O3 CXXOPTFLAGS=-O3 FOPTFLAGS=-O3
-
Dear petsc-users,
I am playing with DMPlexGetCellCoordinates and observing that it
returns correct periodic coordinates for cells, but not for faces.
More precisely, adding
PetscCall(DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd));
for (f = fStart; f < fEnd; ++f) {
const PetscSca
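For reference, a minimal sketch of how such a loop might look, assuming the truncated message goes on to call DMPlexGetCellCoordinates on each face point (the setup of dm is assumed to happen elsewhere; this is an illustration, not the poster's actual code):

```c
#include <petscdmplex.h>

/* Sketch: query (possibly periodic) coordinates of each height-1 (face)
   point of a DMPlex. Assumes dm is a fully set-up periodic mesh. */
static PetscErrorCode PrintFaceCoordinates(DM dm)
{
  PetscInt fStart, fEnd, f;

  PetscFunctionBeginUser;
  PetscCall(DMPlexGetHeightStratum(dm, 1, &fStart, &fEnd));
  for (f = fStart; f < fEnd; ++f) {
    const PetscScalar *array;
    PetscScalar       *coords;
    PetscInt           numCoords;
    PetscBool          isDG;

    PetscCall(DMPlexGetCellCoordinates(dm, f, &isDG, &numCoords, &array, &coords));
    /* ... inspect coords[0 .. numCoords-1] here ... */
    PetscCall(DMPlexRestoreCellCoordinates(dm, f, &isDG, &numCoords, &array, &coords));
  }
  PetscFunctionReturn(PETSC_SUCCESS);
}
```

DMPlexGetCellCoordinates is documented for cells; whether it localizes periodic coordinates for face points as well is exactly the question raised here.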
It is not always safe to remove duplicate libraries listed at different
places in the list; hence we cannot simply always remove them.
Barry
> On May 13, 2024, at 10:00 PM, Runjian Wu wrote:
>
> Thanks for your reply! Since I can manually remove duplicates, how about
> adding a function
It errors in C because the argument is labeled with const, but there does not
seem to be a way in Fortran to indicate that an array is read-only.
> On May 13, 2024, at 10:21 PM, Runjian Wu wrote:
>
> I only know the "intent(in)" attribute for the dummy arguments.
>
> In the counter function V
First, a clarification: NO issues occurred. I am just curious about the
duplicated libraries while compiling PFLOTRAN with PETSc.
PETSc version: 3.16.2
Configure Options: --configModules=PETSc.Configure
--optionsModule=config.compilerOptions --with-make-np=4 --with-cc=mpicc
--with-cxx=mpicxx --with-
On 14/05/24 1:44 pm, Matthew Knepley wrote:
I wish Gmsh were clearer about what is optional:
https://gmsh.info/doc/texinfo/gmsh.html#MSH-file-format
Thanks for your reply! Since I can manually remove duplicates, how about
adding a function to automatically remove duplicates at the end of
"configure" in the next PETSc version?
Runjian
On 5/13/2024 9:31 PM, Barry Smith wrote:
Because the order of the libraries can be important, it is dif
On Mon, May 13, 2024 at 9:33 PM Adrian Croucher
wrote:
> hi, We often create meshes in GMSH format using the meshio library. This
> works OK if we stick to GMSH file format 2.2. If we use GMSH file format
> 4.1, DMPlex can't read them because it expects the "Entities" section to
> be present: [
hi, We often create meshes in GMSH format using the meshio library. This works OK if we stick to GMSH file format 2.2. If we use GMSH file format 4.1, DMPlex can't read them because it expects the "Entities" section to be present: [0]PETSC
Depending on your MPI, mpiexec is not needed, so
compute-sanitizer --tool memcheck --leak-check full ./a.out args
may work
> On May 13, 2024, at 8:16 PM, Sreeram R Venkat wrote:
>
> I am trying t
I am trying to check my program for GPU memory leaks with the
compute-sanitizer tool. If I run my application with:
mpiexec -n 1 compute-sanitizer --tool memcheck --leak-check full ./a.out
args
I get the message:
Error: No attachable process found. compute-sanitizer timed-out.
Adding --target-pro
I also just ran with cupy.linalg.eigvalsh (which wraps cuSOLVER), and it
only took 3.1 seconds. I will probably use this, but it is good to know
about the SLEPc cases if I don't need the full spectrum or have sparse
matrices, etc.
Thanks,
Sreeram
On Mon, May 13, 2024 at 2:13 PM Sreeram R Venkat
Apologies, I accidentally hit "reply" instead of "reply-all."
Thank you for the reference. Actually, I just tested that N ~ 1e4 case
where I had saved the dense matrix to a python-readable format. Using
scipy.linalg.eigvalsh, I got the eigenvalues in ~1.5 minutes. They agree
with the ones I got fr
Please respond to the list. The mpd parameter means "maximum projected dimension". You can think of the projected problem as the "sequential" part of the computation that is not parallelized (a "small" dense eigenproblem). When you run with
Thanks a lot. That is the only issue.
/usr/lib64/openmpi/include doesn't exist.
The configuration can be done without issue with the 2nd method. But when I
ran make, the following warning appeared. I will study your link.
Thanks,
f951: Warning: Nonexistent include directory ‘/usr/lib64/openmpi/inclu
Computing the full spectrum is always an unpleasant task. But if you cannot avoid it, I would suggest that you compute the eigenvalues in two runs: n/2 largest real eigenvalues and n/2 smallest real. If your matrix-vector product is cheap,
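The two-run approach suggested above can be sketched with the SLEPc EPS interface; the helper name and the choice of nev = N/2 are illustrative, and the which argument selects EPS_LARGEST_REAL or EPS_SMALLEST_REAL for the two runs:

```c
#include <slepceps.h>

/* Sketch: compute one half of the spectrum of a symmetric (shell)
   operator A in a single solve. Call twice: once with EPS_LARGEST_REAL
   and once with EPS_SMALLEST_REAL, each with nev = N/2. */
static PetscErrorCode HalfSpectrum(Mat A, PetscInt nev, EPSWhich which)
{
  EPS eps;

  PetscFunctionBeginUser;
  PetscCall(EPSCreate(PetscObjectComm((PetscObject)A), &eps));
  PetscCall(EPSSetOperators(eps, A, NULL));
  PetscCall(EPSSetProblemType(eps, EPS_HEP));        /* Hermitian problem */
  PetscCall(EPSSetWhichEigenpairs(eps, which));
  PetscCall(EPSSetDimensions(eps, nev, PETSC_DEFAULT, PETSC_DEFAULT));
  PetscCall(EPSSetFromOptions(eps));
  PetscCall(EPSSolve(eps));
  /* ... retrieve eigenvalues with EPSGetEigenvalue ... */
  PetscCall(EPSDestroy(&eps));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```

The last two arguments of EPSSetDimensions are ncv and mpd, the parameter discussed above; leaving them at PETSC_DEFAULT lets SLEPc choose, but for large nev setting mpd explicitly bounds the size of the projected problem.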
On Mon, May 13, 2024 at 1:40 PM Sreeram R Venkat
wrote:
> I have a MatShell object that computes matrix-vector products of a dense
> symmetric matrix of size NxN. The MatShell does not actually form the dense
> matrix, so it is never in memory/storage. For my application, N ranges from
> 1e4 to 1
> Includes: -I/usr/lib64/openmpi/include (*This is not the right include
> directory*)
So this is the only issue? Does this dir not exist? If so, you can try the
following fix:
https://gitlab.com/petsc/petsc/-/merge_requests/7545
I have a MatShell object that computes matrix-vector products of a dense
symmetric matrix of size NxN. The MatShell does not actually form the dense
matrix, so it is never in memory/storage. For my application, N ranges from
1e4 to 1e5.
I want to compute the full spectrum of this matrix. For an ex
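For context, a matrix-free operator of this kind is typically set up along these lines; MyMult and CreateShell are illustrative names, and only MatCreateShell / MatShellSetOperation / MatShellGetContext are actual PETSc API:

```c
#include <petscmat.h>

/* Hypothetical user callback: apply y = A*x without ever forming A. */
static PetscErrorCode MyMult(Mat A, Vec x, Vec y)
{
  void *ctx;

  PetscFunctionBeginUser;
  PetscCall(MatShellGetContext(A, &ctx));
  /* ... compute y = A*x using the data in ctx ... */
  PetscFunctionReturn(PETSC_SUCCESS);
}

/* Create an NxN shell matrix whose action is MyMult. */
static PetscErrorCode CreateShell(MPI_Comm comm, PetscInt N, void *ctx, Mat *A)
{
  PetscFunctionBeginUser;
  PetscCall(MatCreateShell(comm, PETSC_DECIDE, PETSC_DECIDE, N, N, ctx, A));
  PetscCall(MatShellSetOperation(*A, MATOP_MULT, (void (*)(void))MyMult));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```

Since only the action of the matrix is available, any eigensolver applied to it must be matrix-free as well, which is why the SLEPc suggestions below apply.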
On Mon, 13 May 2024, neil liu wrote:
> I also tried the 2nd way, it didn't work.
The attached configure.log is from a successful run.
Configure Options: --configModules=PETSc.Configure
--optionsModule=config.compilerOptions --download-fblaslapack
--with-mpi-dir=/usr/lib64/openmpi
Compilers:
C Co
If you are using mpicc/mpif90 as compilers from your pre-installed MPI - you
don't need to list the --with-mpi-include, --with-mpi-lib options.
As mentioned - you can do this either with:
[if you have them in PATH] --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
or:
--with-cc=MPI-DIR/bin/mpicc -
Thanks. I want to use the preinstalled OpenMPI to configure PETSc. For the
preinstalled OpenMPI, the lib and include dirs are in different places.
Then how should I configure with this preinstalled OpenMPI?
Thanks,
$ mpicc --show
gcc -I/usr/include/openmpi-x86_64 -Wl,-rpath -Wl,/usr/lib64/openmpi/
You are misinterpreting the --with-mpi-include and --with-mpi-lib options.
Any particular reason you want to use these options instead of the MPI compilers?
>>>
balay@p1 /home/balay
$ mpicc -show
gcc -I/home/balay/soft/mpich-4.0.1/include -L/home/balay/soft/mpich-4.0.1/lib
-Wl,-rpath -Wl,/home/balay/so
On Mon, 13 May 2024, neil liu wrote:
> Dear PETSc developers,
>
> I am trying to install PETSc with a preinstalled OpenMPI.
>
> ./configure --download-fblaslapack --with-mpi-dir=/usr/lib64/openmpi
--with-mpi-dir=DIR is a bit unique [wrt other pkg-dir options].
It means:
--with-cc=DIR/bin/mp
Dear PETSc developers,
I am trying to install PETSc with a preinstalled OpenMPI.
./configure --download-fblaslapack --with-mpi-dir=/usr/lib64/openmpi
--with-mpi-incdir=/usr/include/openmpi-x86_64
But the final information shows,
MPI:
Version: 3
Includes: -I/usr/lib64/openmpi/include (
I couldn't find a way in Fortran to declare an array as read-only. Is there
such support?
Barry
> On May 13, 2024, at 7:28 AM, Runjian Wu wrote:
>
> This Message Is From an External Sender
> This message came from outside your organization.
> Hi all,
>
> I have a question about VecGet
What version of PETSc? What configure command? What do you have for
PETSC_EXTERNAL_LIB_BASIC?
You can send configure.log for your build to petsc-maint
Generally duplicates should not cause grief [as one needs them to overcome
circular dependencies].
What issues are you seeing? [send relevant
Because the order of the libraries can be important, it is difficult for
./configure to remove unneeded duplicates automatically.
You can manually remove duplicates by editing
$PETSC_ARCH/lib/petsc/conf/petscvariables after running ./configure
Barry
> On May 13, 2024, at 7:47 AM, R
Hi all,
I have a question about VecGetArrayReadF90(...). If I use
VecGetArrayReadF90(...), I can still write entries into the array, just like
with VecGetArrayF90(...). Is it possible to report an error during the
compile process?
Thanks,
Runjian Wu
Hi all,
After I compiled PETSc, I found some duplicated libs in the variable
PETSC_EXTERNAL_LIB_BASIC, e.g., -lm, -lgfortran, -lstdc++. I am curious how
this happened and how to remove the duplicates.
Thanks,
Runjian Wu
On Sun, May 12, 2024 at 10:42 PM Adrian Croucher
wrote:
> hi Matt,
> On 11/05/24 4:12 am, Matthew Knepley wrote:
>
> Thanks. I tried it out and the error below was raised; it looks like it
> happens when it tries to destroy the viewer. It still runs OK when
> DMGetOutputSequenceLength() isn't called.
>>