[petsc-users] Agglomeration for Multigrid on Unstructured Meshes

2020-06-01 Thread Danyang Su
Dear All, I recalled there was a presentation ‘Extreme-scale multigrid components with PETSc’ talking about agglomeration in parallel multigrid, with a future plan to extend support to unstructured meshes. Is this under development or still to be added? Thanks and regards, Danyang

Re: [petsc-users] Agglomeration for Multigrid on Unstructured Meshes

2020-06-01 Thread Danyang Su
ginal global problem or just on the "coarse" problem of an initial geometric hierarchy. Danyang Su writes: > Dear All, > > > > I recalled there was a presentation ‘Extreme-scale multigrid components with PETSc’ talking about agglomeratio

[petsc-users] Bug in ex14f.F90 when debug flags are used?

2020-06-05 Thread Danyang Su
Hi All, I have a question regarding the following example. https://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/tutorials/ex14f.F90.html When debug flags are used in the makefile, the code crashes with the following error. At line 335 of file ex14f.F90 Fortran runtime error: Inde

Re: [petsc-users] Bug in ex14f.F90 when debug flags are used?

2020-06-05 Thread Danyang Su
xed to use VecGetArrayF90].. Satish On Fri, 5 Jun 2020, Danyang Su wrote: > Hi All, > > > > I have a question regarding the following example. > > https://www.mcs.anl.gov/petsc/petsc-current/src/k

[petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-10 Thread Danyang Su
Hi All, After updating to HDF5-1.12.0, I ran into a problem when some processors have no data to write, or do not need to write. Since parallel writing is collective, I cannot simply have those processors skip the write. With the old version, there was no such problem. So far, the problem only occur
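A common workaround for this situation is to keep the write collective on every rank but select zero elements on the ranks with nothing to contribute. The sketch below assumes the dataset, dataspaces, and MPI-IO transfer property list already exist; names such as filespace, memspace, dset_id, and plist_id are illustrative, not taken from the original code.

```fortran
! Sketch: collective HDF5 write where some ranks contribute no data.
if (nlocal > 0) then
   call h5sselect_hyperslab_f(filespace, H5S_SELECT_SET_F, offset, count, hdferr)
else
   ! Empty selection: the rank still participates in the collective call.
   call h5sselect_none_f(filespace, hdferr)
   call h5sselect_none_f(memspace, hdferr)
end if
! Pass a buffer of size at least 1 even on empty ranks, since some
! HDF5-1.12 Fortran wrappers reject zero-sized arrays under bounds checking.
call h5dwrite_f(dset_id, H5T_NATIVE_DOUBLE, buf, dims, hdferr, &
                file_space_id=filespace, mem_space_id=memspace, xfer_prp=plist_id)
```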

[petsc-users] FW: Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-10 Thread Danyang Su
Hi All, Sorry for accidentally sending the previous incomplete email. After updating to HDF5-1.12.0, I ran into a problem when some processors have no data to write, or do not need to write. Since parallel writing is collective, I cannot simply have those processors skip the write. With the old versio

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-11 Thread Danyang Su
g HDF5 calls that fail or is it PETSc routines calling >HDF5 that fail? > >Regardless it sounds like the easiest fix is to switch back to the >previous HDF5 and wait for HDF5 to fix what sounds to be a bug. > > Barry > > >> On Jun 11, 2020, at 1:05 AM, Danyang Su wr

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-11 Thread Danyang Su
pace,& mem_space_id=memspace, xfer_prp = xlist_id) Please let me know if there is something wrong in the code that causes the problem. Thanks, Danyang On 2020-06-11, 8:32 PM, "Jed Brown" wrote: Danyang Su writes: > Hi Barry, > > The HDF5 calls fail. I reconf

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-12 Thread Danyang Su
4 + HDF5-1.10.x -> Works fine Centos-7 + Intel2018 + HDF5-1.12.0 -> Works fine Possible error when the code crashes: At line 6686 of file H5_gen.F90 Fortran runtime error: Index '1' of dimension 1 of array 'buf' above upper bound of 0 Thanks, Danyang On 2020-06-1

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-13 Thread Danyang Su
, "Jed Brown" wrote: Can you reproduce in C? You're missing three extra arguments that exist in the Fortran interface. https://support.hdfgroup.org/HDF5/doc/RM/RM_H5D.html#Dataset-Create Danyang Su writes: > Hi Jed, > > Attached is t

Re: [petsc-users] Quite different behaviours of PETSc solver on different clusters

2020-10-29 Thread Danyang Su
Hi Matt, No, iterations from both the linear and nonlinear solvers are similar. The system administrator suspects that latency in MPICH makes the difference. We will test a PETSc build with OpenMPI on that cluster to check whether it makes a difference. Thanks, Danyang On October 29, 2020 6:05:53

[petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-10 Thread Danyang Su
Dear PETSc developers and users, I am trying to install the latest PETSc version on an ancient cluster. The OpenMPI version is 1.6.5 and the compiler is Intel 14.0, the newest available on that cluster. I have no problem installing PETSc up to version 3.12.5. However, if I try to use PETSc 3.13+

Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-10 Thread Danyang Su
20 -0700 ======== Thanks, Danyang From: Barry Smith Date: Saturday, April 10, 2021 at 10:31 AM To: Danyang Su Cc: "petsc-users@mcs.anl.gov" Subject: Re: [petsc-users] Undefined reference in PETSc 3.13+

Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-10 Thread Danyang Su
Hi Junchao, Thanks. I will test this branch and get back to you later. All the best, Danyang From: Junchao Zhang Date: Saturday, April 10, 2021 at 3:32 PM To: Danyang Su Cc: Barry Smith , "petsc-users@mcs.anl.gov" Subject: Re: [petsc-users] Undefined reference in

Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-10 Thread Danyang Su
te_dynamic' gmake[4]: *** [ex5f] Error 1 Thanks, Danyang From: Junchao Zhang Date: Saturday, April 10, 2021 at 3:57 PM To: Danyang Su Cc: Barry Smith , "petsc-users@mcs.anl.gov" Subject: Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version You sent

Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-10 Thread Danyang Su
2.1.6 version and it seems working properly. Thanks and have a good rest of the weekend, Danyang From: Danyang Su Date: Saturday, April 10, 2021 at 4:08 PM To: Junchao Zhang Cc: Barry Smith , "petsc-users@mcs.anl.gov" Subject: Re: [petsc-users] Undefined reference in PETSc

Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-11 Thread Danyang Su
newer MPI version. /home/danyangs/soft/petsc/petsc-3.14.6/intel-14.0.2-openmpi-1.6.5/lib/libpetsc.so: undefined reference to `MPI_Iallreduce' Thanks again for all your help, Danyang From: Junchao Zhang Date: Sunday, April 11, 2021 at 7:54 AM To: Danyang Su Cc: Barry Smith , &

[petsc-users] Is old ex10.c (separated matrix and rhs) deprecated?

2022-01-10 Thread Danyang Su
Hi All, Back in the PETSc-3.8 era, the example ex10.c supported reading the matrix and vector from separate files. Is this feature deprecated in the new PETSc version? I have some matrices and rhs vectors to test but could not use the ex10 example under the new PETSc version. Thanks, Danyang

[petsc-users] PETSc configuration error on macOS Monterey with Intel oneAPI

2022-01-12 Thread Danyang Su
Hi All, I got an error in PETSc configuration on macOS Monterey with Intel oneAPI using the following options: ./configure --with-cc=icc --with-cxx=icpc --with-fc=ifort --with-blas-lapack-dir=/opt/intel/oneapi/mkl/2022.0.0/lib/ --with-debugging=1 PETSC_ARCH=macos-intel-dbg --download-mumps

Re: [petsc-users] PETSc configuration error on macOS Monterey with Intel oneAPI

2022-01-12 Thread Danyang Su
PM To: Danyang Su Cc: PETSc Subject: Re: [petsc-users] PETSc configuration error on macOS Monterey with Intel oneAPI Hi Danyang, I had trouble configuring PETSc on MacOS Monterey with ifort when using mpich (which I was building myself). I tracked it down to an errant "-Wl,-flat

Re: [petsc-users] PETSc configuration error on macOS Monterey with Intel oneAPI

2022-01-13 Thread Danyang Su
Hi Samar, Yes, with mpich, there is no such error. I will just use this configuration for now. Thanks, Danyang From: Samar Khatiwala Date: Thursday, January 13, 2022 at 1:16 AM To: Danyang Su Cc: PETSc Subject: Re: [petsc-users] PETSc configuration error on macOS Monterey with

[petsc-users] Fortran HDF5 Cannot be found in PETSc-3.16

2022-02-28 Thread Danyang Su
Hi All, Has anyone encountered the problem where HDF5-related Fortran code cannot be compiled in PETSc-3.16 because 'use hdf5' cannot find the required module file? Compared to HDF5-1.12.0 and earlier versions, some module/object files (e.g., hdf5.mod, hdf5.o) are missing from HDF5-1.12.1 in PETSc-3.16. I

Re: [petsc-users] Fortran HDF5 Cannot be found in PETSc-3.16

2022-02-28 Thread Danyang Su
Thanks, Barry. It works now. Danyang On 2022-02-28 9:59 a.m., Barry Smith wrote: You need the additional configure option --download-hdf5-fortran-bindings Please make sure you have the latest 3.16.4 Barry On Feb 28, 2022, at 12:42 PM, Danyang Su wrote: Hi All, Does anyone

[petsc-users] Question on incomplete factorization level and fill

2017-05-24 Thread Danyang Su
Dear All, I use PCFactorSetLevels for ILU and PCFactorSetFill for other preconditioners in my code to help solve problems that the default options struggle with. However, I found that the latter, PCFactorSetFill, does not take effect for my problem. The matrices and rhs as well as the sol
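For reference, a minimal sketch of how these two calls are typically wired up (illustrative names; ksp is assumed to be an already created KSP with its operators set):

```fortran
! Sketch: request ILU(k) with a fill estimate. Note that PCFactorSetFill
! is only a hint used to preallocate the factor; the sparsity pattern of
! an ILU factorization is controlled by the level argument, which may be
! why changing the fill alone appears to have no effect.
call KSPGetPC(ksp, pc, ierr)
call PCSetType(pc, PCILU, ierr)
call PCFactorSetLevels(pc, 2, ierr)      ! ILU(2)
call PCFactorSetFill(pc, 5.0d0, ierr)    ! expected fill ratio (a hint)
call KSPSetFromOptions(ksp, ierr)        ! command-line options can still override
```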

Re: [petsc-users] Question on incomplete factorization level and fill

2017-05-24 Thread Danyang Su
the factorization level through hypre? Thanks, Danyang On 17-05-24 04:59 AM, Matthew Knepley wrote: On Wed, May 24, 2017 at 2:21 AM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Dear All, I use PCFactorSetLevels for ILU and PCFactorSetFill for other preconditioning i

Re: [petsc-users] Question on incomplete factorization level and fill

2017-05-24 Thread Danyang Su
with pretty good speedup. And I am not sure if I miss something for this problem. Thanks, Danyang On 17-05-24 11:12 AM, Matthew Knepley wrote: On Wed, May 24, 2017 at 12:50 PM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Hi Matthew and Barry, Thanks for the quick respons

Re: [petsc-users] Question on incomplete factorization level and fill

2017-05-24 Thread Danyang Su
16 or 48 processors. The error information is attached. I tested this on my local computer with 6 cores 12 threads. Any suggestion on this? Thanks, Danyang On 17-05-24 12:28 PM, Danyang Su wrote: Hi Hong, Awesome. Thanks for testing the case. I will try your options for the code and g

Re: [petsc-users] Question on incomplete factorization level and fill

2017-05-24 Thread Danyang Su
Hi All, I just deleted the .info file and it works without problem now. Thanks, Danyang On 17-05-24 06:32 PM, Hong wrote: Remove your option '-vecload_block_size 10'. Hong On Wed, May 24, 2017 at 3:06 PM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Dear Hong,

[petsc-users] PCFactorSetShiftType does not work in code but -pc_factor_set_shift_type works

2017-05-25 Thread Danyang Su
your option '-vecload_block_size 10'. Hong On Wed, May 24, 2017 at 3:06 PM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Dear Hong, I just tested with different number of processors for the same matrix. It sometimes got "ERROR: Arguments are incompatible&

Re: [petsc-users] PCFactorSetShiftType does not work in code but -pc_factor_set_shift_type works

2017-05-25 Thread Danyang Su
(i=0; i Remove your option '-vecload_block_size 10'. Hong On Wed, May 24, 2017 at 3:06 PM, Danyang Su mailto:danyang...@gmail.com>> wrote: Dear Hong, I just tested with different number of processors for the same matrix. It sometimes got "ERROR: Argum
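A sketch of setting the shift in code (illustrative names). One common reason the command-line option works while the call in code does not: with a composite preconditioner such as block Jacobi, the factorization lives in the sub-PCs, and -pc_factor_shift_type propagates to them through the options database, whereas a call on the outer PC does not reach them.

```fortran
! Sketch: on a plain ILU/LU PC the shift can be set directly on the PC.
! With PCBJACOBI, the same calls must instead be made on each sub-PC
! (obtained via PCBJacobiGetSubKSP after KSPSetUp).
call KSPGetPC(ksp, pc, ierr)
call PCFactorSetShiftType(pc, MAT_SHIFT_NONZERO, ierr)
call PCFactorSetShiftAmount(pc, 1.0d-10, ierr)
```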

Re: [petsc-users] Is OpenMP still available for PETSc?

2017-06-30 Thread Danyang Su
PM, Danyang Su wrote: Dear All, I recalled there was OpenMP available for PETSc for the old development version. When google "petsc hybrid mpi openmp", there returned some papers about this feature. My code was first parallelized using OpenMP and then redeveloped using PETSc, with O

Re: [petsc-users] Petsc with OpenMP

2017-08-23 Thread Danyang Su
implemented PETSc. You can still use PETSc DMDA to do domain decomposition but just replace the linear solver for the hybrid version. Regards, Danyang On 17-08-23 03:42 AM, Khai Pham wrote: Hi All, I searched for the information about Petsc with OpenMP support. I came cross Danyang Su question a

Re: [petsc-users] Error running configure on HDF5 in PETSc-3.18.3

2023-01-06 Thread Danyang Su
environment activated. However, since my code requires the Fortran interface to HDF5, I do need ‘--with-hdf5-fortran-bindings’; otherwise, my code cannot be compiled. Any other suggestions? Thanks, Danyang From: Pierre Jolivet Date: Friday, January 6, 2023 at 7:59 AM To: Danyang Su Cc

Re: [petsc-users] Error running configure on HDF5 in PETSc-3.18.3

2023-01-06 Thread Danyang Su
>>>> Executing: gfortran --version stdout: GNU Fortran (GCC) 8.2.0 <<<< We generally use brew gfortran - and that works with hdf5 aswell balay@ypro ~ % gfortran --version GNU Fortran (Homebrew GCC 11.2.0_1) 11.2.0 Satish On Fri, 6 Jan 2023, Danyang Su wrote: > Hi

Re: [petsc-users] Cmake problem on an old cluster

2023-01-19 Thread Danyang Su
g sourced during the build? Perhaps you could start with a fresh copy of petsc and retry? Also suggest using 'arch-' prefix for PETSC_ARCH i.e 'arch-intel-14.0.2-openmpi-1.6.5' - just in case there are some bugs lurking with skipping build files in this location Satish On

Re: [petsc-users] Cmake problem on an old cluster

2023-01-19 Thread Danyang Su
riable that's set. Glad you have a working build now. Thanks for the update! BTW: superlu-dist requires cmake 3.18.1 or higher. You could check if this older version of cmake builds on this cluster [if you want to give superlu-dist a try again] Satish On Thu, 19 Jan 2023,

Re: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox?

2023-03-04 Thread Danyang Su
Hi Matt, Attached are the source code and example. I have deleted most of the unused source code but it is still a bit lengthy. Sorry about that. The errors come after DMGetLocalBoundingBox and DMGetBoundingBox. -> To compile the code, please type 'make exe' and the executable file petsc_bo

Re: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox?

2023-04-29 Thread Danyang Su
etsc-users] PETSC ERROR in DMGetLocalBoundingBox? Hi Matt, I am following up to check if you can reproduce the problem on your side. Thanks and have a great weekend, Danyang From: Danyang Su Sent: March 4, 2023 4:38 PM To: Matthew Knepley Cc: petsc-users@mcs.anl.gov Subject:

Re: [petsc-users] Fortran preprocessor not work in pets-dev

2023-05-07 Thread Danyang Su
Balay" mailto:ba...@mcs.anl.gov>> wrote: On Sat, 6 May 2023, Danyang Su wrote: > Hi All, > > > > My code has some FPP. It works fine in PETSc 3.18 and earlier version, but > stops working in the latest PETSc-Dev. For example the following FPP > STANDARD_FORTRAN

Re: [petsc-users] Fortran preprocessor not work in pets-dev

2023-05-07 Thread Danyang Su
from: FPPFLAGS = include ${PETSC_DIR}/lib/petsc/conf/variables include ${PETSC_DIR}/lib/petsc/conf/rules to include ${PETSC_DIR}/lib/petsc/conf/variables include ${PETSC_DIR}/lib/petsc/conf/rules FPPFLAGS = But this is fixed in latest release and main branches. Satish On Sun, 7 May 2023

[petsc-users] Question on ISLocalToGlobalMappingGetIndices Fortran Interface

2023-05-08 Thread Danyang Su
Dear PETSc-Users, Are there any changes to the ISLocalToGlobalMappingGetIndices function after PETSc 3.17? In previous PETSc versions (<= 3.17), the call ‘ISLocalToGlobalMappingGetIndices(ltogm,ltog,idltog,ierr)’ works fine, even though the value of idltog looks out of bounds (-1147265

Re: [petsc-users] Question on ISLocalToGlobalMappingGetIndices Fortran Interface

2023-05-08 Thread Danyang Su
Thanks, Mark. Yes, it actually works when I update to ISLocalToGlobalMappingGetIndicesF90. I made a mistake reporting this does not work. Danyang From: Mark Adams Date: Monday, May 8, 2023 at 7:22 PM To: Danyang Su Cc: petsc-users Subject: Re: [petsc-users] Question on
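For readers hitting the same issue, the F90-style variant hands back a Fortran array pointer with correct bounds instead of an index into a work array. A minimal sketch (ltogm is assumed to be an existing ISLocalToGlobalMapping):

```fortran
! Sketch: F90 interface returns a properly bounded pointer.
PetscInt, pointer :: ltog(:)
call ISLocalToGlobalMappingGetIndicesF90(ltogm, ltog, ierr)
! ... use ltog(1), ltog(2), ... here ...
call ISLocalToGlobalMappingRestoreIndicesF90(ltogm, ltog, ierr)
```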

Re: [petsc-users] Domain decomposition using DMPLEX

2019-11-26 Thread Danyang Su
On 2019-11-25 7:54 p.m., Matthew Knepley wrote: On Mon, Nov 25, 2019 at 6:25 PM Swarnava Ghosh > wrote: Dear PETSc users and developers, I am working with dmplex to distribute a 3D unstructured mesh made of tetrahedrons in a cuboidal domain. I had a few

Re: [petsc-users] Domain decomposition using DMPLEX

2019-11-26 Thread Danyang Su
On 2019-11-26 10:18 a.m., Matthew Knepley wrote: On Tue, Nov 26, 2019 at 11:43 AM Danyang Su <mailto:danyang...@gmail.com>> wrote: On 2019-11-25 7:54 p.m., Matthew Knepley wrote: On Mon, Nov 25, 2019 at 6:25 PM Swarnava Ghosh mailto:swarnav...@gmail.com>> wrote:

Re: [petsc-users] Error in PETSc configuration on Mac Sonoma

2024-06-11 Thread Danyang Su
7; from '/private/var/folders/jm/wcm4mv8s3v1gqz383tcf_4c0gp/T/ccqBB7yf.o' conflicts with definition from dylib '_mpifcmb5_' from '/Users/danyangsu/Soft/PETSc/petsc-3.20.5/macos-gnu-opt/lib/libmpifort.12.dylib' > > Perhaps someone has the fix. > > > On

[petsc-users] Add unstructured grid capability to existing structured grid code

2018-02-14 Thread Danyang Su
Dear All, I have a reactive transport code that was first developed using a structured grid and parallelized using PETSc. Both the sequential version (with or without PETSc) and the parallel version work fine. Recently I have finished the unstructured grid capability for the sequential version. Next st

Re: [petsc-users] Question on DMPlexCreateFromCellList and DMPlexCreateFromFile

2018-02-16 Thread Danyang Su
On 18-02-15 05:57 PM, Matthew Knepley wrote: On Thu, Feb 15, 2018 at 7:40 PM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Hi Matt, I have a question on DMPlexCreateFromCellList and DMPlexCreateFromFile. When use DMPlexCreateFromFile with Gmsh file input, it w

Re: [petsc-users] Question on DMPlexCreateFromCellList and DMPlexCreateFromFile

2018-02-16 Thread Danyang Su
On 18-02-16 10:13 AM, Matthew Knepley wrote: On Fri, Feb 16, 2018 at 11:36 AM, Danyang Su <mailto:danyang...@gmail.com>> wrote: On 18-02-15 05:57 PM, Matthew Knepley wrote: On Thu, Feb 15, 2018 at 7:40 PM, Danyang Su mailto:danyang...@gmail.com>> wrote: Hi M

[petsc-users] Error when use DMPlexGetVertexNumbering

2018-02-16 Thread Danyang Su
Hi Matt, I try to get the global vertex index and cell index from the local mesh and run into a problem. What I need is the local-to-global index (the original index used in DMPlexCreateFromCellList is best, as the user knows exactly where each node/cell is) for vertices and cells, which will be used to assi

Re: [petsc-users] Error when use DMPlexGetVertexNumbering

2018-02-16 Thread Danyang Su
On 18-02-16 10:50 AM, Matthew Knepley wrote: On Fri, Feb 16, 2018 at 1:45 PM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Hi Matt, I try to get the global vertex index and cell index from local mesh and run into problem. What I need is local to global index (the

[petsc-users] how to check if cell is local owned in DMPlex

2018-02-19 Thread Danyang Su
Hi Matt, Would you please let me know how to check whether a cell is locally owned? When overlap is 0 in DMPlexDistribute, all cells are locally owned. What about overlap > 0? It seems impossible to check by node, because a cell can be locally owned even if none of the nodes in that cell is local
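One way to answer this question (a sketch only, assuming the Fortran bindings for these routines are available in your PETSc version; dm and the loop bounds are illustrative) is via the global cell numbering, in which non-owned cells are encoded as negative values:

```fortran
! Sketch: DMPlexGetCellNumbering returns one global number per local cell;
! a negative entry encodes -(global+1) and marks a non-owned (ghost) cell.
IS :: cellIS
PetscInt, pointer :: cellNum(:)
call DMPlexGetCellNumbering(dm, cellIS, ierr)
call ISGetIndicesF90(cellIS, cellNum, ierr)
do c = 1, size(cellNum)
   if (cellNum(c) >= 0) then
      ! this cell is locally owned on this rank
   end if
end do
call ISRestoreIndicesF90(cellIS, cellNum, ierr)
```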

[petsc-users] Question on DMPlexCreateSection for Fortran

2018-02-20 Thread Danyang Su
Hi All, I tried to compile the DMPlexCreateSection code but got the error information shown below. Error: Symbol 'petsc_null_is' at (1) has no IMPLICIT type I tried to use PETSC_NULL_OBJECT instead of PETSC_NULL_IS; then the code can be compiled but runs into a Segmentation Violation error in D

Re: [petsc-users] Question on DMPlexCreateSection for Fortran

2018-02-20 Thread Danyang Su
On 18-02-20 09:52 AM, Matthew Knepley wrote: On Tue, Feb 20, 2018 at 12:30 PM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Hi All, I tried to compile the DMPlexCreateSection code but got error information as shown below. Error: Symbol 'petsc_null_is&

Re: [petsc-users] Question on DMPlexCreateSection for Fortran

2018-02-21 Thread Danyang Su
em. My code is done in a similar way as ex1f90, it reads mesh from external file or creates from cell list, distributes the mesh (these already work), and then creates sections and sets ndof to the nodes. Thanks, Danyang On 18-02-20 10:07 AM, Danyang Su wrote: On 18-02-20 09:52 AM, Matthew Knep

Re: [petsc-users] Question on DMPlexCreateSection for Fortran

2018-02-22 Thread Danyang Su
RQ(ierr) Thanks, Danyang On 18-02-21 09:22 AM, Danyang Su wrote: Hi Matt, To test the Segmentation Violation problem in my code, I modified the example ex1f90.F to reproduce the problem I have in my own code. If use DMPlexCreateBoxMesh to generate the mesh, the code works fine. H

[petsc-users] Cell type for DMPlexCreateFromCellList

2018-02-22 Thread Danyang Su
Hi All, What cell types does DMPlexCreateFromCellList support? I tested it with triangle, tetrahedron, and prism meshes. Both triangle and tetrahedron work, but the prism mesh throws an error saying "Cone size 6 not supported for dimension 3". Could anyone tell me all the supported cell types? Thanks, Dany

Re: [petsc-users] Cell type for DMPlexCreateFromCellList

2018-02-23 Thread Danyang Su
On 18-02-23 03:04 AM, Matthew Knepley wrote: On Fri, Feb 23, 2018 at 1:33 AM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Hi All, What cell types does DMPlexCreateFromCellList support? I test this with triangle, tetrahedron and prism. Both triangle and tetrahe

[petsc-users] object name overwritten in VecView

2018-02-27 Thread Danyang Su
Hi All, How do I set different object names when calling VecView multiple times? I tried to use PetscObjectSetName for each output, but the object name is overwritten by the last one. As shown below, as well as in the enclosed example files, the vector name in sol.vtk is vec_v for both vector u and
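A sketch of the HDF5/XDMF route (illustrative names; u, v, and viewer are assumed to exist), which avoids the VTK name clash because each VecView records the name current at write time:

```fortran
! Sketch: write two named vectors into one HDF5 file.
call PetscViewerHDF5Open(PETSC_COMM_WORLD, 'test.h5', FILE_MODE_WRITE, viewer, ierr)
call PetscObjectSetName(u, 'vec_u', ierr)
call VecView(u, viewer, ierr)
call PetscObjectSetName(v, 'vec_v', ierr)
call VecView(v, viewer, ierr)
call PetscViewerDestroy(viewer, ierr)
```

The resulting test.h5 can then be converted with $PETSC_DIR/bin/petsc_gen_xdmf.py and loaded in Paraview, as suggested later in this thread.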

Re: [petsc-users] object name overwritten in VecView

2018-02-28 Thread Danyang Su
t.h5::append -v_vec_view >hdf5:test.h5::append > >which produces a file > > test.h5 > >Then I run > > $PETSC_DIR/bin/petsc_gen_xdmf.py test.h5 > >which produces another file > > test.xmf > >This can be loaded by Paraview for visualization. > > Thanks

Re: [petsc-users] object name overwritten in VecView

2018-02-28 Thread Danyang Su
all in $PETSC_DIR > On Feb 28, 2018, at 9:07 AM, Danyang Su wrote: > > Hi Matt, > > Thanks for your suggestion and I will use xmf instead. > > Regards, > > Danyang > > On February 28, 2018 3:58:08 AM PST, Matthew Knepley wrote: > On Wed, Feb 28, 20

Re: [petsc-users] how to check if cell is local owned in DMPlex

2018-03-02 Thread Danyang Su
On 18-03-02 10:58 AM, Matthew Knepley wrote: On Fri, Mar 2, 2018 at 1:41 PM, Danyang Su <mailto:danyang...@gmail.com>> wrote: On 18-02-19 03:30 PM, Matthew Knepley wrote: On Mon, Feb 19, 2018 at 3:11 PM, Danyang Su mailto:danyang...@gmail.com>> wrote: Hi Matt,

Re: [petsc-users] how to check if cell is local owned in DMPlex

2018-03-02 Thread Danyang Su
2018 at 3:00 PM, Danyang Su <mailto:danyang...@gmail.com>> wrote: On 18-03-02 10:58 AM, Matthew Knepley wrote: On Fri, Mar 2, 2018 at 1:41 PM, Danyang Su mailto:danyang...@gmail.com>> wrote: On 18-02-19 03:30 PM, Matthew Knepley wrote: On Mon, Feb 1

Re: [petsc-users] how to check if cell is local owned in DMPlex

2018-03-04 Thread Danyang Su
On 18-03-04 08:08 AM, Matthew Knepley wrote: On Fri, Mar 2, 2018 at 3:22 PM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Hi Matt, I use the latest Fortran style in PETSc 3.8. Enclosed are the PETSc configuration, code compiling log and the function that causes

Re: [petsc-users] [petsc-maint] how to check if cell is local owned in DMPlex

2018-03-05 Thread Danyang Su
show every use of the function in the source code. Very useful tool Barry On Mar 4, 2018, at 1:05 PM, Danyang Su wrote: On 18-03-04 08:08 AM, Matthew Knepley wrote: On Fri, Mar 2, 2018 at 3:22 PM, Danyang Su wrote: Hi Matt, I use the latest Fortran style in PETSc 3.8. Enclosed are the

Re: [petsc-users] [petsc-maint] how to check if cell is local owned in DMPlex

2018-03-07 Thread Danyang Su
. Thanks, Danyang On 18-03-05 11:50 AM, Smith, Barry F. wrote: MatSolverPackage became MatSolverType On Mar 5, 2018, at 1:35 PM, Danyang Su wrote: Hi Barry and Matt, The compiling problem should be caused by the PETSc version installed on my computer. When updated to PETSc-Dev version

Re: [petsc-users] [petsc-maint] how to check if cell is local owned in DMPlex

2018-03-08 Thread Danyang Su
On 18-03-07 03:54 PM, Jed Brown wrote: Danyang Su writes: Based on my test, this function works fine using current PETSc-dev version, but I cannot get it compiled correctly using other versions for Fortran code, as mentioned in the previous emails. I asked this question because some of the

[petsc-users] DMSetLabelValue takes a lot of time for large domain

2018-04-24 Thread Danyang Su
Hi All, I use DMPlex in an unstructured grid code and recently found that DMSetLabelValue takes a lot of time for large problems, e.g., number of cells > 1 million. In my code, I use DMPlexCreateFromCellList () Loop over all cells/nodes{ DMSetLabelValue } DMPlexDistribute The code works fine except

Re: [petsc-users] DMSetLabelValue takes a lot of time for large domain

2018-04-25 Thread Danyang Su
"no") (ipoint+1.0)/istart   write(*,*) ipoint, (ipoint+1.0)/istart,"time",MPI_Wtime()     end if   end if   call DMLabelSetValue(label,ipoint,ipoint+1,ierr)       CHKERRQ(ierr)     end do Thanks, Danyang On 2018-04-25 03:16 AM, Matthew

Re: [petsc-users] DMSetLabelValue takes a lot of time for large domain

2018-04-25 Thread Danyang Su
On 2018-04-25 09:47 AM, Matthew Knepley wrote: On Wed, Apr 25, 2018 at 12:40 PM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Hi Matthew, In the worst case, every node/cell may have different label. Do not use Label for this. Its not an appropriate thing. If every

[petsc-users] Get vertex index of each cell in DMPlex after distribution

2018-04-26 Thread Danyang Su
? Would you please give me a hint or functions that I can use. Thanks, Danyang On 18-04-25 02:12 PM, Danyang Su wrote: On 2018-04-25 09:47 AM, Matthew Knepley wrote: On Wed, Apr 25, 2018 at 12:40 PM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Hi Matthew, In the worst ca

Re: [petsc-users] Get vertex index of each cell in DMPlex after distribution

2018-04-27 Thread Danyang Su
On 2018-04-27 04:11 AM, Matthew Knepley wrote: On Fri, Apr 27, 2018 at 2:09 AM, Danyang Su <mailto:danyang...@gmail.com>> wrote: Hi Matt, Sorry if this is a stupid question. In the previous code for unstructured grid, I create labels to mark the original node/cell i

[petsc-users] Segmentation Violation in getting DMPlex coordinates

2018-04-27 Thread Danyang Su
Hi All, I use DMPlex and need to get coordinates back after distribution. However, I always get a segmentation violation when getting coords values in the following code if multiple processors are used. With only one processor, it works fine. For each processor, the off value starts from 0

Re: [petsc-users] Segmentation Violation in getting DMPlex coordinates

2018-04-28 Thread Danyang Su
, 2018, at 9:19 AM, Matthew Knepley wrote: On Sat, Apr 28, 2018 at 2:08 AM, Danyang Su wrote: Hi All, I use DMPlex and need to get coordinates back after distribution. However, I always get segmentation violation in getting coords values in the following codes if using multiple processors. If

[petsc-users] Makefile for mixed C++ and Fortran code

2018-06-01 Thread Danyang Su
Hi All, My code needs to link to an external C++ library (CGAL). The code is written in Fortran, and I have already written an interface to let Fortran call the C++ functions. For the sequential version without PETSc, it can be compiled without problem using the following makefile. The parallel versio

Re: [petsc-users] Makefile for mixed C++ and Fortran code

2018-06-01 Thread Danyang Su
::open(std::string&)’ out.open(strfile); Thanks, Danyang On 18-06-01 10:07 AM, Danyang Su wrote: Hi All, My code needs to link to an external C++ library (CGAL). The code is written in Fortran and I have already written interface to let Fortran call C++ function. For the sequenti

Re: [petsc-users] Makefile for mixed C++ and Fortran code

2018-06-01 Thread Danyang Su
, Barry. Still not work after including ${DLIB} On Jun 1, 2018, at 12:07 PM, Danyang Su wrote: Hi All, My code needs to link to an external C++ library (CGAL). The code is written in Fortran and I have already written interface to let Fortran call C++ function. For the sequential version

Re: [petsc-users] Makefile for mixed C++ and Fortran code

2018-06-02 Thread Danyang Su
r your compile that works and make sure those same flags are used in "PETSc version" of the makefile. You could add the flags directly to the rule %.o:%.cpp $(CLINKER) $(CXXFLAGS) $(CPPFLAGS) -c -frounding-math $< -o $@ Barry On Jun 1, 2018, at 12:37 PM, Danyang Su wrote

Re: [petsc-users] Makefile for mixed C++ and Fortran code

2018-06-02 Thread Danyang Su
. [with correct PETSC_DIR and PETSC_ARCH values] If you have issues - send the complete makefiles - and complete error log.. On Sat, 2 Jun 2018, Danyang Su wrote: Hi Barry, For the code without PETSc, the rules used to compile the code with CGAL is note: DLIB can probably be simlified - and a

[petsc-users] Question on the compiler flags in Makefile

2014-11-30 Thread Danyang Su
Hi All, I have a PETSc application that needs additional compiler flags to build a hybrid MPI-OpenMP parallel application on the WestGrid supercomputer (Canada) system. The code and makefile work fine on my local machine for both Windows and Linux, but when compiled on the WestGrid Orcinus system for

Re: [petsc-users] Question on the compiler flags in Makefile

2014-12-01 Thread Danyang Su
On 14-12-01 01:48 PM, Satish Balay wrote: On Mon, 1 Dec 2014, Danyang Su wrote: Hi All, I have a PETSc application that need additional compiling flags to build Hybrid MPI-OpenMP parallel application on WestGrid Supercomputer (Canada) system. The code and makefile work fine on my local

[petsc-users] Floating point exception

2015-04-24 Thread Danyang Su
Hi All, One of my cases crashes because of a floating point exception when using 4 processors, as shown below. But if I run this case with 1 processor, it works fine. I have tested the code with around 100 cases on up to 768 processors, and all other cases work fine. I just wonder if this kind of erro

Re: [petsc-users] Floating point exception

2015-04-24 Thread Danyang Su
On 15-04-24 11:12 AM, Barry Smith wrote: On Apr 24, 2015, at 1:05 PM, Danyang Su wrote: Hi All, One of my case crashes because of floating point exception when using 4 processors, as shown below. But if I run this case with 1 processor, it works fine. I have tested the codes with around

Re: [petsc-users] Floating point exception

2015-04-24 Thread Danyang Su
On 15-04-24 01:23 PM, Satish Balay wrote: c 4 1.0976214263087059E-067 I don't think this number can be stored in a real*4. Satish Thanks, Satish. It is caused by this number. On Fri, 24 Apr 2015, Danyang Su wrote: On 15-04-24 11:12 AM, Barry Smith wrote: On Apr 24,
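The effect can be reproduced in a few lines (a sketch; whether the conversion traps or silently flushes to zero depends on compiler flags such as gfortran's -ffpe-trap=underflow):

```fortran
! Sketch: a value around 1.1e-67 is far below the smallest positive
! real*4 (~1.18e-38 normal, ~1.4e-45 subnormal), so converting it to
! single precision underflows.
program underflow_demo
  implicit none
  real(kind=8) :: d
  real(kind=4) :: s
  d = 1.0976214263087059d-67
  s = real(d, kind=4)   ! 0.0 by default; with underflow trapping
                        ! enabled, this assignment raises the FPE
  print *, s
end program underflow_demo
```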

Re: [petsc-users] Floating point exception

2015-04-24 Thread Danyang Su
ebugger [2]PETSC ERROR: likely location of problem given in stack below Thanks, Danyang On 15-04-24 01:54 PM, Danyang Su wrote: On 15-04-24 01:23 PM, Satish Balay wrote: c 4 1.0976214263087059E-067 I don't think this number can be stored in a real*4. Satish Thanks, Satis

Re: [petsc-users] Floating point exception

2015-04-25 Thread Danyang Su
timeloop_ at timeloop.F90:1194 #24 0x5ABFD7 in driver_pc at driver_pc.F90:599 #24 0x5ABFD7 in driver_pc at driver_pc.F90:599 #24 0x5ABFD7 in driver_pc at driver_pc.F90:599 On 15-04-24 11:12 AM, Barry Smith wrote: On Apr 24, 2015, at 1:05 PM, Danyang Su wrote: Hi All, One of my cases crashes

Re: [petsc-users] Floating point exception

2015-04-25 Thread Danyang Su
On 15-04-25 11:55 AM, Barry Smith wrote: On Apr 25, 2015, at 1:51 PM, Danyang Su wrote: On 15-04-25 11:32 AM, Barry Smith wrote: I told you this yesterday. It is probably stopping here on a harmless underflow. You need to edit the PETSc code to not worry about underflow. Edit the
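Barry's advice is to stop treating underflow as an error while still trapping the dangerous exceptions. On Linux/glibc the same idea can be sketched with the GNU extension `feenableexcept` (this is an illustration of the masking idea, not PETSc's actual fp.c code; `feenableexcept`/`fegetexcept` are glibc-specific):

```c
#define _GNU_SOURCE
#include <assert.h>
#include <fenv.h>

int main(void)
{
    /* Trap the genuinely dangerous exceptions, but deliberately
       leave FE_UNDERFLOW untrapped so a harmless underflow is
       flushed to zero instead of aborting the run with SIGFPE. */
    feenableexcept(FE_DIVBYZERO | FE_INVALID | FE_OVERFLOW);

    int trapped = fegetexcept();
    assert(trapped & FE_DIVBYZERO);
    assert(!(trapped & FE_UNDERFLOW));

    /* 1e-300 / 1e300 = 1e-600: below the double range, so it
       underflows to zero -- and, untrapped, execution continues. */
    volatile double tiny = 1e-300, huge = 1e300;
    volatile double r = tiny / huge;
    assert(r == 0.0);
    return 0;
}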

Re: [petsc-users] Floating point exception

2015-04-25 Thread Danyang Su
Danyang On Apr 25, 2015, at 2:24 PM, Danyang Su wrote: On 15-04-25 11:55 AM, Barry Smith wrote: On Apr 25, 2015, at 1:51 PM, Danyang Su wrote: On 15-04-25 11:32 AM, Barry Smith wrote: I told you this yesterday. It is probably stopping here on a harmless underflow. You need to

Re: [petsc-users] Floating point exception

2015-04-25 Thread Danyang Su
On 15-04-25 06:26 PM, Matthew Knepley wrote: On Sat, Apr 25, 2015 at 8:23 PM, Danyang Su wrote: On 15-04-25 06:03 PM, Barry Smith wrote: If this is what you got in your last run at ../../gas_advection/velocity_g

Re: [petsc-users] Floating point exception

2015-04-25 Thread Danyang Su
Perhaps you have multiple PETSC_ARCH or multiple PETSc installs, which would explain why you reported two different places where the exception occurred. On Apr 25, 2015, at 8:31 PM, Danyang Su wrote: On 15-04-25 06:26 PM, Matthew Knepley wrote: On Sat, Apr 25, 2015 at 8:23 PM, Danyang Su wrote: On 1

Re: [petsc-users] Floating point exception

2015-04-27 Thread Danyang Su
. For 3.5.3, I edited the fp.c file and then ran configure and make. Thanks, Danyang On 15-04-25 07:34 PM, Danyang Su wrote: Hi All, The "floating point underflow" is caused by a small value divided by a very large value. This result is forced to zero and then it does not report any underfl

Re: [petsc-users] Floating point exception

2015-04-28 Thread Danyang Su
o the development version of PETSc (that uses the latest version of hypre), here are the instructions on how to obtain it http://www.mcs.anl.gov/petsc/developers/index.html Please let us know if this resolves the problem with hypre failing. Barry On Apr 27, 2015, at 11:44 AM, Danya

Re: [petsc-users] Floating point exception in hypre BoomerAMG

2015-04-28 Thread Danyang Su
x[k] * A[j*n+k]; } } } x[0] /= A[0]; return(err_flag); } } On Apr 28, 2015, at 12:55 PM, Danyang Su wrote: Hi Barry, The development version of PETSc does not help to solve my problem. It still crashed due to the same error information. As Matthew menti

Re: [petsc-users] Floating point exception in hypre BoomerAMG

2015-04-29 Thread Danyang Su
. Thanks, Danyang On Apr 28, 2015, at 7:30 PM, Danyang Su wrote: Hi Barry, There seems to be another bug (not quite sure) in PETSc-dev, as shown below. The case I used is similar to the one I mentioned recently. I have no problem running this case using PETSc 3.5.2, but it gives out the following

Re: [petsc-users] Floating point exception in hypre BoomerAMG

2015-04-29 Thread Danyang Su
On 15-04-29 11:30 AM, Barry Smith wrote: On Apr 29, 2015, at 12:15 PM, Danyang Su wrote: On 15-04-28 06:50 PM, Barry Smith wrote: We started enforcing more checks on writing to vectors that you should not write to. Where are you calling DMLocalToGlobalBegin() ? It looks like you are

Re: [petsc-users] Floating point exception in hypre BoomerAMG

2015-04-29 Thread Danyang Su
VecGetArrayF90(). All the VecGetArrayF90() calls come with VecRestoreArrayF90(). No VecGetArray() is used in my code. Danyang On Apr 29, 2015, at 1:52 PM, Danyang Su wrote: On 15-04-29 11:30 AM, Barry Smith wrote: On Apr 29, 2015, at 12:15 PM, Danyang Su wrote: On 15-04-28 06:50 PM, Barry Smith

Re: [petsc-users] Floating point exception in hypre BoomerAMG

2015-04-29 Thread Danyang Su
breakpoints on VecLockPush or VecLockPop, type bt, then type cont, and send all the output; this will tell us where the vector got locked read-only and did not get unlocked. Barry On Apr 29, 2015, at 2:35 PM, Danyang Su wrote: On 15-04-29 12:19 PM, Barry Smith wrote: Ok, your code seems fine in

[petsc-users] Questions on 1 billion unknowns and 64-bit-indices

2015-04-30 Thread Danyang Su
Dear All, I have run my code successfully with up to 100 million total unknowns using 1000 processors on the WestGrid Jasper Cluster, Canada. But when I scale the unknowns up to 1 billion, the code crashes with the following error. It's out of memory. Error message from valgrind output ==9344=
