Re: [petsc-users] Question on ISLocalToGlobalMappingGetIndices Fortran Interface

2023-05-08 Thread Danyang Su
Thanks, Mark. Yes, it actually works when I update to ISLocalToGlobalMappingGetIndicesF90. I made a mistake reporting this does not work. Danyang From: Mark Adams Date: Monday, May 8, 2023 at 7:22 PM To: Danyang Su Cc: petsc-users Subject: Re: [petsc-users] Question
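
A minimal Fortran sketch of the F90 variant that resolved this thread; the mapping name ltogm comes from the thread, everything else is illustrative:

    ! a minimal sketch, assuming ltogm was created elsewhere
    PetscInt, pointer :: ltog(:)
    PetscErrorCode    :: ierr
    call ISLocalToGlobalMappingGetIndicesF90(ltogm, ltog, ierr)
    ! ltog(1:n) now holds the global index of each local point,
    ! with proper Fortran bounds (no out-of-range idltog offset)
    call ISLocalToGlobalMappingRestoreIndicesF90(ltogm, ltog, ierr)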

[petsc-users] Question on ISLocalToGlobalMappingGetIndices Fortran Interface

2023-05-08 Thread Danyang Su
Dear PETSc-Users, Are there any changes to the ISLocalToGlobalMappingGetIndices function after PETSc 3.17? In previous PETSc versions (<= 3.17), the function ‘ISLocalToGlobalMappingGetIndices(ltogm,ltog,idltog,ierr)’ works fine, even though the value of idltog looks out of bounds

Re: [petsc-users] Fortran preprocessor not working in petsc-dev

2023-05-08 Thread Danyang Su
FPPFLAGS = ...
include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules

to

include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules
FPPFLAGS = ...

But this is fixed in the latest release and main branches. Satish On Sun, 7 May 2023, Danyang

Re: [petsc-users] Fortran preprocessor not working in petsc-dev

2023-05-07 Thread Danyang Su
Balay" mailto:ba...@mcs.anl.gov>> wrote: On Sat, 6 May 2023, Danyang Su wrote: > Hi All, > > > > My code has some FPP. It works fine in PETSc 3.18 and earlier version, but > stops working in the latest PETSc-Dev. For example the following FPP > STANDARD_FORTRAN

Re: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox?

2023-04-30 Thread Danyang Su
] PETSC ERROR in DMGetLocalBoundingBox? Hi Matt, I am following up to check if you can reproduce the problem on your side. Thanks and have a great weekend, Danyang From: Danyang Su Sent: March 4, 2023 4:38 PM To: Matthew Knepley Cc: petsc-users@mcs.anl.gov Subject: Re: [petsc

Re: [petsc-users] PETSC ERROR in DMGetLocalBoundingBox?

2023-03-04 Thread Danyang Su
Hi Matt, Attached is the source code and example. I have deleted most of the unused source code but it is still a bit lengthy. Sorry about that. The errors come after DMGetLocalBoundingBox and DMGetBoundingBox. To compile the code, please type 'make exe' and the executable file

Re: [petsc-users] Cmake problem on an old cluster

2023-01-19 Thread Danyang Su
riable that's set. Glad you have a working build now. Thanks for the update! BTW: superlu-dist requires cmake 3.18.1 or higher. You could check if this older version of cmake builds on this cluster [if you want to give superlu-dist a try again] Satish On Thu, 19 Jan 2023, Danya

Re: [petsc-users] Cmake problem on an old cluster

2023-01-19 Thread Danyang Su
the build? Perhaps you could start with a fresh copy of petsc and retry? Also suggest using 'arch-' prefix for PETSC_ARCH i.e 'arch-intel-14.0.2-openmpi-1.6.5' - just in case there are some bugs lurking with skipping build files in this location Satish On Thu, 19 Jan 2023, Danyang Su wrote: Hi

Re: [petsc-users] Error running configure on HDF5 in PETSc-3.18.3

2023-01-06 Thread Danyang Su
>>>> Executing: gfortran --version stdout: GNU Fortran (GCC) 8.2.0 <<<< We generally use brew gfortran - and that works with hdf5 as well balay@ypro ~ % gfortran --version GNU Fortran (Homebrew GCC 11.2.0_1) 11.2.0 Satish On Fri, 6 Jan 2023, Danyang Su wrote: > Hi

Re: [petsc-users] Error running configure on HDF5 in PETSc-3.18.3

2023-01-06 Thread Danyang Su
environment activated. However, since my code requires the Fortran interface to HDF5, I do need ‘--with-hdf5-fortran-bindings’; otherwise, my code cannot be compiled. Any other suggestions? Thanks, Danyang From: Pierre Jolivet Date: Friday, January 6, 2023 at 7:59 AM To: Danyang Su Cc

Re: [petsc-users] Fortran HDF5 Cannot be found in PETSc-3.16

2022-02-28 Thread Danyang Su
Thanks, Barry. It works now. Danyang On 2022-02-28 9:59 a.m., Barry Smith wrote: You need the additional configure option --download-hdf5-fortran-bindings Please make sure you have the latest 3.16.4 Barry On Feb 28, 2022, at 12:42 PM, Danyang Su wrote: Hi All, Does anyone
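
For reference, a sketch of the configure pattern implied here; other options are elided, and --download-hdf5 is assumed since the bindings option applies to the PETSc-built HDF5:

    ./configure --download-hdf5 --download-hdf5-fortran-bindings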

[petsc-users] Fortran HDF5 Cannot be found in PETSc-3.16

2022-02-28 Thread Danyang Su
Hi All, Has anyone encountered the problem where HDF5-related Fortran code cannot be compiled in PETSc-3.16 because 'use hdf5' cannot find the required module file? Compared to HDF5-1.12.0 and earlier versions, some module/object files (e.g., hdf5.mod, hdf5.o) are missing from HDF5-1.12.1 in PETSc-3.16.

Re: [petsc-users] PETSc configuration error on macOS Monterey with Intel oneAPI

2022-01-13 Thread Danyang Su
Hi Samar, Yes, with mpich, there is no such error. I will just use this configuration for now. Thanks, Danyang From: Samar Khatiwala Date: Thursday, January 13, 2022 at 1:16 AM To: Danyang Su Cc: PETSc Subject: Re: [petsc-users] PETSc configuration error on macOS Monterey
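
A sketch of the working configuration, combining the options from the original report below with --download-mpich, which is the only assumed change:

    ./configure --with-cc=icc --with-cxx=icpc --with-fc=ifort \
        --with-blas-lapack-dir=/opt/intel/oneapi/mkl/2022.0.0/lib/ \
        --with-debugging=1 PETSC_ARCH=macos-intel-dbg --download-mpich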

Re: [petsc-users] PETSc configuration error on macOS Monterey with Intel oneAPI

2022-01-12 Thread Danyang Su
01 PM To: Danyang Su Cc: PETSc Subject: Re: [petsc-users] PETSc configuration error on macOS Monterey with Intel oneAPI Hi Danyang, I had trouble configuring PETSc on MacOS Monterey with ifort when using mpich (which I was building myself). I tracked it down to an errant "-Wl,-flat

[petsc-users] PETSc configuration error on macOS Monterey with Intel oneAPI

2022-01-12 Thread Danyang Su
Hi All, I got an error in PETSc configuration on macOS Monterey with Intel oneAPI using the following options: ./configure --with-cc=icc --with-cxx=icpc --with-fc=ifort --with-blas-lapack-dir=/opt/intel/oneapi/mkl/2022.0.0/lib/ --with-debugging=1 PETSC_ARCH=macos-intel-dbg --download-mumps

[petsc-users] Is old ex10.c (separated matrix and rhs) deprecated?

2022-01-10 Thread Danyang Su
Hi All, Back in PETSc 3.8, the example ex10.c supported reading the matrix and vector from separate files. Is this feature deprecated in newer PETSc versions? I have some matrices and rhs to test but could not use the ex10 example under the new PETSc version. Thanks, Danyang

Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-11 Thread Danyang Su
a newer MPI version. /home/danyangs/soft/petsc/petsc-3.14.6/intel-14.0.2-openmpi-1.6.5/lib/libpetsc.so: undefined reference to `MPI_Iallreduce' Thanks again for all your help, Danyang From: Junchao Zhang Date: Sunday, April 11, 2021 at 7:54 AM To: Danyang Su Cc: Barry Smith , "

Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-10 Thread Danyang Su
2.1.6 version and it seems to work properly. Thanks and have a good rest of the weekend, Danyang From: Danyang Su Date: Saturday, April 10, 2021 at 4:08 PM To: Junchao Zhang Cc: Barry Smith , "petsc-users@mcs.anl.gov" Subject: Re: [petsc-users] Undefined reference in

Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-10 Thread Danyang Su
]: *** [ex5f] Error 1 Thanks, Danyang From: Junchao Zhang Date: Saturday, April 10, 2021 at 3:57 PM To: Danyang Su Cc: Barry Smith , "petsc-users@mcs.anl.gov" Subject: Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version You sent a wrong one. This con

Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-10 Thread Danyang Su
Hi Junchao, Thanks. I will test this branch and get back to you later. All the best, Danyang From: Junchao Zhang Date: Saturday, April 10, 2021 at 3:32 PM To: Danyang Su Cc: Barry Smith , "petsc-users@mcs.anl.gov" Subject: Re: [petsc-users] Undefined reference in

Re: [petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-10 Thread Danyang Su
Thanks, Danyang From: Barry Smith Date: Saturday, April 10, 2021 at 10:31 AM To: Danyang Su Cc: "petsc-users@mcs.anl.gov" Subject: Re: [petsc-users] Undefined reference in PETSc 3.13+ with old

[petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

2021-04-10 Thread Danyang Su
Dear PETSc developers and users, I am trying to install the latest PETSc version on an ancient cluster. The OpenMPI version is 1.6.5 and the compiler is Intel 14.0, which are the newest available on that cluster. I have no problem installing PETSc up to version 3.12.5. However, if I try to use PETSc

Re: [petsc-users] Quite different behaviours of PETSc solver on different clusters

2020-10-29 Thread Danyang Su
Hi Matt, No, iterations from both linear and nonlinear solvers are similar. The system administrator suspects that latency in MPICH makes the difference. We will test a PETSc build with OpenMPI on that cluster to check whether it makes a difference. Thanks, Danyang On October 29, 2020 6:05:53

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-14 Thread Danyang Su
: Can you reproduce in C? You're missing three extra arguments that exist in the Fortran interface. https://support.hdfgroup.org/HDF5/doc/RM/RM_H5D.html#Dataset-Create Danyang Su writes: > Hi Jed, > > Attached is the example for your test. >

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-13 Thread Danyang Su
, "Jed Brown" wrote: Can you reproduce in C? You're missing three extra arguments that exist in the Fortran interface. https://support.hdfgroup.org/HDF5/doc/RM/RM_H5D.html#Dataset-Create Danyang Su writes: > Hi Jed, > > Attached is the ex

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-12 Thread Danyang Su
; Works fine Centos-7 + Intel2018 + HDF5-1.12.0 -> Works fine Possible error when the code crashes: At line 6686 of file H5_gen.F90 Fortran runtime error: Index '1' of dimension 1 of array 'buf' above upper bound of 0 Thanks, Danyang On 2020-06-12, 6:05 AM, "Jed Brown" wrote

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-12 Thread Danyang Su
pace,& mem_space_id=memspace, xfer_prp = xlist_id) Please let me know if there is something wrong in the code that causes the problem. Thanks, Danyang On 2020-06-11, 8:32 PM, "Jed Brown" wrote: Danyang Su writes: > Hi Barry, > > The HDF5 calls fail. I reconf

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-11 Thread Danyang Su
g HDF5 calls that fail or is it PETSc routines calling HDF5 that fail? > Regardless it sounds like the easiest fix is to switch back to the > previous HDF5 and wait for HDF5 to fix what sounds to be a bug. > Barry >> On Jun 11, 2020, at 1:05 AM, Danyang Su

[petsc-users] FW: Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-11 Thread Danyang Su
Hi All, Sorry for accidentally sending the previous incomplete email. After updating to HDF5-1.12.0, I get problems when some processors have no data to write or do not need to write. Since parallel writing is collective, I cannot exclude those processors from writing. For the old

[petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-11 Thread Danyang Su
Hi All, After updating to HDF5-1.12.0, I get problems when some processors have no data to write or do not need to write. Since parallel writing is collective, I cannot exclude those processors from writing. With the old version, there seems to be no such problem. So far, the problem only
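
One common pattern for this situation, sketched under the assumption that filespace and memspace have already been created on every rank: ranks with nothing to write select an empty region and still join the collective write. The names nlocal, dset_id, and xlist_id follow the thread's snippets; the rest is illustrative.

    ! ranks with nothing to write select no elements but still
    ! participate in the collective h5dwrite_f call
    if (nlocal == 0) then
        call h5sselect_none_f(filespace, ierr)
        call h5sselect_none_f(memspace, ierr)
    end if
    call h5dwrite_f(dset_id, H5T_NATIVE_DOUBLE, buf, dims, ierr, &
                    file_space_id=filespace, mem_space_id=memspace, &
                    xfer_prp=xlist_id)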

Re: [petsc-users] Bug in ex14f.F90 when debug flags are used?

2020-06-05 Thread Danyang Su
xed to use VecGetArrayF90].. Satish On Fri, 5 Jun 2020, Danyang Su wrote: > Hi All, > I have a question regarding the following example. > https://www.mcs.anl.gov/petsc/petsc-current/src/k

[petsc-users] Bug in ex14f.F90 when debug flags are used?

2020-06-05 Thread Danyang Su
Hi All, I have a question regarding the following example. https://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/tutorials/ex14f.F90.html When debug flags are used in the makefile, the code crashes with the following error. At line 335 of file ex14f.F90 Fortran runtime error:
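
The fix referenced in the reply above is to use the F90 array interface instead of the old single-element VecGetArray idiom, whose offset arithmetic trips the bounds checking that debug flags enable. A minimal sketch; all names are illustrative:

    PetscScalar, pointer :: xx(:)
    call VecGetArrayF90(x, xx, ierr)
    xx(1) = 1.0   ! xx(1:n) aliases the local part of x, bounds-safe
    call VecRestoreArrayF90(x, xx, ierr)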

Re: [petsc-users] Agglomeration for Multigrid on Unstructured Meshes

2020-06-01 Thread Danyang Su
al problem or just on the "coarse" problem of an initial geometric hierarchy. Danyang Su writes: > Dear All, > I recalled there was a presentation ‘Extreme-scale multigrid components with PETSc’ talking about agglomeration in p

[petsc-users] Agglomeration for Multigrid on Unstructured Meshes

2020-06-01 Thread Danyang Su
Dear All, I recall there was a presentation, ‘Extreme-scale multigrid components with PETSc’, talking about agglomeration in parallel multigrid, with a future plan to extend support to unstructured meshes. Is this under development or still to be added? Thanks and regards, Danyang

Re: [petsc-users] Domain decomposition using DMPLEX

2019-11-26 Thread Danyang Su
On 2019-11-26 10:18 a.m., Matthew Knepley wrote: On Tue, Nov 26, 2019 at 11:43 AM Danyang Su <danyang...@gmail.com> wrote: On 2019-11-25 7:54 p.m., Matthew Knepley wrote: On Mon, Nov 25, 2019 at 6:25 PM Swarnava Ghosh <swarnav...@gmail.com> wrote:

Re: [petsc-users] Domain decomposition using DMPLEX

2019-11-26 Thread Danyang Su
On 2019-11-25 7:54 p.m., Matthew Knepley wrote: On Mon, Nov 25, 2019 at 6:25 PM Swarnava Ghosh wrote: Dear PETSc users and developers, I am working with dmplex to distribute a 3D unstructured mesh made of tetrahedrons in a cuboidal domain. I had a few

Re: [petsc-users] DMPlex memory problem in scaling test

2019-10-10 Thread Danyang Su via petsc-users
tine B !c get local mesh DM and set coordinates call DMGetCoordinatesLocal(dmda_flow%da,gc,ierr) CHKERRQ(ierr) call DMGetCoordinateDM(dmda_flow%da,cda,ierr) CHKERRQ(ierr) Thanks, Danyang On 2019-10-10 6:15 p.m., Matthew Knepley wrote: On Thu, Oct 10, 2019 at 9:

Re: [petsc-users] Makefile change for PETSc3.12.0???

2019-10-02 Thread Danyang Su via petsc-users
On 2019-10-02 11:00 a.m., Balay, Satish wrote: Can you retry with this fix: https://gitlab.com/petsc/petsc/commit/3ae65d51d08dba2e118033664acfd64a46c9bf1d [You can use maint branch for it] Satish This works. Thanks. Danyang On Wed, 2 Oct 2019, Danyang Su via petsc-users wrote: Dear

Re: [petsc-users] Error running configure on SOWING

2019-09-10 Thread Danyang Su via petsc-users
= something On Sep 10, 2019, at 1:11 PM, Danyang Su wrote: Sorry I forgot to attached the log file. Attached are the log files using the following configuration: ./configure COPTFLAGS="-march=native -O2" CXXOPTFLAGS="-march=native -O2" FOPTFLAGS="-march=native -O2" --with

[petsc-users] Error in creating compressed data using HDF5

2019-09-02 Thread Danyang Su via petsc-users
Dear All, Not sure if this is the right place to ask an hdf5 question. I installed hdf5 through the PETSc configuration --download-hdf5=yes. The code runs without problem except for the function to create compressed data (shown below). !c create local memory space and hyperslab call
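
A hedged sketch of the compressed-dataset creation the message describes, using the standard HDF5 Fortran property-list calls; all identifiers are illustrative. Worth noting: HDF5 versions before 1.10.2 cannot write to compressed datasets in parallel, which may be relevant here.

    ! dataset creation property list with chunking and gzip compression
    call h5pcreate_f(H5P_DATASET_CREATE_F, crp_id, ierr)
    call h5pset_chunk_f(crp_id, ndims, chunk_dims, ierr)
    call h5pset_deflate_f(crp_id, 6, ierr)   ! gzip level 6
    call h5dcreate_f(file_id, "dataset", H5T_NATIVE_DOUBLE, filespace, &
                     dset_id, ierr, crp_id)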

Re: [petsc-users] different dof in DMDA creating

2019-08-15 Thread Danyang Su via petsc-users
and regards, Danyang On 2019-02-07 1:53 p.m., Danyang Su wrote: Thanks, Barry. DMPlex also works for my code. Danyang On 2019-02-07 1:14 p.m., Smith, Barry F. wrote: No, you would need to use the more flexible DMPlex On Feb 7, 2019, at 3:04 PM, Danyang Su via petsc-users wrote: Dear

[petsc-users] Strange compiling error in DMPlexDistribute after updating PETSc to V3.11.0

2019-04-05 Thread Danyang Su via petsc-users
Hi All, I got a strange error calling DMPlexDistribute after updating PETSc to V3.11.0. There seems to be no change in the interface of DMPlexDistribute as documented in https://www.mcs.anl.gov/petsc/petsc-3.10/docs/manualpages/DMPLEX/DMPlexDistribute.html#DMPlexDistribute
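
For comparison, a hedged sketch of the Fortran call as it looks around v3.11, where a PetscSF argument sits between the overlap and the output DM; an extra argument like this is a common source of 'unchanged interface' compile errors. PETSC_NULL_SF is assumed to exist in the installed version's Fortran bindings, and all names are illustrative:

    ! distribute with overlap 0, discarding the migration SF
    call DMPlexDistribute(dm, 0, PETSC_NULL_SF, dmDist, ierr)
    ! on more than one rank dmDist replaces dm; on a single rank
    ! dmDist comes back null and dm should be kept as-is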

Re: [petsc-users] Installation error on macOS Mojave using GNU compiler

2019-01-04 Thread Danyang Su via petsc-users
On 2019-01-03, 4:59 PM, "Balay, Satish" wrote: On Thu, 3 Jan 2019, Matthew Knepley via petsc-users wrote: > On Thu, Jan 3, 2019 at 7:02 PM Danyang Su via petsc-users < > petsc-users@mcs.anl.gov> wrote: > > Hi All,

Re: [petsc-users] [petsc-maint] Fwd: DMPlex global to natural problem using DmPlexGetVertexNumbering or DMPlexGlobalToNatural

2018-11-29 Thread Danyang Su via petsc-users
On 18-11-29 06:13 PM, Matthew Knepley wrote: On Thu, Nov 29, 2018 at 7:40 PM Danyang Su via petsc-maint <petsc-ma...@mcs.anl.gov> wrote: Dear PETSc developers & users, Sorry to bother you again. I just encountered some difficulties in DMPlex global to nat

Re: [petsc-users] Makefile for mixed C++ and Fortran code

2018-06-03 Thread Danyang Su
. [with correct PETSC_DIR and PETSC_ARCH values] If you have issues - send the complete makefiles - and complete error log.. On Sat, 2 Jun 2018, Danyang Su wrote: Hi Barry, For the code without PETSc, the rules used to compile the code with CGAL are (note: DLIB can probably be simplified

Re: [petsc-users] Makefile for mixed C++ and Fortran code

2018-06-02 Thread Danyang Su
r your compile that works and make sure those same flags are used in the "PETSc version" of the makefile. You could add the flags directly to the rule:
%.o: %.cpp
	$(CLINKER) $(CXXFLAGS) $(CPPFLAGS) -c -frounding-math $< -o $@
Barry On Jun 1, 2018, at 12:37 PM, Danyang Su

Re: [petsc-users] Makefile for mixed C++ and Fortran code

2018-06-01 Thread Danyang Su
, Barry. Still does not work after including ${DLIB} On Jun 1, 2018, at 12:07 PM, Danyang Su wrote: Hi All, My code needs to link to an external C++ library (CGAL). The code is written in Fortran and I have already written an interface to let Fortran call C++ functions. For the sequential version

Re: [petsc-users] Makefile for mixed C++ and Fortran code

2018-06-01 Thread Danyang Su
::open(std::string&)’ out.open(strfile); Thanks, Danyang On 18-06-01 10:07 AM, Danyang Su wrote: Hi All, My code needs to link to an external C++ library (CGAL). The code is written in Fortran and I have already written interface to let Fortran call C++ function. For the sequenti

[petsc-users] Makefile for mixed C++ and Fortran code

2018-06-01 Thread Danyang Su
Hi All, My code needs to link to an external C++ library (CGAL). The code is written in Fortran and I have already written an interface to let Fortran call C++ functions. For the sequential version without PETSc, it can be compiled without problem using the following makefile. The parallel

Re: [petsc-users] Segmentation Violation in getting DMPlex coordinates

2018-04-28 Thread Danyang Su
, 2018, at 9:19 AM, Matthew Knepley <knep...@gmail.com> wrote: On Sat, Apr 28, 2018 at 2:08 AM, Danyang Su <danyang...@gmail.com> wrote: Hi All, I use DMPlex and need to get coordinates back after distribution. However, I always get segmentation violation in getting coords values in t

[petsc-users] Segmentation Violation in getting DMPlex coordinates

2018-04-28 Thread Danyang Su
Hi All, I use DMPlex and need to get coordinates back after distribution. However, I always get a segmentation violation when getting coords values in the following code if multiple processors are used. With only one processor, it works fine. For each processor, the offset value starts from 0
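
A hedged sketch of reading distributed coordinates; one frequent pitfall, hinted at by the 0-based offset values mentioned above, is mixing the coordinate section's 0-based offsets with Fortran's 1-based arrays. All names are illustrative:

    call DMGetCoordinatesLocal(dm, gc, ierr)
    call DMGetCoordinateSection(dm, cs, ierr)
    call VecGetArrayF90(gc, coords, ierr)
    ! vertex is the plex point number of the vertex of interest
    call PetscSectionGetOffset(cs, vertex, off, ierr)
    ! off is 0-based while coords() is 1-based: the first coordinate
    ! of this vertex is coords(off+1), not coords(off)
    x = coords(off + 1)
    call VecRestoreArrayF90(gc, coords, ierr)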

Re: [petsc-users] Get vertex index of each cell in DMPlex after distribution

2018-04-27 Thread Danyang Su
On 2018-04-27 04:11 AM, Matthew Knepley wrote: On Fri, Apr 27, 2018 at 2:09 AM, Danyang Su <danyang...@gmail.com> wrote: Hi Matt, Sorry if this is a stupid question. In the previous code for unstructured grids, I create labels to mark

[petsc-users] Get vertex index of each cell in DMPlex after distribution

2018-04-27 Thread Danyang Su
? Would you please give me a hint or the functions that I can use? Thanks, Danyang On 18-04-25 02:12 PM, Danyang Su wrote: On 2018-04-25 09:47 AM, Matthew Knepley wrote: On Wed, Apr 25, 2018 at 12:40 PM, Danyang Su <danyang...@gmail.com> wrote:

Re: [petsc-users] DMSetLabelValue takes a lot of time for large domain

2018-04-25 Thread Danyang Su
On 2018-04-25 09:47 AM, Matthew Knepley wrote: On Wed, Apr 25, 2018 at 12:40 PM, Danyang Su <danyang...@gmail.com> wrote: Hi Matthew, In the worst case, every node/cell may have a different label. Do not use Label for this. It's not an appro

Re: [petsc-users] DMSetLabelValue takes a lot of time for large domain

2018-04-25 Thread Danyang Su
  write(*,*) ipoint, (ipoint+1.0)/istart,"time",MPI_Wtime()     end if   end if   call DMLabelSetValue(label,ipoint,ipoint+1,ierr)   CHKERRQ(ierr)     end do Thanks, Danyang On 2018-04-25 03:16 AM, Matthew Knepley wrote: On Tue, Apr 24, 2018 at 11:57 PM,

[petsc-users] DMSetLabelValue takes a lot of time for large domain

2018-04-24 Thread Danyang Su
Hi All, I use DMPlex in an unstructured grid code and recently found that DMSetLabelValue takes a lot of time for large problems, e.g., more than 1 million cells. In my code, I use DMPlexCreateFromCellList(), loop over all cells/nodes calling DMSetLabelValue, then DMPlexDistribute. The code works fine
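
The follow-ups in this thread switch from DMSetLabelValue, which looks the label up by name on every call, to fetching the DMLabel once and calling DMLabelSetValue inside the loop. A condensed sketch of that pattern; 'marker' is an illustrative label name:

    call DMCreateLabel(dm, 'marker', ierr)
    call DMGetLabel(dm, 'marker', label, ierr)
    do ipoint = istart, iend - 1
        ! no per-call name lookup inside the hot loop
        call DMLabelSetValue(label, ipoint, ipoint + 1, ierr)
    end do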

Re: [petsc-users] [petsc-maint] how to check if cell is local owned in DMPlex

2018-03-08 Thread Danyang Su
On 18-03-07 03:54 PM, Jed Brown wrote: Danyang Su <danyang...@gmail.com> writes: Based on my test, this function works fine using current PETSc-dev version, but I cannot get it compiled correctly using other versions for Fortran code, as mentioned in the previous emails. I asked this qu

Re: [petsc-users] [petsc-maint] how to check if cell is local owned in DMPlex

2018-03-07 Thread Danyang Su
. Thanks, Danyang On 18-03-05 11:50 AM, Smith, Barry F. wrote: MatSolverPackage became MatSolverType On Mar 5, 2018, at 1:35 PM, Danyang Su <danyang...@gmail.com> wrote: Hi Barry and Matt, The compiling problem should be caused by the PETSc version installed on my computer. When u

Re: [petsc-users] [petsc-maint] how to check if cell is local owned in DMPlex

2018-03-05 Thread Danyang Su
will show every use of the function in the source code. Very useful tool Barry On Mar 4, 2018, at 1:05 PM, Danyang Su <danyang...@gmail.com> wrote: On 18-03-04 08:08 AM, Matthew Knepley wrote: On Fri, Mar 2, 2018 at 3:22 PM, Danyang Su <danyang...@gmail.com> wrote: Hi Matt, I us

Re: [petsc-users] how to check if cell is local owned in DMPlex

2018-03-04 Thread Danyang Su
On 18-03-04 08:08 AM, Matthew Knepley wrote: On Fri, Mar 2, 2018 at 3:22 PM, Danyang Su <danyang...@gmail.com> wrote: Hi Matt, I use the latest Fortran style in PETSc 3.8. Enclosed are the PETSc configuration, code compiling log and

Re: [petsc-users] how to check if cell is local owned in DMPlex

2018-03-02 Thread Danyang Su
, Danyang Su <danyang...@gmail.com> wrote: On 18-03-02 10:58 AM, Matthew Knepley wrote: On Fri, Mar 2, 2018 at 1:41 PM, Danyang Su <danyang...@gmail.com> wrote: On 18-02-19 03:30 P

Re: [petsc-users] how to check if cell is local owned in DMPlex

2018-03-02 Thread Danyang Su
On 18-03-02 10:58 AM, Matthew Knepley wrote: On Fri, Mar 2, 2018 at 1:41 PM, Danyang Su <danyang...@gmail.com> wrote: On 18-02-19 03:30 PM, Matthew Knepley wrote: On Mon, Feb 19, 2018 at 3:11 PM, Danyang Su <danyang...@gmail.com

Re: [petsc-users] how to check if cell is local owned in DMPlex

2018-03-02 Thread Danyang Su
On 18-03-02 10:58 AM, Matthew Knepley wrote: On Fri, Mar 2, 2018 at 1:41 PM, Danyang Su <danyang...@gmail.com> wrote: On 18-02-19 03:30 PM, Matthew Knepley wrote: On Mon, Feb 19, 2018 at 3:11 PM, Danyang Su <danyang...@gmail.com

Re: [petsc-users] object name overwritten in VecView

2018-02-28 Thread Danyang Su
all in $PETSC_DIR > On Feb 28, 2018, at 9:07 AM, Danyang Su <danyang...@gmail.com> wrote: > > Hi Matt, > > Thanks for your suggestion and I will use xmf instead. > > Regards, > > Danyang > > On February 28, 2018 3:58:08 AM PST, Matthew Knepley <knep...@g

Re: [petsc-users] object name overwritten in VecView

2018-02-28 Thread Danyang Su
> -dm_view hdf5:test.h5 -u_vec_view hdf5:test.h5::append -v_vec_view hdf5:test.h5::append > which produces a file > test.h5 > Then I run > $PETSC_DIR/bin/petsc_gen_xdmf.py test.h5 > which produces another file > test.xmf > This

[petsc-users] object name overwritten in VecView

2018-02-27 Thread Danyang Su
Hi All, How do I set different object names when using multiple VecView calls? I tried to use PetscObjectSetName with multiple outputs, but the object name is overwritten by the last one. As shown below, as well as in the enclosed example files, the vector name in sol.vtk is vec_v for both vector u
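
The naming pattern under discussion, sketched in Fortran with illustrative names: each vector is named immediately before its own VecView. Whether the names survive depends on the viewer format; the thread's eventual resolution was to switch from the VTK viewer to HDF5 plus petsc_gen_xdmf.py.

    call PetscObjectSetName(u, 'vec_u', ierr)
    call VecView(u, viewer, ierr)
    call PetscObjectSetName(v, 'vec_v', ierr)
    call VecView(v, viewer, ierr)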

Re: [petsc-users] Cell type for DMPlexCreateFromCellList

2018-02-23 Thread Danyang Su
On 18-02-23 03:04 AM, Matthew Knepley wrote: On Fri, Feb 23, 2018 at 1:33 AM, Danyang Su <danyang...@gmail.com> wrote: Hi All, What cell types does DMPlexCreateFromCellList support? I test this with triangle, tetrahedron and prism. B

[petsc-users] Cell type for DMPlexCreateFromCellList

2018-02-22 Thread Danyang Su
Hi All, What cell types does DMPlexCreateFromCellList support? I tested this with triangle, tetrahedron and prism cells. Both triangle and tetrahedron work, but the prism mesh throws an error saying "Cone size 6 not supported for dimension 3". Could anyone tell me all the supported cell types? Thanks,

Re: [petsc-users] Question on DMPlexCreateSection for Fortran

2018-02-22 Thread Danyang Su
RQ(ierr) Thanks, Danyang On 18-02-21 09:22 AM, Danyang Su wrote: Hi Matt, To test the Segmentation Violation problem in my code, I modified the example ex1f90.F to reproduce the problem I have in my own code. If I use DMPlexCreateBoxMesh to generate the mesh, the code works fine. H

Re: [petsc-users] Question on DMPlexCreateSection for Fortran

2018-02-21 Thread Danyang Su
lem. My code is done in a similar way to ex1f90: it reads the mesh from an external file or creates it from a cell list, distributes the mesh (these already work), and then creates sections and sets ndof on the nodes. Thanks, Danyang On 18-02-20 10:07 AM, Danyang Su wrote: On 18-02-20 09:52 AM, Matthew Knepley w

Re: [petsc-users] Question on DMPlexCreateSection for Fortran

2018-02-20 Thread Danyang Su
On 18-02-20 09:52 AM, Matthew Knepley wrote: On Tue, Feb 20, 2018 at 12:30 PM, Danyang Su <danyang...@gmail.com> wrote: Hi All, I tried to compile the DMPlexCreateSection code but got the error information shown below. Error: Symbol 'p

[petsc-users] Question on DMPlexCreateSection for Fortran

2018-02-20 Thread Danyang Su
Hi All, I tried to compile the DMPlexCreateSection code but got the error information shown below. Error: Symbol 'petsc_null_is' at (1) has no IMPLICIT type I tried to use PETSC_NULL_OBJECT instead of PETSC_NULL_IS; the code then compiles but runs into a Segmentation Violation error in

[petsc-users] how to check if cell is local owned in DMPlex

2018-02-19 Thread Danyang Su
Hi Matt, Would you please let me know how to check whether a cell is locally owned? When overlap is 0 in DMPlexDistribute, all the cells are locally owned. How about overlap > 0? It seems impossible to check by node, because a cell can be locally owned even if none of the nodes in this cell is
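
A hedged sketch of one way to answer this, assuming the Fortran binding for DMPlexGetCellNumbering is available (elsewhere in this archive it is noted to compile only with petsc-dev at the time): in the global cell numbering, cells owned by another rank appear as negative entries, encoded as -(global+1). All names are illustrative:

    call DMPlexGetCellNumbering(dm, globalCellIS, ierr)
    call ISGetIndicesF90(globalCellIS, gidx, ierr)
    do c = 1, ncells
        if (gidx(c) < 0) then
            ! ghost cell: owned by another rank
        end if
    end do
    call ISRestoreIndicesF90(globalCellIS, gidx, ierr)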

Re: [petsc-users] Error when use DMPlexGetVertexNumbering

2018-02-16 Thread Danyang Su
On 18-02-16 10:50 AM, Matthew Knepley wrote: On Fri, Feb 16, 2018 at 1:45 PM, Danyang Su <danyang...@gmail.com> wrote: Hi Matt, I try to get the global vertex index and cell index from the local mesh and run into a problem. What I need is loc

[petsc-users] Error when use DMPlexGetVertexNumbering

2018-02-16 Thread Danyang Su
Hi Matt, I try to get the global vertex index and cell index from the local mesh and run into a problem. What I need is the local-to-global index (the original index used in DMPlexCreateFromCellList is best, as the user knows exactly where the node/cell is) for vertices and cells, which will be used to

Re: [petsc-users] Question on DMPlexCreateFromCellList and DMPlexCreateFromFile

2018-02-16 Thread Danyang Su
On 18-02-16 10:13 AM, Matthew Knepley wrote: On Fri, Feb 16, 2018 at 11:36 AM, Danyang Su <danyang...@gmail.com> wrote: On 18-02-15 05:57 PM, Matthew Knepley wrote: On Thu, Feb 15, 2018 at 7:40 PM, Danyang Su <danyang...@gmail.com

Re: [petsc-users] Question on DMPlexCreateFromCellList and DMPlexCreateFromFile

2018-02-16 Thread Danyang Su
On 18-02-15 05:57 PM, Matthew Knepley wrote: On Thu, Feb 15, 2018 at 7:40 PM, Danyang Su <danyang...@gmail.com> wrote: Hi Matt, I have a question on DMPlexCreateFromCellList and DMPlexCreateFromFile. When using DMPlexCreateFromFile wi

Re: [petsc-users] Is OpenMP still available for PETSc?

2017-06-30 Thread Danyang Su
PM, Danyang Su <danyang...@gmail.com> wrote: Dear All, I recalled there was OpenMP available for PETSc for the old development version. When google "petsc hybrid mpi openmp", there returned some papers about this feature. My code was first parallelized using OpenMP and then r

Re: [petsc-users] PCFactorSetShiftType does not work in code but -pc_factor_set_shift_type works

2017-05-25 Thread Danyang Su
Remove your option '-vecload_block_size 10'. Hong On Wed, May 24, 2017 at 3:06 PM, Danyang Su <danyang...@gmail.com> wrote: Dear Hong, I just tested with different numbers of processors for the same matrix. It sometimes

[petsc-users] PCFactorSetShiftType does not work in code but -pc_factor_set_shift_type works

2017-05-25 Thread Danyang Su
ove your option '-vecload_block_size 10'. Hong On Wed, May 24, 2017 at 3:06 PM, Danyang Su <danyang...@gmail.com> wrote: Dear Hong, I just tested with different numbers of processors for the same matrix. It sometimes got "ERROR: Argumen
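
A hedged sketch of setting the shift programmatically; the usual pitfalls behind 'works on the command line but not in code' are grabbing a different PC object than the one that factors the matrix, or calling this after the factorization has already been set up. Names are illustrative:

    call KSPGetPC(ksp, pc, ierr)
    call PCFactorSetShiftType(pc, MAT_SHIFT_NONZERO, ierr)
    call PCFactorSetShiftAmount(pc, PETSC_DECIDE, ierr)
    ! must precede KSPSetUp / the first KSPSolve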

Re: [petsc-users] Question on incomplete factorization level and fill

2017-05-24 Thread Danyang Su
Hi All, I just deleted the .info file and it works without problem now. Thanks, Danyang On 17-05-24 06:32 PM, Hong wrote: Remove your option '-vecload_block_size 10'. Hong On Wed, May 24, 2017 at 3:06 PM, Danyang Su <danyang...@gmail.com> wrote

Re: [petsc-users] Question on incomplete factorization level and fill

2017-05-24 Thread Danyang Su
16 or 48 processors. The error information is attached. I tested this on my local computer with 6 cores 12 threads. Any suggestion on this? Thanks, Danyang On 17-05-24 12:28 PM, Danyang Su wrote: Hi Hong, Awesome. Thanks for testing the case. I will try your options for the code and g

Re: [petsc-users] Question on incomplete factorization level and fill

2017-05-24 Thread Danyang Su
. Thanks, Danyang On 17-05-24 11:12 AM, Matthew Knepley wrote: On Wed, May 24, 2017 at 12:50 PM, Danyang Su <danyang...@gmail.com> wrote: Hi Matthew and Barry, Thanks for the quick response. I also tried super

Re: [petsc-users] Question on incomplete factorization level and fill

2017-05-24 Thread Danyang Su
with pretty good speedup. And I am not sure if I am missing something for this problem. Thanks, Danyang On 17-05-24 11:12 AM, Matthew Knepley wrote: On Wed, May 24, 2017 at 12:50 PM, Danyang Su <danyang...@gmail.com> wrote: Hi Matthew and Barry,

Re: [petsc-users] Question on incomplete factorization level and fill

2017-05-24 Thread Danyang Su
the factorization level through hypre? Thanks, Danyang On 17-05-24 04:59 AM, Matthew Knepley wrote: On Wed, May 24, 2017 at 2:21 AM, Danyang Su <danyang...@gmail.com> wrote: Dear All, I use PCFactorSetLevels for ILU and PCFactorSetFi

[petsc-users] Question on incomplete factorization level and fill

2017-05-24 Thread Danyang Su
Dear All, I use PCFactorSetLevels for ILU and PCFactorSetFill for other preconditioners in my code to help solve problems that are hard to solve with the default options. However, I found that the latter, PCFactorSetFill, does not take effect for my problem. The matrices and rhs as well as the
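
For reference, a sketch of the two calls with illustrative values; note that PCFactorSetFill is only a memory-estimate hint (the expected ratio of factor nonzeros to matrix nonzeros), not a cap on fill, which can make it look like it 'does not take effect':

    call KSPGetPC(ksp, pc, ierr)
    call PCSetType(pc, PCILU, ierr)
    call PCFactorSetLevels(pc, 2, ierr)     ! ILU(2)
    call PCFactorSetFill(pc, 5.0d0, ierr)   ! expected fill ratio (hint)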

Re: [petsc-users] SuperLU convergence problem (More test)

2015-12-08 Thread Danyang Su
Hi Hong, Sorry to bother you again. The modified code works much better than before using both superlu and mumps. However, it still encounters failures. The case is similar to the previous one: ill-conditioned matrices. The code crashed after a long simulation if I use superlu_dist,

Re: [petsc-users] SuperLU convergence problem (More test)

2015-12-08 Thread Danyang Su
Hi Hong, Thanks for checking this. A mechanical model was added at the time the solver failed, causing some problems. We need to improve this part of the code. Thanks again and best wishes, Danyang On 15-12-08 08:10 PM, Hong wrote: Danyang : Your matrices are ill-conditioned,

Re: [petsc-users] SuperLU convergence problem (More test)

2015-12-07 Thread Danyang Su
Hello Hong, Thanks for the quick reply. The option "-mat_superlu_dist_fact SamePattern" works like a charm if I use it from the command line. How can I add this option as the default? I tried using PetscOptionsInsertString("-mat_superlu_dist_fact SamePattern",ierr) in my code

Re: [petsc-users] SuperLU convergence problem (More test)

2015-12-07 Thread Danyang Su
Thanks. The inserted option works now. I didn't put PetscOptionsInsertString in the right place before. Danyang On 15-12-07 12:01 PM, Hong wrote: Danyang: Add 'call MatSetFromOptions(A,ierr)' to your code. Attached below is ex52f.F modified from your ex52f.F to be compatible with petsc-dev.
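
The working order of calls, sketched with the current PetscOptionsInsertString calling sequence (older releases omit the leading options-database argument): the string must be inserted before the object consumes the options database.

    call PetscOptionsInsertString(PETSC_NULL_OPTIONS, &
         '-mat_superlu_dist_fact SamePattern', ierr)
    ! ... later, during matrix setup, so the option is picked up ...
    call MatSetFromOptions(A, ierr)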

Re: [petsc-users] SuperLU convergence problem (More test)

2015-12-05 Thread Danyang Su
Hi Hong, I did more tests today and finally found that the solution accuracy depends on the quality of the initial (first) matrix. I modified ex52f.F to do the test. There are 6 matrices and right-hand-side vectors, all from my reactive transport simulation. Results

Re: [petsc-users] SuperLU convergence problem

2015-12-03 Thread Danyang Su
Hi Hong, I just checked these matrices and rhs using ex10, and they all work fine. I found something wrong in my code when using a direct solver. The second parameter mat in PCFactorGetMatrix(PC pc,Mat *mat) is not initialized in my code for SUPERLU or MUMPS. I will fix this bug, rerun

Re: [petsc-users] SuperLU convergence problem

2015-12-03 Thread Danyang Su
Hi Hong, The binary format of matrix, rhs and solution can be downloaded via the link below. https://www.dropbox.com/s/cl3gfi0s0kjlktf/matrix_and_rhs_bin.tar.gz?dl=0 Thanks, Danyang On 15-12-03 10:50 AM, Hong wrote: Danyang: To my surprising, solutions from SuperLU at timestep 29

Re: [petsc-users] Error reported by MUMPS in numerical factorization phase

2015-12-02 Thread Danyang Su
Hi Hong, It's not easy to run in debugging mode as the cluster does not have PETSc installed in debug mode. Restarting the case from the crashing time does not reproduce the problem. So if I want to detect this error, I need to start the simulation from the beginning, which takes hours on the cluster.

Re: [petsc-users] Error reported by MUMPS in numerical factorization phase

2015-12-02 Thread Danyang Su
Hi Hong, Thanks. I can test it, but it may take some time to install petsc-dev on the cluster. I will try more cases to see if I can reproduce this error on my local machine, which is much more convenient for testing in debug mode. So far, the error does not occur on my local machine using the

[petsc-users] Error reported by MUMPS in numerical factorization phase

2015-12-01 Thread Danyang Su
Hi All, My code fails due to an error in an external library. It works fine for the previous 2000+ timesteps but then crashes. [4]PETSC ERROR: Error in external library [4]PETSC ERROR: Error reported by MUMPS in numerical factorization phase: INFO(1)=-1, INFO(2)=0 The full error message is

Re: [petsc-users] Error after updating to 3.6.0: finclude/petscsys.h: No such file or directory

2015-06-14 Thread Danyang Su
of PETSc and for packaging systems On 15-06-14 09:15 PM, Danyang Su wrote: Hi PETSc Users, I get problems compiling my code after updating PETSc to 3.6.0. The code works fine using PETSc 3.5.3 and PETSc-dev. I have modified the include lines in the makefile from #PETSc variables for V3.5.3
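
A hedged before/after of the makefile include lines this thread is about; the 3.5.3 paths are reconstructed from memory of the 3.6 directory reorganization:

    # PETSc 3.5.3 and earlier
    include ${PETSC_DIR}/conf/variables
    include ${PETSC_DIR}/conf/rules
    # PETSc 3.6.0 and later
    include ${PETSC_DIR}/lib/petsc/conf/variables
    include ${PETSC_DIR}/lib/petsc/conf/rules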

Re: [petsc-users] Is matrix analysis available in PETSc or external package?

2015-05-12 Thread Danyang Su
On 15-05-11 07:19 PM, Hong wrote: Danyang: I recently have some time-dependent cases that have difficulty converging. They need a lot of linear iterations at specific times, e.g., more than 100 linear iterations for every Newton iteration. In the PETSc parallel version,

Re: [petsc-users] Is matrix analysis available in PETSc or external package?

2015-05-12 Thread Danyang Su
On 15-05-12 11:13 AM, Barry Smith wrote: On May 11, 2015, at 7:10 PM, Danyang Su <danyang...@gmail.com> wrote: Hi All, I recently have some time-dependent cases that have difficulty converging. They need a lot of linear iterations at specific times, e.g., more than 100 linear

[petsc-users] Is matrix analysis available in PETSc or external package?

2015-05-11 Thread Danyang Su
Hi All, I recently have some time-dependent cases that have difficulty converging. They need a lot of linear iterations at specific times, e.g., more than 100 linear iterations for every Newton iteration. In the PETSc parallel version, this number will double or more. Our case
