Re: [petsc-users] dmplex normal vector incorrect for periodic gmsh grids

2022-12-13 Thread Praveen C
Thank you, this MR works if I generate the mesh within the code.

But using a mesh made in gmsh, I see the same issue.

Thanks
praveen

> On 14-Dec-2022, at 12:51 AM, Matthew Knepley  wrote:
> 
> On Tue, Dec 13, 2022 at 10:57 AM Matthew Knepley wrote:
>> On Tue, Dec 13, 2022 at 6:11 AM Praveen C wrote:
>>> Hello
>>> 
>>> In the attached test, I read a small grid made in gmsh with periodic bc.
>>> 
>>> This is a 2d mesh.
>>> 
>>> The cell numbers are shown in the figure.
>>> 
>>> All faces have length = 2.5
>>> 
>>> But using PetscFVFaceGeom I am getting length of 7.5 for some faces. E.g.,
>>> 
>>> face: 59, centroid = 3.75, 2.50, normal = 0.00, -7.50
>>> ===> Face length incorrect = 7.50, should be 2.5
>>> support[0] = 11, cent = 8.75, 3.75, area = 6.25
>>> support[1] = 15, cent = 8.75, 1.25, area = 6.25
>>> 
>>> There are also errors in the orientation of normal.
>>> 
>>> If we disable periodicity in geo file, this error goes away.
>> 
>> Yes, by default we only localize coordinates for cells. I can put in code to 
>> localize faces.
> 
> Okay, I now have a MR for this: 
> https://gitlab.com/petsc/petsc/-/merge_requests/5917
> 
> I am attaching your code, slightly modified. You can run
> 
>   ./dmplex -malloc_debug 0 -dm_plex_box_upper 10,10 -dm_plex_box_faces 4,4 
> -dm_plex_simplex 0 -dm_view ::ascii_info_detail -draw_pause 3 -dm_plex_box_bd 
> periodic,periodic -dm_localize_height 0
> 
> which shows incorrect edges and
> 
> ./dmplex -malloc_debug 0 -dm_plex_box_upper 10,10 -dm_plex_box_faces 4,4 
> -dm_plex_simplex 0 -dm_view ::ascii_info_detail -draw_pause 3 -dm_plex_box_bd 
> periodic,periodic -dm_localize_height 1
> 
> which is correct. If you want to control things yourself, instead of using 
> the option you can call DMPlexSetMaxProjectionHeight() on the coordinate DM 
> yourself.
> 
>   Thanks,
> 
>  Matt
>  
>>   Thanks,
>> 
>> Matt
>>  
>>> Thanks
>>> praveen



Re: [petsc-users] GAMG and linearized elasticity

2022-12-13 Thread Jed Brown
Do you have slip/symmetry boundary conditions, where some components are 
constrained? In that case, there is no uniform block size and I think you'll 
need DMPlexCreateRigidBody() and MatSetNearNullSpace().
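
A minimal sketch of that setup (assuming the assembled operator is A and the displacement is field 0 of the DM):

  MatNullSpace nullsp;
  PetscCall(DMPlexCreateRigidBody(dm, 0, &nullsp));
  PetscCall(MatSetNearNullSpace(A, nullsp));   /* GAMG uses this when building the coarse space */
  PetscCall(MatNullSpaceDestroy(&nullsp));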

The PCSetCoordinates() code won't work for non-constant block size.

-pc_type gamg should work okay out of the box for elasticity. For hypre, I've 
had good luck with this options suite, which also runs on GPU.

-pc_type hypre -pc_hypre_boomeramg_coarsen_type pmis 
-pc_hypre_boomeramg_interp_type ext+i -pc_hypre_boomeramg_no_CF 
-pc_hypre_boomeramg_P_max 6 -pc_hypre_boomeramg_relax_type_down Chebyshev 
-pc_hypre_boomeramg_relax_type_up Chebyshev 
-pc_hypre_boomeramg_strong_threshold 0.5

Blaise Bourdin  writes:

> Hi,
>
> I am getting close to finishing porting a code from petsc 3.3 / sieve to main / 
> dmplex, but am
> now encountering difficulties.
> I am reasonably sure that the Jacobian and residual are correct. The codes 
> handle boundary
> conditions differently (MatZeroRowsColumns vs dmplex constraints) so it is not 
> trivial to compare
> them. Running with -snes_type ksponly -pc_type jacobi or hypre gives me the 
> same results in
> roughly the same number of iterations.
>
> In my old code, gamg would work out of the box. When using petsc-main, 
> -pc_type gamg -
> pc_gamg_type agg works for _some_ problems using P1-Lagrange elements, but 
> never for
> P2-Lagrange. The typical error message is in gamg_agg.txt
>
> When using -pc_type classical, a problem where the KSP would converge in 47 
> iterations in
> 3.3 now takes 1400.  ksp_view_3.3.txt and ksp_view_main.txt show the output 
> of -ksp_view
> for both versions. I don’t notice anything obvious.
>
> Strangely, removing the call to PCSetCoordinates does not have any impact on 
> the
> convergence.
>
> I am sure that I am missing something, or not passing the right options. 
> What’s a good
> starting point for 3D elasticity?
> Regards,
> Blaise
>
> — 
> Canada Research Chair in Mathematical and Computational Aspects of Solid 
> Mechanics
> (Tier 1)
> Professor, Department of Mathematics & Statistics
> Hamilton Hall room 409A, McMaster University
> 1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada 
> https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243
> [0]PETSC ERROR: - Error Message 
> --
> [0]PETSC ERROR: Petsc has generated inconsistent data
> [0]PETSC ERROR: Computed maximum singular value as zero
> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be 
> the program crashed before they were used or a spelling mistake, etc!
> [0]PETSC ERROR: Option left: name:-displacement_ksp_converged_reason value: 
> ascii source: file
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.18.2-341-g16200351da0  GIT 
> Date: 2022-12-12 23:42:20 +
> [0]PETSC ERROR: 
> /home/bourdinb/Development/mef90/mef90-dmplex/bbserv-gcc11.2.1-mvapich2-2.3.7-O/bin/ThermoElasticity
>  on a bbserv-gcc11.2.1-mvapich2-2.3.7-O named bb01 by bourdinb Tue Dec 13 
> 17:02:19 2022
> [0]PETSC ERROR: Configure options --CFLAGS=-Wunused 
> --FFLAGS="-ffree-line-length-none -fallow-argument-mismatch -Wunused" 
> --COPTFLAGS="-O2 -march=znver2" --CXXOPTFLAGS="-O2 -march=znver2" 
> --FOPTFLAGS="-O2 -march=znver2" --download-chaco=1 --download-exodusii=1 
> --download-fblaslapack=1 --download-hdf5=1 --download-hypre=1 
> --download-metis=1 --download-ml=1 --download-mumps=1 --download-netcdf=1 
> --download-p4est=1 --download-parmetis=1 --download-pnetcdf=1 
> --download-scalapack=1 --download-sowing=1 
> --download-sowing-cc=/opt/rh/devtoolset-9/root/usr/bin/gcc 
> --download-sowing-cxx=/opt/rh/devtoolset-9/root/usr/bin/g++ 
> --download-sowing-cpp=/opt/rh/devtoolset-9/root/usr/bin/cpp 
> --download-sowing-cxxcpp=/opt/rh/devtoolset-9/root/usr/bin/cpp 
> --download-superlu=1 --download-triangle=1 --download-yaml=1 
> --download-zlib=1 --with-debugging=0 
> --with-mpi-dir=/opt/HPC/mvapich2/2.3.7-gcc11.2.1 --with-pic 
> --with-shared-libraries=1 --with-mpiexec=srun --with-x11=0
> [0]PETSC ERROR: #1 PCGAMGOptProlongator_AGG() at 
> /1/HPC/petsc/main/src/ksp/pc/impls/gamg/agg.c:779
> [0]PETSC ERROR: #2 PCSetUp_GAMG() at 
> /1/HPC/petsc/main/src/ksp/pc/impls/gamg/gamg.c:639
> [0]PETSC ERROR: #3 PCSetUp() at 
> /1/HPC/petsc/main/src/ksp/pc/interface/precon.c:994
> [0]PETSC ERROR: #4 KSPSetUp() at 
> /1/HPC/petsc/main/src/ksp/ksp/interface/itfunc.c:405
> [0]PETSC ERROR: #5 KSPSolve_Private() at 
> /1/HPC/petsc/main/src/ksp/ksp/interface/itfunc.c:824
> [0]PETSC ERROR: #6 KSPSolve() at 
> /1/HPC/petsc/main/src/ksp/ksp/interface/itfunc.c:1070
> [0]PETSC ERROR: #7 SNESSolve_KSPONLY() at 
> /1/HPC/petsc/main/src/snes/impls/ksponly/ksponly.c:48
> [0]PETSC ERROR: #8 SNESSolve() at 
> /1/HPC/petsc/main/src/snes/interface/snes.c:4693
> [0]PETSC ERROR: #9 
> 

[petsc-users] GAMG and linearized elasticity

2022-12-13 Thread Blaise Bourdin




Hi,

I am getting close to finishing porting a code from petsc 3.3 / sieve to main / dmplex, but am now encountering difficulties.

I am reasonably sure that the Jacobian and residual are correct. The codes handle boundary conditions differently (MatZeroRowsColumns vs dmplex constraints) so it is not trivial to compare them. Running with -snes_type ksponly -pc_type jacobi or hypre gives me the same results in roughly the same number of iterations.

In my old code, gamg would work out of the box. When using petsc-main, -pc_type gamg -pc_gamg_type agg works for _some_ problems using P1-Lagrange elements, but never for P2-Lagrange. The typical error message is in gamg_agg.txt

When using -pc_type classical, a problem where the KSP would converge in 47 iterations in 3.3 now takes 1400.  ksp_view_3.3.txt and ksp_view_main.txt show the output of -ksp_view for both versions. I don’t notice anything obvious.

Strangely, removing the call to PCSetCoordinates does not have any impact on the convergence.

I am sure that I am missing something, or not passing the right options. What’s a good starting point for 3D elasticity?
Regards,
Blaise






— 
Canada Research Chair in Mathematical and Computational Aspects of Solid Mechanics (Tier 1)
Professor, Department of Mathematics & Statistics
Hamilton Hall room 409A, McMaster University
1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada 
https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243





[0]PETSC ERROR: - Error Message 
--
[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Computed maximum singular value as zero
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be 
the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-displacement_ksp_converged_reason value: 
ascii source: file
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.18.2-341-g16200351da0  GIT 
Date: 2022-12-12 23:42:20 +
[0]PETSC ERROR: 
/home/bourdinb/Development/mef90/mef90-dmplex/bbserv-gcc11.2.1-mvapich2-2.3.7-O/bin/ThermoElasticity
 on a bbserv-gcc11.2.1-mvapich2-2.3.7-O named bb01 by bourdinb Tue Dec 13 
17:02:19 2022
[0]PETSC ERROR: Configure options --CFLAGS=-Wunused 
--FFLAGS="-ffree-line-length-none -fallow-argument-mismatch -Wunused" 
--COPTFLAGS="-O2 -march=znver2" --CXXOPTFLAGS="-O2 -march=znver2" 
--FOPTFLAGS="-O2 -march=znver2" --download-chaco=1 --download-exodusii=1 
--download-fblaslapack=1 --download-hdf5=1 --download-hypre=1 
--download-metis=1 --download-ml=1 --download-mumps=1 --download-netcdf=1 
--download-p4est=1 --download-parmetis=1 --download-pnetcdf=1 
--download-scalapack=1 --download-sowing=1 
--download-sowing-cc=/opt/rh/devtoolset-9/root/usr/bin/gcc 
--download-sowing-cxx=/opt/rh/devtoolset-9/root/usr/bin/g++ 
--download-sowing-cpp=/opt/rh/devtoolset-9/root/usr/bin/cpp 
--download-sowing-cxxcpp=/opt/rh/devtoolset-9/root/usr/bin/cpp 
--download-superlu=1 --download-triangle=1 --download-yaml=1 --download-zlib=1 
--with-debugging=0 --with-mpi-dir=/opt/HPC/mvapich2/2.3.7-gcc11.2.1 --with-pic 
--with-shared-libraries=1 --with-mpiexec=srun --with-x11=0
[0]PETSC ERROR: #1 PCGAMGOptProlongator_AGG() at 
/1/HPC/petsc/main/src/ksp/pc/impls/gamg/agg.c:779
[0]PETSC ERROR: #2 PCSetUp_GAMG() at 
/1/HPC/petsc/main/src/ksp/pc/impls/gamg/gamg.c:639
[0]PETSC ERROR: #3 PCSetUp() at 
/1/HPC/petsc/main/src/ksp/pc/interface/precon.c:994
[0]PETSC ERROR: #4 KSPSetUp() at 
/1/HPC/petsc/main/src/ksp/ksp/interface/itfunc.c:405
[0]PETSC ERROR: #5 KSPSolve_Private() at 
/1/HPC/petsc/main/src/ksp/ksp/interface/itfunc.c:824
[0]PETSC ERROR: #6 KSPSolve() at 
/1/HPC/petsc/main/src/ksp/ksp/interface/itfunc.c:1070
[0]PETSC ERROR: #7 SNESSolve_KSPONLY() at 
/1/HPC/petsc/main/src/snes/impls/ksponly/ksponly.c:48
[0]PETSC ERROR: #8 SNESSolve() at 
/1/HPC/petsc/main/src/snes/interface/snes.c:4693
[0]PETSC ERROR: #9 
/home/bourdinb/Development/mef90/mef90-dmplex/ThermoElasticity/ThermoElasticity.F90:228
  Linear solve converged due to CONVERGED_RTOL iterations 46
KSP Object:(Disp_) 32 MPI processes
  type: cg
  maximum iterations=1
  tolerances:  relative=1e-05, absolute=1e-08, divergence=1e+10
  left preconditioning
  using nonzero initial guess
  using PRECONDITIONED norm type for convergence test
PC Object:(Disp_) 32 MPI processes
  type: gamg
MG: type is MULTIPLICATIVE, levels=4 cycles=v
  Cycles per PCApply=1
  Using Galerkin computed coarse grid matrices
  Coarse grid solver -- level ---
KSP Object:(Disp_mg_coarse_) 32 MPI processes
  type: gmres
GMRES: restart=30, using Classical (unmodified) Gram-Schmidt 
Orthogonalization with no iterative refinement
GMRES: happy breakdown tolerance 1e-30
  maximum iterations=1, initial 

Re: [petsc-users] dmplex normal vector incorrect for periodic gmsh grids

2022-12-13 Thread Matthew Knepley
On Tue, Dec 13, 2022 at 10:57 AM Matthew Knepley  wrote:

> On Tue, Dec 13, 2022 at 6:11 AM Praveen C  wrote:
>
>> Hello
>>
>> In the attached test, I read a small grid made in gmsh with periodic bc.
>>
>> This is a 2d mesh.
>>
>> The cell numbers are shown in the figure.
>>
>> All faces have length = 2.5
>>
>> But using PetscFVFaceGeom I am getting length of 7.5 for some faces.
>> E.g.,
>>
>> face: 59, centroid = 3.75, 2.50, normal = 0.00, -7.50
>>
>> ===> Face length incorrect = 7.50, should be 2.5
>>
>> support[0] = 11, cent = 8.75, 3.75, area = 6.25
>>
>> support[1] = 15, cent = 8.75, 1.25, area = 6.25
>>
>>
>> There are also errors in the orientation of normal.
>>
>> If we disable periodicity in geo file, this error goes away.
>>
>
> Yes, by default we only localize coordinates for cells. I can put in code
> to localize faces.
>

Okay, I now have a MR for this:
https://gitlab.com/petsc/petsc/-/merge_requests/5917

I am attaching your code, slightly modified. You can run

  ./dmplex -malloc_debug 0 -dm_plex_box_upper 10,10 -dm_plex_box_faces 4,4
-dm_plex_simplex 0 -dm_view ::ascii_info_detail -draw_pause 3
-dm_plex_box_bd periodic,periodic -dm_localize_height 0

which shows incorrect edges and

./dmplex -malloc_debug 0 -dm_plex_box_upper 10,10 -dm_plex_box_faces 4,4
-dm_plex_simplex 0 -dm_view ::ascii_info_detail -draw_pause 3
-dm_plex_box_bd periodic,periodic -dm_localize_height 1

which is correct. If you want to control things yourself, instead of using
the option you can call DMPlexSetMaxProjectionHeight() on the coordinate DM
yourself.
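
For illustration, a minimal sketch of that manual route (assuming the height is set before the coordinates are localized, and PetscCall-style error checking) might look like:

  DM cdm;
  PetscCall(DMGetCoordinateDM(dm, &cdm));
  PetscCall(DMPlexSetMaxProjectionHeight(cdm, 1)); /* 0: localize cells only, 1: also faces */
  PetscCall(DMLocalizeCoordinates(dm));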

  Thanks,

 Matt


>   Thanks,
>
> Matt
>
>
>> Thanks
>> praveen
>>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


dmplex.c
Description: Binary data


Re: [petsc-users] Saving solution with monitor function

2022-12-13 Thread Guglielmo, Tyler Hardy via petsc-users
Thanks guys,

Yes, this looks like exactly what I need.  Appreciate everyone’s help!

Best,
Tyler

From: Barry Smith 
Date: Tuesday, December 13, 2022 at 9:56 AM
To: hongzh...@anl.gov 
Cc: Guglielmo, Tyler Hardy , PETSc users list 

Subject: Re: [petsc-users] Saving solution with monitor function


  It is also possible to read the solutions back from the trajectory object 
from your running code. It is not just for saving to files.



On Dec 13, 2022, at 12:51 PM, Zhang, Hong via petsc-users 
 wrote:

Tyler,

The quickest solution is to use TSTrajectory as Matt mentioned. You can add the 
following command line options to save the solution into a binary file under a 
folder at each time step.

-ts_save_trajectory -ts_trajectory_type visualization

The folder name and the file name can be customized with -ts_trajectory_dirname 
and -ts_trajectory_file_template.

If you want to load these files into Matlab, you can use some scripts in 
share/petsc/matlab/ such as PetscReadBinaryTrajectory.m and PetscBinaryRead.m.

The python versions of these scripts are available in lib/petsc/bin/.

Hong(Mr.)


On Dec 13, 2022, at 12:14 AM, Guglielmo, Tyler Hardy via petsc-users 
<petsc-users@mcs.anl.gov> wrote:

Hi all,

I am a new PETSc user (and new to MPI in general), and was wondering if someone 
could help me out with what I am sure is a basic question (if this is not the 
appropriate email list or there is a better place please let me know!).

Basically, I am writing a code that requires a solution to an ODE that will be 
used later on during runtime.  I have written the basic ODE solver using TSRK, 
however I haven’t thought of a good way to store the actual solution at all 
time steps throughout the time evolution.  I would like to avoid writing each 
time step to a file through the monitor function, and instead just plug each 
time step into an array.

How is this usually done?  I suppose the user defined struct that gets passed 
into the monitor function could contain a pointer to an array in main?  This is 
how I would do this if the program wasn’t of the MPI variety, but I am not sure 
how to properly declare a pointer to an array declared as Vec and built through 
the usual PETSc process.  Any tips are greatly appreciated!

Thanks for your time,
Tyler

+
Tyler Guglielmo
Postdoctoral Researcher
Lawrence Livermore National Lab
Office: 925-423-6186
Cell: 210-480-8000
+




Re: [petsc-users] Saving solution with monitor function

2022-12-13 Thread Barry Smith


  It is also possible to read the solutions back from the trajectory object 
from your running code. It is not just for saving to files.
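
As a rough sketch (assuming a TS named ts, a saved step index step, and a work vector U with the same layout as the solution), reading a stored step back might look like:

  TSTrajectory tj;
  PetscReal    t;
  PetscCall(TSSetSaveTrajectory(ts));                         /* call before TSSolve() */
  /* ... TSSolve(ts, u) ... */
  PetscCall(TSGetTrajectory(ts, &tj));
  PetscCall(TSTrajectoryGetVecs(tj, ts, step, &t, U, NULL));  /* solution at that step */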


> On Dec 13, 2022, at 12:51 PM, Zhang, Hong via petsc-users 
>  wrote:
> 
> Tyler,
> 
> The quickest solution is to use TSTrajectory as Matt mentioned. You can add 
> the following command line options to save the solution into a binary file 
> under a folder at each time step.
> 
> -ts_save_trajectory -ts_trajectory_type visualization 
> 
> The folder name and the file name can be customized with 
> -ts_trajectory_dirname and -ts_trajectory_file_template.
> 
> If you want to load these files into Matlab, you can use some scripts in 
> share/petsc/matlab/ such as PetscReadBinaryTrajectory.m and PetscBinaryRead.m.
> 
> The python versions of these scripts are available in lib/petsc/bin/.
> 
> Hong(Mr.)
> 
>> On Dec 13, 2022, at 12:14 AM, Guglielmo, Tyler Hardy via petsc-users 
>> <petsc-users@mcs.anl.gov> wrote:
>> 
>> Hi all,
>>  
>> I am a new PETSc user (and new to MPI in general), and was wondering if 
>> someone could help me out with what I am sure is a basic question (if this 
>> is not the appropriate email list or there is a better place please let me 
>> know!).
>>  
>> Basically, I am writing a code that requires a solution to an ODE that will 
>> be used later on during runtime.  I have written the basic ODE solver using 
>> TSRK, however I haven’t thought of a good way to store the actual solution 
>> at all time steps throughout the time evolution.  I would like to avoid 
>> writing each time step to a file through the monitor function, and instead 
>> just plug each time step into an array.
>>  
>> How is this usually done?  I suppose the user defined struct that gets 
>> passed into the monitor function could contain a pointer to an array in 
>> main?  This is how I would do this if the program wasn’t of the MPI variety, 
>> but I am not sure how to properly declare a pointer to an array declared as 
>> Vec and built through the usual PETSc process.  Any tips are greatly 
>> appreciated!
>>  
>> Thanks for your time,
>> Tyler
>>  
>> +
>> Tyler Guglielmo
>> Postdoctoral Researcher
>> Lawrence Livermore National Lab
>> Office: 925-423-6186
>> Cell: 210-480-8000
>> +
> 



Re: [petsc-users] Saving solution with monitor function

2022-12-13 Thread Zhang, Hong via petsc-users
Tyler,

The quickest solution is to use TSTrajectory as Matt mentioned. You can add the 
following command line options to save the solution into a binary file under a 
folder at each time step.

-ts_save_trajectory -ts_trajectory_type visualization

The folder name and the file name can be customized with -ts_trajectory_dirname 
and -ts_trajectory_file_template.

If you want to load these files into Matlab, you can use some scripts in 
share/petsc/matlab/ such as PetscReadBinaryTrajectory.m and PetscBinaryRead.m.

The python versions of these scripts are available in lib/petsc/bin/.

Hong(Mr.)

On Dec 13, 2022, at 12:14 AM, Guglielmo, Tyler Hardy via petsc-users 
<petsc-users@mcs.anl.gov> wrote:

Hi all,

I am a new PETSc user (and new to MPI in general), and was wondering if someone 
could help me out with what I am sure is a basic question (if this is not the 
appropriate email list or there is a better place please let me know!).

Basically, I am writing a code that requires a solution to an ODE that will be 
used later on during runtime.  I have written the basic ODE solver using TSRK, 
however I haven’t thought of a good way to store the actual solution at all 
time steps throughout the time evolution.  I would like to avoid writing each 
time step to a file through the monitor function, and instead just plug each 
time step into an array.

How is this usually done?  I suppose the user defined struct that gets passed 
into the monitor function could contain a pointer to an array in main?  This is 
how I would do this if the program wasn’t of the MPI variety, but I am not sure 
how to properly declare a pointer to an array declared as Vec and built through 
the usual PETSc process.  Any tips are greatly appreciated!

Thanks for your time,
Tyler

+
Tyler Guglielmo
Postdoctoral Researcher
Lawrence Livermore National Lab
Office: 925-423-6186
Cell: 210-480-8000
+



Re: [petsc-users] dmplex normal vector incorrect for periodic gmsh grids

2022-12-13 Thread Matthew Knepley
On Tue, Dec 13, 2022 at 6:11 AM Praveen C  wrote:

> Hello
>
> In the attached test, I read a small grid made in gmsh with periodic bc.
>
> This is a 2d mesh.
>
> The cell numbers are shown in the figure.
>
> All faces have length = 2.5
>
> But using PetscFVFaceGeom I am getting length of 7.5 for some faces. E.g.,
>
> face: 59, centroid = 3.75, 2.50, normal = 0.00, -7.50
>
> ===> Face length incorrect = 7.50, should be 2.5
>
> support[0] = 11, cent = 8.75, 3.75, area = 6.25
>
> support[1] = 15, cent = 8.75, 1.25, area = 6.25
>
>
> There are also errors in the orientation of normal.
>
> If we disable periodicity in geo file, this error goes away.
>

Yes, by default we only localize coordinates for cells. I can put in code
to localize faces.

  Thanks,

Matt


> Thanks
> praveen
>
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] parallelize matrix assembly process

2022-12-13 Thread Barry Smith

"MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 73239"

The preallocation is VERY wrong. This is why the computation is so slow; this 
number should be zero. 
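
For reference, a rough sketch of the preallocation calls that make that count zero (assuming d_nnz/o_nnz hold the per-row counts of diagonal- and off-diagonal-block nonzeros for the locally owned rows of the matrix A):

  PetscCall(MatSeqAIJSetPreallocation(A, 0, d_nnz));            /* takes effect on 1 rank   */
  PetscCall(MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz));  /* takes effect on >1 ranks */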



> On Dec 12, 2022, at 10:20 PM, 김성익  wrote:
> 
> Following your comments, 
> I checked by using '-info'.
> 
> As you suspected, most elements being computed on wrong MPI rank.
> Also, there are a lot of stashed entries.
> 
> 
> 
> Should I divide the domain from the problem define stage?
> Or is a proper preallocation sufficient?
> 
> 
> 
> [0]  PetscCommDuplicate(): Duplicating a communicator 139687279637472 
> 94370404729840 max tags = 2147483647
> 
> [1]  PetscCommDuplicate(): Duplicating a communicator 139620736898016 
> 94891084133376 max tags = 2147483647
> 
> [0]  MatSetUp(): Warning not preallocating matrix storage
> 
> [1]  PetscCommDuplicate(): Duplicating a communicator 139620736897504 
> 94891083133744 max tags = 2147483647
> 
> [0]  PetscCommDuplicate(): Duplicating a communicator 139687279636960 
> 94370403730224 max tags = 2147483647
> 
> [1]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139620736897504 94891083133744
> 
> [0]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139687279636960 94370403730224
> 
> [1]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139620736898016 94891084133376
> 
> [0]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139687279637472 94370404729840
> 
> [1]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139620736898016 94891084133376
> 
> [0]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139687279637472 94370404729840
> 
> [1]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139620736898016 94891084133376
> 
> [0]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139687279637472 94370404729840
> 
> [0]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139687279637472 94370404729840
> 
> [1]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139620736898016 94891084133376
> 
> [1]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139620736898016 94891084133376
> 
> [0]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139687279637472 94370404729840
> 
>  TIME0 : 0.00
> 
>  TIME0 : 0.00
> 
> [0]  VecAssemblyBegin_MPI_BTS(): Stash has 661 entries, uses 8 mallocs.
> 
> [0]  VecAssemblyBegin_MPI_BTS(): Block-Stash has 0 entries, uses 0 
> mallocs.
> 
> [0]  VecAssemblyBegin_MPI_BTS(): Stash has 661 entries, uses 5 mallocs.
> 
> [0]  VecAssemblyBegin_MPI_BTS(): Block-Stash has 0 entries, uses 0 
> mallocs.
> 
> [0]  MatAssemblyBegin_MPIAIJ(): Stash has 460416 entries, uses 5 mallocs.
> 
> [1]  MatAssemblyBegin_MPIAIJ(): Stash has 461184 entries, uses 5 mallocs.
> 
> [0]  MatAssemblyEnd_SeqAIJ(): Matrix size: 13892 X 13892; storage space: 
> 180684 unneeded,987406 used
> 
> [0]  MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 
> 73242
> 
> [0]  MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 81
> 
> [0]  MatCheckCompressedRow(): Found the ratio (num_zerorows 
> 0)/(num_localrows 13892) < 0.6. Do not use CompressedRow routines.
> 
> [0]  MatSeqAIJCheckInode(): Found 4631 nodes of 13892. Limit used: 5. 
> Using Inode routines
> 
> [1]  MatAssemblyEnd_SeqAIJ(): Matrix size: 13891 X 13891; storage space: 
> 180715 unneeded,987325 used
> 
> [1]  MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 
> 73239
> 
> [1]  MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 81
> 
> [1]  MatCheckCompressedRow(): Found the ratio (num_zerorows 
> 0)/(num_localrows 13891) < 0.6. Do not use CompressedRow routines.
> 
> [1]  MatSeqAIJCheckInode(): Found 4631 nodes of 13891. Limit used: 5. 
> Using Inode routines
> 
> [0]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139687279636960 94370403730224
> 
> [0]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139687279636960 94370403730224
> 
> [1]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139620736897504 94891083133744
> 
> [1]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139620736897504 94891083133744
> 
> [0]  MatAssemblyEnd_SeqAIJ(): Matrix size: 13892 X 1390; storage space: 
> 72491 unneeded,34049 used
> 
> [0]  MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 
> 2472
> 
> [0]  MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 40
> 
> [0]  MatCheckCompressedRow(): Found the ratio (num_zerorows 
> 12501)/(num_localrows 13892) > 0.6. Use CompressedRow routines.
> 
> Assemble Time : 174.079366sec
> 
> [0]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139687279636960 94370403730224
> 
> [0]  PetscCommDuplicate(): Using internal PETSc communicator 
> 139687279636960 94370403730224
> 
> [1]  MatAssemblyEnd_SeqAIJ(): Matrix size: 13891 X 1391; storage space: 
> 72441 unneeded,34049 used
> 
> [1]  MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 
> 2469
> 
> [1]  

Re: [petsc-users] Saving solution with monitor function

2022-12-13 Thread Matthew Knepley
On Tue, Dec 13, 2022 at 8:40 AM Guglielmo, Tyler Hardy via petsc-users <
petsc-users@mcs.anl.gov> wrote:

> Hi all,
>
>
>
> I am a new PETSc user (and new to MPI in general), and was wondering if
> someone could help me out with what I am sure is a basic question (if this
> is not the appropriate email list or there is a better place please let me
> know!).
>
>
>
> Basically, I am writing a code that requires a solution to an ODE that
> will be used later on during runtime.  I have written the basic ODE solver
> using TSRK, however I haven’t thought of a good way to store the actual
> solution at all time steps throughout the time evolution.  I would like to
> avoid writing each time step to a file through the monitor function, and
> instead just plug each time step into an array.
>
>
>
> How is this usually done?  I suppose the user defined struct that gets
> passed into the monitor function could contain a pointer to an array in
> main?  This is how I would do this if the program wasn’t of the MPI
> variety, but I am not sure how to properly declare a pointer to an array
> declared as Vec and built through the usual PETSc process.  Any tips are
> greatly appreciated
>

I think this is what TSTrajectory is for. I believe you want
https://petsc.org/main/docs/manualpages/TS/TSTRAJECTORYMEMORY/
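
A minimal sketch of selecting it from code (assuming your TS is ts and the solution vector is u):

  TSTrajectory tj;
  PetscCall(TSSetSaveTrajectory(ts));
  PetscCall(TSGetTrajectory(ts, &tj));
  PetscCall(TSTrajectorySetType(tj, ts, TSTRAJECTORYMEMORY)); /* keep every step in memory */
  PetscCall(TSSolve(ts, u));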

  Thanks,

  Matt


> Thanks for your time,
>
> Tyler
>
>
>
> +
>
> Tyler Guglielmo
>
> Postdoctoral Researcher
>
> Lawrence Livermore National Lab
>
> Office: 925-423-6186
>
> Cell: 210-480-8000
>
> +
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] Seg fault in gdb but program runs

2022-12-13 Thread Matthew Knepley
On Tue, Dec 13, 2022 at 1:52 AM Yuyun Yang 
wrote:

> Ok I’ll check that, thanks for taking a look! By the way, when I reduce
> the domain size this error doesn’t appear anymore, so I don’t know whether
> gdb just cannot handle the memory, and start to cut things off which is
> causing the seg fault.
>

It is not the size, it is your arguments. See here

  mat=@0x55927e10: 0x57791bb0, diag=5, offDiag=0)
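
If the intent is to forward per-row counts, a sketch of building them before the symbolic call (the names here are only illustrative) could be:

  PetscInt Istart, Iend, m, *d_nnz, *o_nnz;
  MatGetOwnershipRange(mat, &Istart, &Iend);
  m = Iend - Istart;
  PetscCalloc2(m, &d_nnz, m, &o_nnz);     /* zero-initialized counts, one entry per local row */
  kronConvert_symbolic(left, right, mat, d_nnz, o_nnz);
  /* ... preallocate with d_nnz/o_nnz ... */
  PetscFree2(d_nnz, o_nnz);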

  Thanks,

 Matt


>
>
> *From: *Matthew Knepley 
> *Date: *Tuesday, December 13, 2022 at 2:49 PM
> *To: *Yuyun Yang 
> *Cc: *petsc-users 
> *Subject: *Re: [petsc-users] Seg fault in gdb but program runs
>
> On Tue, Dec 13, 2022 at 1:14 AM Yuyun Yang 
> wrote:
>
> Here is the error message:
>
>
>
> Program received signal SIGSEGV, Segmentation fault.
> 0x555e73b7 in kronConvert (left=..., right=...,
> mat=@0x55927e10: 0x57791bb0, diag=5, offDiag=0)
> at /home/yuyun/scycle-2/source/spmat.cpp:265
> 265  kronConvert_symbolic(left,right,mat,d_nnz,o_nnz);
>
>
>
> d_nnz and o_nnz are pointers, and they are supposed to hold arrays of the
> number of nonzero in each row,
>
> You seem to be passing integers.
>
>
>
>   Thanks,
>
>
>
>  Matt
>
>
>
> On Tue, Dec 13, 2022 at 12:41 PM Matthew Knepley 
> wrote:
>
> On Mon, Dec 12, 2022 at 9:56 PM Yuyun Yang 
> wrote:
>
> Hello team,
>
>
>
> I’m debugging my code using gdb. The program runs just fine if I don’t
> debug it, but when I use gdb, it seg faults at a place where it never
> experienced any seg fault when I debugged it 1-2 years ago. I wonder if
> this might be caused by the PETSc version change?
>
>
>
> The only PETSc calls are the MatGetOwnershipRange() calls, which have not
> changed, so I think this is unlikely.
>
>
>
> Or something wrong with gdb itself? I’ve included the code block that is
> problematic for you to take a look at what might be wrong – seg fault
> happens when this function is called. For context, Spmat is a class of
> sparse matrices in the code:
>
>
>
> What is the debugger output?
>
>
>
>   Thanks,
>
>
>
> Matt
>
>
>
> // calculate the exact nonzero structure which results from the kronecker
> // outer product of left and right
> // d_nnz = diagonal nonzero structure, o_nnz = off-diagonal nonzero structure
> void kronConvert_symbolic(const Spmat &left, const Spmat &right, Mat &mat,
>                           PetscInt* d_nnz, PetscInt* o_nnz)
> {
>   size_t rightRowSize = right.size(1);
>   size_t rightColSize = right.size(2);
>
>   PetscInt Istart,Iend; // rows owned by current processor
>   PetscInt Jstart,Jend; // cols owned by current processor
>
>   // allocate space for mat
>   MatGetOwnershipRange(mat,&Istart,&Iend);
>   MatGetOwnershipRangeColumn(mat,&Jstart,&Jend);
>   PetscInt m = Iend - Istart;
>
>   for (int ii=0; ii<m; ii++) { d_nnz[ii] = 0; }
>   for (int ii=0; ii<m; ii++) { o_nnz[ii] = 0; }
>
>   // iterate over only nnz entries
>   Spmat::const_row_iter IiL,IiR;
>   Spmat::const_col_iter JjL,JjR;
>   double valL=0, valR=0, val=0;
>   PetscInt row,col;
>   size_t rowL,colL,rowR,colR;
>
>   // loop over all values in left
>   for (IiL=left._mat.begin(); IiL!=left._mat.end(); IiL++) {
>     for (JjL=(IiL->second).begin(); JjL!=(IiL->second).end(); JjL++) {
>       rowL = IiL->first;
>       colL = JjL->first;
>       valL = JjL->second;
>       if (valL==0) { continue; }
>
>       // loop over all values in right
>       for (IiR=right._mat.begin(); IiR!=right._mat.end(); IiR++) {
>         for (JjR=(IiR->second).begin(); JjR!=(IiR->second).end(); JjR++) {
>           rowR = IiR->first;
>           colR = JjR->first;
>           valR = JjR->second;
>
>           // the new values and coordinates for the product matrix
>           val = valL*valR;
>           row = rowL*rightRowSize + rowR;
>           col = colL*rightColSize + colR;
>
>           PetscInt ii = row - Istart; // array index for d_nnz and o_nnz
>           if (val!=0 && row >= Istart && row < Iend && col >= Jstart && col < Jend) { d_nnz[ii]++; }
>           if ( (val!=0 && row >= Istart && row < Iend) && (col < Jstart || col >= Jend) ) { o_nnz[ii]++; }
>         }
>       }
>     }
>   }
> }
>
>
>
>
>
>
>
> Thank you,
>
> Yuyun
>
>
>
>
> --
>
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
>
> https://www.cse.buffalo.edu/~knepley/
> 
>
>
>
>
> --
>
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to 

[petsc-users] Saving solution with monitor function

2022-12-13 Thread Guglielmo, Tyler Hardy via petsc-users
Hi all,

I am a new PETSc user (and new to MPI in general), and was wondering if someone 
could help me out with what I am sure is a basic question (if this is not the 
appropriate email list or there is a better place please let me know!).

Basically, I am writing a code that requires a solution to an ODE that will be 
used later on during runtime.  I have written the basic ODE solver using TSRK, 
however I haven’t thought of a good way to store the actual solution at all 
time steps throughout the time evolution.  I would like to avoid writing each 
time step to a file through the monitor function, and instead just plug each 
time step into an array.

How is this usually done?  I suppose the user defined struct that gets passed 
into the monitor function could contain a pointer to an array in main?  This is 
how I would do this if the program wasn’t of the MPI variety, but I am not sure 
how to properly declare a pointer to an array declared as Vec and built through 
the usual PETSc process.  Any tips are greatly appreciated!

Thanks for your time,
Tyler

+
Tyler Guglielmo
Postdoctoral Researcher
Lawrence Livermore National Lab
Office: 925-423-6186
Cell: 210-480-8000
+


Re: [petsc-users] Seg fault in gdb but program runs

2022-12-13 Thread Satish Balay via petsc-users
suggest running your code with valgrind

Satish

On Mon, 12 Dec 2022, Matthew Knepley wrote:

> On Mon, Dec 12, 2022 at 9:56 PM Yuyun Yang 
> wrote:
> 
> > Hello team,
> >
> >
> >
> > I’m debugging my code using gdb. The program runs just fine if I don’t
> > debug it, but when I use gdb, it seg faults at a place where it never
> > experienced any seg fault when I debugged it 1-2 years ago. I wonder if
> > this might be caused by the PETSc version change?
> >
> 
> The only PETSc calls are the MatGetOwnershipRange() calls, which have not
> changed, so I think this is unlikely.
> 
> 
> > Or something wrong with gdb itself? I’ve included the code block that is
> > problematic for you to take a look at what might be wrong – seg fault
> > happens when this function is called. For context, Spmat is a class of
> > sparse matrices in the code:
> >
> 
> What is the debugger output?
> 
>   Thanks,
> 
> Matt
> 
> 
> > // calculate the exact nonzero structure which results from the kronecker
> > // outer product of left and right
> > // d_nnz = diagonal nonzero structure, o_nnz = off-diagonal nonzero structure
> > void kronConvert_symbolic(const Spmat &left, const Spmat &right, Mat &mat,
> >                           PetscInt* d_nnz, PetscInt* o_nnz)
> > {
> >   size_t rightRowSize = right.size(1);
> >   size_t rightColSize = right.size(2);
> >
> >   PetscInt Istart,Iend; // rows owned by current processor
> >   PetscInt Jstart,Jend; // cols owned by current processor
> >
> >   // allocate space for mat
> >   MatGetOwnershipRange(mat,&Istart,&Iend);
> >   MatGetOwnershipRangeColumn(mat,&Jstart,&Jend);
> >   PetscInt m = Iend - Istart;
> >
> >   for (int ii=0; ii<m; ii++) { d_nnz[ii] = 0; }
> >   for (int ii=0; ii<m; ii++) { o_nnz[ii] = 0; }
> >
> >   // iterate over only nnz entries
> >   Spmat::const_row_iter IiL,IiR;
> >   Spmat::const_col_iter JjL,JjR;
> >   double valL=0, valR=0, val=0;
> >   PetscInt row,col;
> >   size_t rowL,colL,rowR,colR;
> >
> >   // loop over all values in left
> >   for (IiL=left._mat.begin(); IiL!=left._mat.end(); IiL++) {
> >     for (JjL=(IiL->second).begin(); JjL!=(IiL->second).end(); JjL++) {
> >       rowL = IiL->first;
> >       colL = JjL->first;
> >       valL = JjL->second;
> >       if (valL==0) { continue; }
> >
> >       // loop over all values in right
> >       for (IiR=right._mat.begin(); IiR!=right._mat.end(); IiR++) {
> >         for (JjR=(IiR->second).begin(); JjR!=(IiR->second).end(); JjR++) {
> >           rowR = IiR->first;
> >           colR = JjR->first;
> >           valR = JjR->second;
> >
> >           // the new values and coordinates for the product matrix
> >           val = valL*valR;
> >           row = rowL*rightRowSize + rowR;
> >           col = colL*rightColSize + colR;
> >
> >           PetscInt ii = row - Istart; // array index for d_nnz and o_nnz
> >           if (val!=0 && row >= Istart && row < Iend && col >= Jstart && col < Jend) { d_nnz[ii]++; }
> >           if ( (val!=0 && row >= Istart && row < Iend) && (col < Jstart || col >= Jend) ) { o_nnz[ii]++; }
> >         }
> >       }
> >     }
> >   }
> > }
> >
> >
> >
> >
> >
> >
> >
> > Thank you,
> >
> > Yuyun
> >
> 
> 
> 


Re: [petsc-users] realCoords for DOFs

2022-12-13 Thread Yann Jobic

That's a really smart trick!
Thanks for sharing it.
I had previously looked closely at DMProjectFunction without success, 
even though it holds the solution, but yours is just too good.

Yann

On 12/13/2022 at 1:52 PM, Mark Adams wrote:
You should be able to use PetscFECreateDefault instead of 
PetscFECreateLagrange here and set the "order" on the command 
line, which is recommended, but start with this and low order to debug.


Mark

static PetscErrorCode crd_func(PetscInt dim, PetscReal time, const 
PetscReal x[], PetscInt Nf_dummy, PetscScalar *u, void *actx)
{
   int i;
   PetscFunctionBeginUser;
   for (i = 0; i < dim; ++i) u[i] = x[i];
   PetscFunctionReturn(0);
}

   PetscErrorCode (*initu[1])(PetscInt, PetscReal, const PetscReal [], 
PetscInt, PetscScalar [], void *);

   PetscFE fe;
   DM      crddm;
   Vec     crd_vec;

   /* create coordinate DM */
   ierr = DMClone(dm, &crddm);CHKERRV(ierr);
   ierr = PetscFECreateLagrange(PETSC_COMM_SELF, dim, dim, PETSC_FALSE, 
order, PETSC_DECIDE, &fe);CHKERRV(ierr);
   ierr = PetscFESetFromOptions(fe);CHKERRV(ierr);
   ierr = DMSetField(crddm, 0, NULL, (PetscObject)fe);CHKERRV(ierr);
   ierr = DMCreateDS(crddm);CHKERRV(ierr);
   ierr = PetscFEDestroy(&fe);CHKERRV(ierr);
   /* project f(x) = x to get the coordinates of all dofs */
   ierr = DMCreateGlobalVector(crddm, &crd_vec);CHKERRV(ierr);
   initu[0] = crd_func;
   ierr = DMProjectFunction(crddm, 0.0, initu, NULL, INSERT_ALL_VALUES, 
crd_vec);CHKERRV(ierr);
   ierr = VecViewFromOptions(crd_vec, NULL, "-coord_view");CHKERRV(ierr);

On Tue, Dec 13, 2022 at 3:38 AM Yann Jobic wrote:




On 12/13/2022 at 2:04 AM, Mark Adams wrote:
 > PETSc does not store the coordinates for high order elements (ie,
the
 > "midside nodes").
 > I get them by projecting a f(x) = x, function.
 > Not sure if that is what you want but I can give you a code
snippet if
 > there are no better ideas.

It could really help me !
If i have the node coordinates in the reference element, then it's easy
to project them to the real space. But i couldn't find a way to have
them, so if you could give me some guidance, it could be really helpful.
Thanks,
Yann

 >
 > Mark
 >
 >
 > On Mon, Dec 12, 2022 at 6:06 PM Yann Jobic
mailto:yann.jo...@univ-amu.fr>
 > >>
wrote:
 >
 >     Hi all,
 >
 >     I'm trying to get the coords of the dofs of a DMPlex for a
PetscFE
 >     discretization, for orders greater than 1.
 >
 >     I'm struggling to run dm/impls/plex/tutorials/ex8.c
 >     I've got the following error (with option -view_coord) :
 >
 >     [0]PETSC ERROR: - Error Message
 >     --
 >     [0]PETSC ERROR: Object is in wrong state
 >     [0]PETSC ERROR: DMGetCoordinatesLocalSetUp() has not been called
 >     [0]PETSC ERROR: See https://petsc.org/release/faq/

 >     > for trouble shooting.
 >     [0]PETSC ERROR: Petsc Release Version 3.18.2, unknown
 >     [0]PETSC ERROR:
 >   
  /home/jobic/projet/fe-utils/petsc/fetools/cmake-build-debug/ex_closure_petsc

 >     on a  named luke by jobic Mon Dec 12 23:34:37 2022
 >     [0]PETSC ERROR: Configure options
 >     --prefix=/local/lib/petsc/3.18/p3/gcc/nompi_hdf5 --with-mpi=0
 >     --with-debugging=1 --with-blacs=1
--download-zlib,--download-p4est
 >     --download-hdf5=1 --download-triangle=1 --with-single-library=0
 >     --with-large-file-io=1 --with-shared-libraries=0 -CFLAGS=" -g
-O0"
 >     -CXXFLAGS=" -g -O0" -FFLAGS=" -g -O0" PETSC_ARCH=nompi_gcc_hdf5
 >     [0]PETSC ERROR: #1 DMGetCoordinatesLocalNoncollective() at
 >   
  /home/devel/src_linux/petsc-3.18.0/src/dm/interface/dmcoordinates.c:621

 >     [0]PETSC ERROR: #2 DMPlexGetCellCoordinates() at
 >   
  /home/devel/src_linux/petsc-3.18.0/src/dm/impls/plex/plexgeometry.c:1291

 >     [0]PETSC ERROR: #3 main() at
 >   
  /home/jobic/projet/fe-utils/petsc/fetools/src/ex_closure_petsc.c:86

 >     [0]PETSC ERROR: PETSc Option Table entries:
 >     [0]PETSC ERROR: -dm_plex_box_faces 2,2
 >     [0]PETSC ERROR: -dm_plex_dim 2
 >     [0]PETSC ERROR: -dm_plex_simplex 0
 >     [0]PETSC ERROR: -petscspace_degree 1
 >     [0]PETSC ERROR: -view_coord
 >     [0]PETSC ERROR: End of Error Message
---send entire
 >     error message to petsc-ma...@mcs.anl.gov--
 >
 >     Maybe i've done something wrong ?
 >
 >
 >     Moreover, i don't quite understand the function
DMPlexGetLocalOffsets,
 >     and how to use it with DMGetCoordinatesLocal. It seems that
 >     DMGetCoordinatesLocal do not have the coords of the dofs, 

Re: [petsc-users] realCoords for DOFs

2022-12-13 Thread Mark Adams
You should be able to use PetscFECreateDefault instead of
PetscFECreateLagrange here and set the "order" on the command line, which
is recommended, but start with this and low order to debug.

Mark

static PetscErrorCode crd_func(PetscInt dim, PetscReal time, const
PetscReal x[], PetscInt Nf_dummy, PetscScalar *u, void *actx)
{
  int i;
  PetscFunctionBeginUser;
  for (i = 0; i < dim; ++i) u[i] = x[i];
  PetscFunctionReturn(0);
}

  PetscErrorCode (*initu[1])(PetscInt, PetscReal, const PetscReal [],
PetscInt, PetscScalar [], void *);
  PetscFE fe;
  DM      crddm;
  Vec     crd_vec;

  /* create coordinate DM */
  ierr = DMClone(dm, &crddm);CHKERRV(ierr);
  ierr = PetscFECreateLagrange(PETSC_COMM_SELF, dim, dim, PETSC_FALSE,
order, PETSC_DECIDE, &fe);CHKERRV(ierr);
  ierr = PetscFESetFromOptions(fe);CHKERRV(ierr);
  ierr = DMSetField(crddm, 0, NULL, (PetscObject)fe);CHKERRV(ierr);
  ierr = DMCreateDS(crddm);CHKERRV(ierr);
  ierr = PetscFEDestroy(&fe);CHKERRV(ierr);
  /* project f(x) = x to get the coordinates of all dofs */
  ierr = DMCreateGlobalVector(crddm, &crd_vec);CHKERRV(ierr);
  initu[0] = crd_func;
  ierr = DMProjectFunction(crddm, 0.0, initu, NULL, INSERT_ALL_VALUES,
crd_vec);CHKERRV(ierr);
  ierr = VecViewFromOptions(crd_vec, NULL, "-coord_view");CHKERRV(ierr);
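
If you then want the coordinates attached to one cell, a possible follow-up (just a sketch, assuming a cell point number c of crddm) is to read them from the closure:

  PetscSection csec;
  Vec          lcoords;
  PetscScalar *vals = NULL;
  PetscInt     csize;
  ierr = DMGetLocalVector(crddm, &lcoords);CHKERRV(ierr);
  ierr = DMGlobalToLocal(crddm, crd_vec, INSERT_VALUES, lcoords);CHKERRV(ierr);
  ierr = DMGetLocalSection(crddm, &csec);CHKERRV(ierr);
  ierr = DMPlexVecGetClosure(crddm, csec, lcoords, c, &csize, &vals);CHKERRV(ierr);
  /* vals now holds dim coordinates per dof in the closure of cell c */
  ierr = DMPlexVecRestoreClosure(crddm, csec, lcoords, c, &csize, &vals);CHKERRV(ierr);
  ierr = DMRestoreLocalVector(crddm, &lcoords);CHKERRV(ierr);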

On Tue, Dec 13, 2022 at 3:38 AM Yann Jobic  wrote:

>
>
> On 12/13/2022 at 2:04 AM, Mark Adams wrote:
> > PETSc does not store the coordinates for high order elements (ie, the
> > "midside nodes").
> > I get them by projecting a f(x) = x, function.
> > Not sure if that is what you want but I can give you a code snippet if
> > there are no better ideas.
>
> It could really help me !
> If i have the node coordinates in the reference element, then it's easy
> to project them to the real space. But i couldn't find a way to have
> them, so if you could give me some guidance, it could be really helpful.
> Thanks,
> Yann
>
> >
> > Mark
> >
> >
> > On Mon, Dec 12, 2022 at 6:06 PM Yann Jobic  > > wrote:
> >
> > Hi all,
> >
> > I'm trying to get the coords of the dofs of a DMPlex for a PetscFE
> > discretization, for orders greater than 1.
> >
> > I'm struggling to run dm/impls/plex/tutorials/ex8.c
> > I've got the following error (with option -view_coord) :
> >
> > [0]PETSC ERROR: - Error Message
> > --
> > [0]PETSC ERROR: Object is in wrong state
> > [0]PETSC ERROR: DMGetCoordinatesLocalSetUp() has not been called
> > [0]PETSC ERROR: See https://petsc.org/release/faq/
> >  for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.18.2, unknown
> > [0]PETSC ERROR:
> >
>  /home/jobic/projet/fe-utils/petsc/fetools/cmake-build-debug/ex_closure_petsc
> > on a  named luke by jobic Mon Dec 12 23:34:37 2022
> > [0]PETSC ERROR: Configure options
> > --prefix=/local/lib/petsc/3.18/p3/gcc/nompi_hdf5 --with-mpi=0
> > --with-debugging=1 --with-blacs=1 --download-zlib,--download-p4est
> > --download-hdf5=1 --download-triangle=1 --with-single-library=0
> > --with-large-file-io=1 --with-shared-libraries=0 -CFLAGS=" -g -O0"
> > -CXXFLAGS=" -g -O0" -FFLAGS=" -g -O0" PETSC_ARCH=nompi_gcc_hdf5
> > [0]PETSC ERROR: #1 DMGetCoordinatesLocalNoncollective() at
> >
>  /home/devel/src_linux/petsc-3.18.0/src/dm/interface/dmcoordinates.c:621
> > [0]PETSC ERROR: #2 DMPlexGetCellCoordinates() at
> >
>  /home/devel/src_linux/petsc-3.18.0/src/dm/impls/plex/plexgeometry.c:1291
> > [0]PETSC ERROR: #3 main() at
> > /home/jobic/projet/fe-utils/petsc/fetools/src/ex_closure_petsc.c:86
> > [0]PETSC ERROR: PETSc Option Table entries:
> > [0]PETSC ERROR: -dm_plex_box_faces 2,2
> > [0]PETSC ERROR: -dm_plex_dim 2
> > [0]PETSC ERROR: -dm_plex_simplex 0
> > [0]PETSC ERROR: -petscspace_degree 1
> > [0]PETSC ERROR: -view_coord
> > [0]PETSC ERROR: End of Error Message ---send
> entire
> > error message to petsc-ma...@mcs.anl.gov--
> >
> > Maybe i've done something wrong ?
> >
> >
> > Moreover, i don't quite understand the function
> DMPlexGetLocalOffsets,
> > and how to use it with DMGetCoordinatesLocal. It seems that
> > DMGetCoordinatesLocal do not have the coords of the dofs, but only
> the
> > nodes defining the geometry.
> >
> > I've made some small modifications of ex8.c, but i still have an
> error :
> > [0]PETSC ERROR: - Error Message
> > --
> > [0]PETSC ERROR: Invalid argument
> > [0]PETSC ERROR: Wrong type of object: Parameter # 1
> > [0]PETSC ERROR: WARNING! There are option(s) set that were not used!
> > Could be the program crashed before they were used or a spelling
> > mistake, etc!
> > [0]PETSC ERROR: Option left: name:-sol value: vtk:sol.vtu
> > [0]PETSC ERROR: See https://petsc.org/release/faq/
> >  

Re: [petsc-users] Insert one sparse matrix as a block in another

2022-12-13 Thread Peder Jørgensgaard Olesen via petsc-users
Yes, MATNEST seems to do just what I wanted, and it is extremely fast too. Thanks!


For context, I am doing a snapshot POD (like SVD, but less memory intensive) in 
which I'm building a dense matrix S_pq = u_p^T B u_q for the EVP Sa=λa, where 
{u_p} and {u_q} are vectors in a particular dataset. The kernel B is the sparse 
matrix I was asking about.
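
For reference, a minimal sketch of assembling such a kernel as a MATNEST from existing blocks (A11, A12, A21, A22 are placeholders for the sub-blocks; NULL entries would mean zero blocks):

  Mat blocks[4] = {A11, A12, A21, A22};  /* 2x2 block layout, row-major */
  Mat B;
  PetscCall(MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, &B));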


Thanks again,

Peder


From: Matthew Knepley 
Sent: 12 December 2022 23:58:02
To: Jed Brown
Cc: Mark Adams; Peder Jørgensgaard Olesen; petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] Insert one sparse matrix as a block in another

On Mon, Dec 12, 2022 at 5:24 PM Jed Brown <j...@jedbrown.org> wrote:
The description matches MATNEST (MATCOMPOSITE is for a sum or product of 
matrices) or parallel decompositions. Also consider the assembly style of 
src/snes/tutorials/ex28.c, which can create either a monolithic or block 
(MATNEST) matrix without extra storage or conversion costs.

I will just say a few words about ex28. The idea is that if you are already 
calling MatSetValues() to assemble your submatrices, then
you can use MatSetValuesLocal() to remap those locations into locations in the 
large matrix, using a LocalToGlobalMap. This allows
you to choose either a standard AIJ matrix (which supports factorizations for 
example), or a MatNest object that supports fast extraction
of the blocks.

  Thanks,

Matt

Mark Adams <mfad...@lbl.gov> writes:

> Do you know what kind of solver works well for this problem?
>
> You probably want to figure that out first and not worry about efficiency.
>
> MATCOMPOSITE does what you want but not all solvers will work with it.
>
> Where does this problem come from? We have a lot of experience and might
> know something.
>
> Mark
>
> On Mon, Dec 12, 2022 at 1:33 PM Peder Jørgensgaard Olesen via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Hello
>>
>>
>> I have a set of sparse matrices (A1, A2, ...) , and need to generate a
>> larger matrix B with these as submatrices. I do not know the precise sparse
>> layouts of the A's (only that each row has one or two non-zero values),
>> and extracting *all* values to copy into B seems incredibly wasteful. How
>> can I make use of the sparsity to solve this efficiently?
>>
>>
>> Thanks,
>>
>> Peder
>>
>>
>>
>> Peder Jørgensgaard Olesen
>> PhD student
>> Department of Civil and Mechanical Engineering
>>
>> pj...@mek.dtu.dk
>> Koppels Allé
>> Building 403, room 105
>> 2800 Kgs. Lyngby
>> www.dtu.dk/english
>>
>>


--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


[petsc-users] dmplex normal vector incorrect for periodic gmsh grids

2022-12-13 Thread Praveen C
Hello

In the attached test, I read a small grid made in gmsh with periodic bc.
This is a 2d mesh.
The cell numbers are shown in the figure.
All faces have length = 2.5
But using PetscFVFaceGeom I am getting length of 7.5 for some faces. E.g.,

face: 59, centroid = 3.75, 2.50, normal = 0.00, -7.50
===> Face length incorrect = 7.50, should be 2.5
support[0] = 11, cent = 8.75, 3.75, area = 6.25
support[1] = 15, cent = 8.75, 1.25, area = 6.25

There are also errors in the orientation of normal.
If we disable periodicity in geo file, this error goes away.

Thanks
praveen

dmplex.c
Description: Binary data


ug_periodic.geo
Description: Binary data


Re: [petsc-users] realCoords for DOFs

2022-12-13 Thread Yann Jobic




On 12/13/2022 at 2:04 AM, Mark Adams wrote:
PETSc does not store the coordinates for high order elements (ie, the 
"midside nodes").

I get them by projecting a f(x) = x, function.
Not sure if that is what you want but I can give you a code snippet if 
there are no better ideas.


It could really help me!
If I have the node coordinates in the reference element, then it's easy 
to project them to the real space. But I couldn't find a way to get 
them, so if you could give me some guidance, it would be really helpful.

Thanks,
Yann



Mark


On Mon, Dec 12, 2022 at 6:06 PM Yann Jobic wrote:


Hi all,

I'm trying to get the coords of the dofs of a DMPlex for a PetscFE
discretization, for orders greater than 1.

I'm struggling to run dm/impls/plex/tutorials/ex8.c
I've got the following error (with option -view_coord) :

[0]PETSC ERROR: - Error Message
--
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: DMGetCoordinatesLocalSetUp() has not been called
[0]PETSC ERROR: See https://petsc.org/release/faq/
 for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.18.2, unknown
[0]PETSC ERROR:
/home/jobic/projet/fe-utils/petsc/fetools/cmake-build-debug/ex_closure_petsc
on a  named luke by jobic Mon Dec 12 23:34:37 2022
[0]PETSC ERROR: Configure options
--prefix=/local/lib/petsc/3.18/p3/gcc/nompi_hdf5 --with-mpi=0
--with-debugging=1 --with-blacs=1 --download-zlib,--download-p4est
--download-hdf5=1 --download-triangle=1 --with-single-library=0
--with-large-file-io=1 --with-shared-libraries=0 -CFLAGS=" -g -O0"
-CXXFLAGS=" -g -O0" -FFLAGS=" -g -O0" PETSC_ARCH=nompi_gcc_hdf5
[0]PETSC ERROR: #1 DMGetCoordinatesLocalNoncollective() at
/home/devel/src_linux/petsc-3.18.0/src/dm/interface/dmcoordinates.c:621
[0]PETSC ERROR: #2 DMPlexGetCellCoordinates() at
/home/devel/src_linux/petsc-3.18.0/src/dm/impls/plex/plexgeometry.c:1291
[0]PETSC ERROR: #3 main() at
/home/jobic/projet/fe-utils/petsc/fetools/src/ex_closure_petsc.c:86
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -dm_plex_box_faces 2,2
[0]PETSC ERROR: -dm_plex_dim 2
[0]PETSC ERROR: -dm_plex_simplex 0
[0]PETSC ERROR: -petscspace_degree 1
[0]PETSC ERROR: -view_coord
[0]PETSC ERROR: End of Error Message ---send entire
error message to petsc-ma...@mcs.anl.gov--

Maybe i've done something wrong ?


Moreover, I don't quite understand the function DMPlexGetLocalOffsets,
and how to use it with DMGetCoordinatesLocal. It seems that
DMGetCoordinatesLocal does not have the coords of the dofs, but only the
nodes defining the geometry.

I've made some small modifications of ex8.c, but i still have an error :
[0]PETSC ERROR: - Error Message
--
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Wrong type of object: Parameter # 1
[0]PETSC ERROR: WARNING! There are option(s) set that were not used!
Could be the program crashed before they were used or a spelling
mistake, etc!
[0]PETSC ERROR: Option left: name:-sol value: vtk:sol.vtu
[0]PETSC ERROR: See https://petsc.org/release/faq/
 for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.18.2, unknown
[0]PETSC ERROR:
/home/jobic/projet/fe-utils/petsc/fetools/cmake-build-debug/view_coords
on a  named luke by jobic Mon Dec 12 23:51:05 2022
[0]PETSC ERROR: Configure options
--prefix=/local/lib/petsc/3.18/p3/gcc/nompi_hdf5 --with-mpi=0
--with-debugging=1 --with-blacs=1 --download-zlib,--download-p4est
--download-hdf5=1 --download-triangle=1 --with-single-library=0
--with-large-file-io=1 --with-shared-libraries=0 -CFLAGS=" -g -O0"
-CXXFLAGS=" -g -O0" -FFLAGS=" -g -O0" PETSC_ARCH=nompi_gcc_hdf5
[0]PETSC ERROR: #1 PetscFEGetHeightSubspace() at
/home/devel/src_linux/petsc-3.18.0/src/dm/dt/fe/interface/fe.c:1692
[0]PETSC ERROR: #2 DMPlexGetLocalOffsets() at
/home/devel/src_linux/petsc-3.18.0/src/dm/impls/plex/plexceed.c:98
[0]PETSC ERROR: #3 ViewOffsets() at
/home/jobic/projet/fe-utils/petsc/fetools/src/view_coords.c:28
[0]PETSC ERROR: #4 main() at
/home/jobic/projet/fe-utils/petsc/fetools/src/view_coords.c:99
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -heat_petscspace_degree 2
[0]PETSC ERROR: -sol vtk:sol.vtu
[0]PETSC ERROR: End of Error Message ---send entire
error message to petsc-ma...@mcs.anl.gov--


Is dm/impls/plex/tutorials/ex8.c a good example for viewing the coords
of the dofs of a DMPlex ?


Thanks,

Yann



Re: [petsc-users] realCoords for DOFs

2022-12-13 Thread Yann Jobic




On 12/13/2022 at 5:43 AM, Matthew Knepley wrote:
On Mon, Dec 12, 2022 at 6:06 PM Yann Jobic wrote:


Hi all,

I'm trying to get the coords of the dofs of a DMPlex for a PetscFE
discretization, for orders greater than 1.

I'm struggling to run dm/impls/plex/tutorials/ex8.c
I've got the following error (with option -view_coord) :


You just need to call

   DMGetCoordinatesLocalSetUp()

before the loop. I tried to indicate this in the error message. I did 
not call it in the example
because it is only necessary for output.


The error message is explicit. It still feels strange to have to modify a PETSc 
tutorial to make it work with an option that the tutorial itself proposes.

Maybe the option -view_coord should be removed then.
Thanks,
Yann


We do this so that output is not synchronizing.

   Thanks,

     Matt

[0]PETSC ERROR: - Error Message
--
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: DMGetCoordinatesLocalSetUp() has not been called
[0]PETSC ERROR: See https://petsc.org/release/faq/
 for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.18.2, unknown
[0]PETSC ERROR:
/home/jobic/projet/fe-utils/petsc/fetools/cmake-build-debug/ex_closure_petsc
on a  named luke by jobic Mon Dec 12 23:34:37 2022
[0]PETSC ERROR: Configure options
--prefix=/local/lib/petsc/3.18/p3/gcc/nompi_hdf5 --with-mpi=0
--with-debugging=1 --with-blacs=1 --download-zlib,--download-p4est
--download-hdf5=1 --download-triangle=1 --with-single-library=0
--with-large-file-io=1 --with-shared-libraries=0 -CFLAGS=" -g -O0"
-CXXFLAGS=" -g -O0" -FFLAGS=" -g -O0" PETSC_ARCH=nompi_gcc_hdf5
[0]PETSC ERROR: #1 DMGetCoordinatesLocalNoncollective() at
/home/devel/src_linux/petsc-3.18.0/src/dm/interface/dmcoordinates.c:621
[0]PETSC ERROR: #2 DMPlexGetCellCoordinates() at
/home/devel/src_linux/petsc-3.18.0/src/dm/impls/plex/plexgeometry.c:1291
[0]PETSC ERROR: #3 main() at
/home/jobic/projet/fe-utils/petsc/fetools/src/ex_closure_petsc.c:86
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -dm_plex_box_faces 2,2
[0]PETSC ERROR: -dm_plex_dim 2
[0]PETSC ERROR: -dm_plex_simplex 0
[0]PETSC ERROR: -petscspace_degree 1
[0]PETSC ERROR: -view_coord
[0]PETSC ERROR: End of Error Message ---send entire
error message to petsc-ma...@mcs.anl.gov--

Maybe i've done something wrong ?


Moreover, i don't quite understand the function DMPlexGetLocalOffsets,
and how to use it with DMGetCoordinatesLocal. It seems that
DMGetCoordinatesLocal do not have the coords of the dofs, but only the
nodes defining the geometry.

I've made some small modifications of ex8.c, but i still have an error :
[0]PETSC ERROR: - Error Message
--
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Wrong type of object: Parameter # 1
[0]PETSC ERROR: WARNING! There are option(s) set that were not used!
Could be the program crashed before they were used or a spelling
mistake, etc!
[0]PETSC ERROR: Option left: name:-sol value: vtk:sol.vtu
[0]PETSC ERROR: See https://petsc.org/release/faq/
 for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.18.2, unknown
[0]PETSC ERROR:
/home/jobic/projet/fe-utils/petsc/fetools/cmake-build-debug/view_coords
on a  named luke by jobic Mon Dec 12 23:51:05 2022
[0]PETSC ERROR: Configure options
--prefix=/local/lib/petsc/3.18/p3/gcc/nompi_hdf5 --with-mpi=0
--with-debugging=1 --with-blacs=1 --download-zlib,--download-p4est
--download-hdf5=1 --download-triangle=1 --with-single-library=0
--with-large-file-io=1 --with-shared-libraries=0 -CFLAGS=" -g -O0"
-CXXFLAGS=" -g -O0" -FFLAGS=" -g -O0" PETSC_ARCH=nompi_gcc_hdf5
[0]PETSC ERROR: #1 PetscFEGetHeightSubspace() at
/home/devel/src_linux/petsc-3.18.0/src/dm/dt/fe/interface/fe.c:1692
[0]PETSC ERROR: #2 DMPlexGetLocalOffsets() at
/home/devel/src_linux/petsc-3.18.0/src/dm/impls/plex/plexceed.c:98
[0]PETSC ERROR: #3 ViewOffsets() at
/home/jobic/projet/fe-utils/petsc/fetools/src/view_coords.c:28
[0]PETSC ERROR: #4 main() at
/home/jobic/projet/fe-utils/petsc/fetools/src/view_coords.c:99
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -heat_petscspace_degree 2
[0]PETSC ERROR: -sol vtk:sol.vtu
[0]PETSC ERROR: End of Error Message ---send entire
error message to petsc-ma...@mcs.anl.gov--


Is dm/impls/plex/tutorials/ex8.c a good example for viewing the coords
of the dofs of a DMPlex ?


Thanks,

Yann



--
What most experimenters take for granted before they begin their 
experiments is