To avoid confusion here: Berend seems to be specifically asking for XDMF 
(PETSC_VIEWER_HDF5_XDMF). What we are working on now is parallel 
checkpointing in our own HDF5 format (PETSC_VIEWER_HDF5_PETSC); I will make a 
series of MRs on this topic in the coming days.

For XDMF, we are specifically missing the ability to write/load DMLabels 
properly. XDMF uses a cell-local numbering of faces to specify face sets, and a 
face-local numbering of edges to specify edge sets, which does not fit well 
with the DMPlex design. And ParaView doesn't show any of these properly, so 
it's hard to debug. Matt, we should talk about this soon.

Berend, for now, could you just load the mesh initially from XDMF and then use 
our PETSC_VIEWER_HDF5_PETSC format for subsequent saving/loading?
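
A minimal sketch of that workflow (error checking omitted; "mesh.h5" and 
"checkpoint.h5" are placeholder file names, and UVelocity stands for one of 
your fields):

PetscViewer viewer;
DM dm;

/* one-time initial read of the mesh from the XDMF file */
DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.h5", PETSC_TRUE, &dm);
/* ... set up fields on dm (e.g. UVelocity via DMCreateGlobalVector) ... */

/* all subsequent checkpointing in our native HDF5 format */
PetscViewerHDF5Open(PETSC_COMM_WORLD, "checkpoint.h5", FILE_MODE_WRITE, &viewer);
PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);
DMView(dm, viewer);
VecView(UVelocity, viewer);
PetscViewerPopFormat(viewer);
PetscViewerDestroy(&viewer);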

Thanks,

Vaclav

On 17 Sep 2021, at 15:46, Lawrence Mitchell <we...@gmx.li> wrote:

Hi Berend,

On 14 Sep 2021, at 12:23, Matthew Knepley <knep...@gmail.com> wrote:

On Tue, Sep 14, 2021 at 5:15 AM Berend van Wachem 
<berend.vanwac...@ovgu.de> wrote:
Dear PETSc-team,

We are trying to save and load a distributed DMPlex and its associated 
physical fields (UVelocity, VVelocity, ..., created with 
DMCreateGlobalVector) in HDF5_XDMF format. To achieve this, we do the 
following (a fuller sketch follows step 3):

1) save the DM and the field in the same xdmf.h5 file:
DMView(dm, H5_XDMF_Viewer);
VecView(UVelocity, H5_XDMF_Viewer);

2) load the DM:
DMPlexCreateFromFile(PETSC_COMM_WORLD, Filename, PETSC_TRUE, &dm);

3) load the physical field:
VecLoad(UVelocity, H5_XDMF_Viewer);
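
(For completeness, a sketch of the full cycle with the viewer setup spelled 
out; the creation of H5_XDMF_Viewer is not shown above, so its setup here is 
an assumption, error checking is omitted, and the file names are placeholders:)

PetscViewer H5_XDMF_Viewer;
DM dm;
Vec UVelocity;
/* ... dm and UVelocity assumed created earlier ... */

/* 1) save: HDF5 viewer pushed to the XDMF-compatible format */
PetscViewerHDF5Open(PETSC_COMM_WORLD, "xdmf.h5", FILE_MODE_WRITE, &H5_XDMF_Viewer);
PetscViewerPushFormat(H5_XDMF_Viewer, PETSC_VIEWER_HDF5_XDMF);
DMView(dm, H5_XDMF_Viewer);
VecView(UVelocity, H5_XDMF_Viewer);
PetscViewerPopFormat(H5_XDMF_Viewer);
PetscViewerDestroy(&H5_XDMF_Viewer);

/* 2) restart: recreate the DM from the same file */
DMPlexCreateFromFile(PETSC_COMM_WORLD, "xdmf.h5", PETSC_TRUE, &dm);
/* ... recreate UVelocity on the reloaded dm before VecLoad ... */

/* 3) read the field back; HDF5 VecLoad finds the dataset by the Vec's name */
PetscObjectSetName((PetscObject)UVelocity, "UVelocity");
PetscViewerHDF5Open(PETSC_COMM_WORLD, "xdmf.h5", FILE_MODE_READ, &H5_XDMF_Viewer);
PetscViewerPushFormat(H5_XDMF_Viewer, PETSC_VIEWER_HDF5_XDMF);
VecLoad(UVelocity, H5_XDMF_Viewer);
PetscViewerPopFormat(H5_XDMF_Viewer);
PetscViewerDestroy(&H5_XDMF_Viewer);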

There are no errors during execution, but the loaded DM is distributed 
differently from the original one, which results in the values of the 
physical fields (UVelocity, etc.) being placed incorrectly in the domain.

This approach is used to restart the simulation from the last saved DM. 
Is there something we are missing, or is there an alternative route to 
this goal? Can we somehow get the IS of the redistribution, so that we can 
redistribute the vector data as well?

Many thanks, best regards,

Berend

Hi Berend,

We are in the midst of rewriting this. We want to support saving multiple 
meshes with fields attached to each, preserve the discretization (section) 
information, and allow loading on a different number of processes. We plan 
to be done by October. Vaclav and I are doing this in collaboration with 
Koki Sagiyama, David Ham, and Lawrence Mitchell from the Firedrake team.

The core load/save cycle functionality is now in PETSc main, so if you're 
using main rather than a release, you already have access to it. This 
section of the manual shows an example of how to do things: 
https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5
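
For a flavour, the save side looks roughly like this (a sketch based on that 
manual section, with error checking omitted; "sdm" is assumed to be a DM 
clone carrying the field's PetscSection, and the file and Vec names are 
placeholders; the manual has the authoritative sequence, including the 
loading side):

PetscViewer viewer;

PetscViewerHDF5Open(PETSC_COMM_WORLD, "checkpoint.h5", FILE_MODE_WRITE, &viewer);
PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);
DMPlexTopologyView(dm, viewer);      /* mesh connectivity */
DMPlexCoordinatesView(dm, viewer);   /* vertex coordinates */
DMPlexLabelsView(dm, viewer);        /* labels, e.g. boundary markers */
DMPlexSectionView(dm, viewer, sdm);  /* discretization (section) layout */
DMPlexGlobalVectorView(dm, viewer, sdm, UVelocity);  /* the field itself */
PetscViewerPopFormat(viewer);
PetscViewerDestroy(&viewer);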

Let us know if things aren't clear!

Thanks,

Lawrence
