Hi Matt,

Thanks for your reply; we have reached a consensus. I had searched the mailing list for help with this problem and found a great many answers, most of them misleading for my case, although some may work for DMDA. I have now worked around it by creating a PetscSF and using PetscSFBcast. The procedure is as follows, and a rough sketch of the call sequence is given after the list.

1. Use DMGlobalToLocal to update the local vector (the periodic ghost cells still get the wrong values at this step).
2. Create a PetscSF that matches each local ghost cell to its global donor cell on another process.
3. Use PetscSFBcast to update the local ghost cells.
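Here is a rough sketch of what I mean, against the PETSc API I am using (3.10-era). The helper name UpdatePeriodicGhosts and the ghostLocal/donorRemote arrays are only placeholders for my own matching of ghost cells to donor cells; only the PETSc calls themselves are real.

#include <petsc.h>

/*
 * Sketch: update the periodic ghost cells of the local vector lvec from
 * the donor cells of the global vector gvec. ghostLocal[i] is the index
 * of the i-th ghost entry in the local vector; donorRemote[i] gives the
 * owning rank and the index of the donor entry in that rank's part of
 * the global vector.
 */
PetscErrorCode UpdatePeriodicGhosts(DM dm, Vec gvec, Vec lvec,
                                    PetscInt nGhost, PetscInt ghostLocal[],
                                    PetscSFNode donorRemote[])
{
  PetscSF            sf;
  PetscInt           nOwned;
  const PetscScalar *garray;
  PetscScalar       *larray;
  PetscErrorCode     ierr;

  PetscFunctionBeginUser;
  /* Step 1: usual ghost update; the periodic ghosts are still wrong here */
  ierr = DMGlobalToLocalBegin(dm, gvec, INSERT_VALUES, lvec);CHKERRQ(ierr);
  ierr = DMGlobalToLocalEnd(dm, gvec, INSERT_VALUES, lvec);CHKERRQ(ierr);

  /* Step 2: an SF whose leaves are the local ghost entries and whose
     roots are the donor entries owned by (possibly) other processes */
  ierr = VecGetLocalSize(gvec, &nOwned);CHKERRQ(ierr);
  ierr = PetscSFCreate(PetscObjectComm((PetscObject)dm), &sf);CHKERRQ(ierr);
  ierr = PetscSFSetGraph(sf, nOwned, nGhost, ghostLocal, PETSC_COPY_VALUES,
                         donorRemote, PETSC_COPY_VALUES);CHKERRQ(ierr);

  /* Step 3: broadcast the donor (root) values into the ghost (leaf) slots.
     Note: newer PETSc releases add an MPI_Op argument to PetscSFBcastBegin/End. */
  ierr = VecGetArrayRead(gvec, &garray);CHKERRQ(ierr);
  ierr = VecGetArray(lvec, &larray);CHKERRQ(ierr);
  ierr = PetscSFBcastBegin(sf, MPIU_SCALAR, garray, larray);CHKERRQ(ierr);
  ierr = PetscSFBcastEnd(sf, MPIU_SCALAR, garray, larray);CHKERRQ(ierr);
  ierr = VecRestoreArray(lvec, &larray);CHKERRQ(ierr);
  ierr = VecRestoreArrayRead(gvec, &garray);CHKERRQ(ierr);
  ierr = PetscSFDestroy(&sf);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}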
Now my code works. Can you give me a more scalable procedure?

Thanks.

On Thu, 2019-01-17 at 06:43 -0500, Matthew Knepley wrote:
> On Thu, Jan 17, 2019 at 3:34 AM leejearl via petsc-users <
> [email protected]> wrote:
> > Hi all Petscers,
> > 
> > I have asked for help with some questions, and thanks for the replies
> > from the developers; I have learned more about PETSc. I have also
> > searched the mailing lists and found many threads focused on this
> > problem, for example:
> > 
> > https://lists.mcs.anl.gov/pipermail/petsc-users/2019-January/037425.html
> > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2015-January/024068.html
> > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2008-November/003633.html
> > https://lists.mcs.anl.gov/pipermail/petsc-users/2019-January/037436.html
> > ......
> > 
> > The problem can be summarized as:
> > 1. Getting the value of a vector from another process.
> > 2. Setting the value of a ghost cell whose value should be the same as
> >    that of a cell owned by another process.
> > 
> > I think the above problem is very common. I ran into it when treating
> > the periodic boundary for the FVM method.
> 
> Ah, I think I am now seeing the problem. In DMPlex, we do not implement
> periodicity of the mesh by putting in extra communication. We make a
> periodic mesh directly. For example, in 1D what you describe would mesh
> a line segment, and then communicate the values from one end to the
> other. Instead, in Plex we just mesh a circle.
> 
>   Thanks,
> 
>     Matt
> 
> > After the dm object is distributed, the donor cell of a boundary cell
> > might be on another process. Since the donor cells must be matched as
> > pairs correctly, the routines DMGlobalToLocal and DMLocalToGlobal
> > cannot give the expected results.
> > 
> > In fact, such a problem is not very difficult for an MPI program. But
> > since we are coding with PETSc, we always wonder whether we can
> > implement it more easily.
> > 
> > Could the developers create a demo for us to follow? I think such a
> > demo would be very useful for users.
> > 
> > Thanks
> > 
> > leejearl
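P.S. If I understand the Plex approach correctly, the periodicity is built into the mesh itself, so no extra PetscSF is needed at all. A sketch of the kind of construction I have in mind is below; the dimension, sizes, and bounds are placeholders, and the DMPlexCreateBoxMesh signature is the one from the release I am running (it may differ in other versions).

#include <petsc.h>

int main(int argc, char **argv)
{
  DM             dm;
  PetscInt       faces[2] = {16, 16};                 /* cells per direction (placeholder) */
  PetscReal      lower[2] = {0.0, 0.0}, upper[2] = {1.0, 1.0};
  DMBoundaryType bd[2]    = {DM_BOUNDARY_PERIODIC, DM_BOUNDARY_NONE}; /* periodic in x only */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* The mesh itself is periodic, so the usual DMGlobalToLocal fills the
     ghost cells across the periodic boundary without extra communication. */
  ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, 2, PETSC_FALSE, faces,
                             lower, upper, bd, PETSC_TRUE, &dm);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}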
