On 09/05/2017 07:29 AM, Dante De Santis wrote:

This is working for me, except that the solution on the boundary nodes of each partition is always zero, while
for the interior nodes it is OK.

I am using a parallel distributed triangulation and a PETSc vector; here is the relevant piece of the code:

    /*
     Save the solution,
     where sol_u is a PETScWrappers::MPI::Vector
    */
    parallel::distributed::SolutionTransfer<dim, PETScWrappers::MPI::Vector>
      system_trans (dof_handler);
    system_trans.prepare_serialization (sol_u);
    triangulation.save (filename.c_str());

    // Reload the solution
    parallel::distributed::SolutionTransfer<dim, PETScWrappers::MPI::Vector>
      system_trans (dof_handler);
    system_trans.deserialize (sol_u);

I suspect there is some issue with the distributed vector/DoFs, but I cannot figure it out.

I can't right away see what the issue may be, but to help debug this:
* Change your code in such a way that, just before serialization, you set all entries of the solution vector to one. This way, you know exactly what you have, and what you get back should match this exactly.
* Check the following: Is sol_u a vector that has ghost elements before serialization? Is sol_u a vector that has ghost elements after deserialization? Because if these vectors don't have ghost elements, then what you see visually may not be what you actually have.
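On the second point, the pattern I would try is to deserialize into a completely distributed (non-ghosted) vector first, since deserialization needs to write into the vector, and then copy the result into a ghosted vector so that the values on the partition boundaries are visible as well. A minimal sketch, assuming you have already extracted locally_owned_dofs and locally_relevant_dofs from the DoFHandler in the usual way and that mpi_communicator is your MPI communicator:

```cpp
// Deserialize into a writable, non-ghosted vector first ...
PETScWrappers::MPI::Vector distributed_sol (locally_owned_dofs,
                                            mpi_communicator);
parallel::distributed::SolutionTransfer<dim, PETScWrappers::MPI::Vector>
  system_trans (dof_handler);
system_trans.deserialize (distributed_sol);

// ... then copy into a ghosted vector, so that the entries on the
// boundary nodes of each partition are also available locally:
PETScWrappers::MPI::Vector ghosted_sol (locally_owned_dofs,
                                        locally_relevant_dofs,
                                        mpi_communicator);
ghosted_sol = distributed_sol;
```

If you then use ghosted_sol (rather than the non-ghosted vector) for output, the nodes on the partition boundaries should show the correct values instead of zero.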

As always, you should run your program in debug mode to make sure as many bugs are found automatically as possible.

Best
 W.

--
------------------------------------------------------------------------
Wolfgang Bangerth          email:                 [email protected]
                           www: http://www.math.colostate.edu/~bangerth/

--
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
