Re: [deal.II] Re: Transferring solutions in distributed computing

2016-07-21 Thread Daniel Arndt
Junchao, It seems that the documentation is outdated for this piece of information. In fact, neither PETScWrappers::MPI::Vector nor TrilinosWrappers::MPI::Vector has update_ghost_values. What you should do is exactly what is done in the few lines of step-42 you referenced. "solution =
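The pattern being referred to can be sketched as follows. This is a minimal, unverified sketch assuming deal.II with Trilinos support; the names locally_owned_dofs, locally_relevant_dofs, and mpi_communicator are placeholders for objects the surrounding program would provide. The point is that a ghosted (read-only) vector picks up fresh ghost values through assignment from a non-ghosted, writable vector, since there is no update_ghost_values() member here:

```cpp
// Sketch (assumptions: deal.II + Trilinos, index sets and communicator set up
// elsewhere). A vector built only on locally owned DoFs is writable; one built
// with ghost entries (locally relevant DoFs) is read-only.
#include <deal.II/lac/trilinos_vector.h>

TrilinosWrappers::MPI::Vector distributed_solution(locally_owned_dofs,
                                                   mpi_communicator);
TrilinosWrappers::MPI::Vector solution(locally_owned_dofs,
                                       locally_relevant_dofs,
                                       mpi_communicator);

// ... compute into distributed_solution ...

// Assignment communicates across MPI ranks and fills the ghost entries of
// `solution`; this replaces a (nonexistent) update_ghost_values() call.
solution = distributed_solution;
```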

Re: [deal.II] Re: Transferring solutions in distributed computing

2016-07-21 Thread Junchao Zhang
Daniel, The link you provided is very helpful. Thanks. In the code, I see solution_transfer.interpolate(distributed_solution); constraints_hanging_nodes.distribute(distributed_solution); solution = distributed_solution; I am confused by the postprocessing. I think distributed_solution does not

[deal.II] Decoupling FECollection and QCollection

2016-07-21 Thread Deepak Gupta
Dear All, I am trying to use hp::FECollection and hp::QCollection in my work. For QCollection, I read the following in the online documentation: "The quadrature rules have to be added in the same order as for the FECollection

[deal.II] Re: Transferring solutions in distributed computing

2016-07-21 Thread Daniel Arndt
Junchao, You want to use parallel::distributed::SolutionTransfer instead if you are on a parallel::distributed::Triangulation. Executing $ grep -r "parallel::distributed::SolutionTransfer" . in the examples folder tells me that this object is used in step-32, step-42 and step-48. Have for
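A condensed sketch of how parallel::distributed::SolutionTransfer is typically used across a refinement cycle, loosely following the step-42 fragment quoted in this thread. This is not a complete program; dim, fe, dof_handler, triangulation, constraints_hanging_nodes, and the index sets are placeholders assumed to exist in the surrounding code:

```cpp
// Sketch (assumptions: deal.II + Trilinos; all helper objects set up elsewhere).
#include <deal.II/distributed/solution_transfer.h>
#include <deal.II/lac/trilinos_vector.h>

parallel::distributed::SolutionTransfer<dim, TrilinosWrappers::MPI::Vector>
  solution_transfer(dof_handler);

// Before refinement: register the current (ghosted) solution.
solution_transfer.prepare_for_coarsening_and_refinement(solution);
triangulation.execute_coarsening_and_refinement();
dof_handler.distribute_dofs(fe);

// After refinement: interpolate into a fully distributed, writable vector,
// resolve hanging-node constraints, then copy back to the ghosted vector
// (the assignment also refreshes ghost values).
TrilinosWrappers::MPI::Vector distributed_solution(locally_owned_dofs,
                                                   mpi_communicator);
solution_transfer.interpolate(distributed_solution);
constraints_hanging_nodes.distribute(distributed_solution);
solution = distributed_solution;
```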