Vivek,

just remove the check

if (i->first >= range.first && i->first < range.second)

VectorTools::interpolate_boundary_values only returns values for locally
active degrees of freedom anyway. As long as the values are consistent
between the different processes, writing all of them should work just fine.

Best,
Daniel


Am Mi., 10. Juli 2019 um 20:02 Uhr schrieb Vivek Kumar <
[email protected]>:

> Hi all,
>
> I have a legacy code where PETScWrappers::MPI::Vector was used for
> parallel computing. For very large problems, I was told to move to
> Trilinos. This was fairly easy for most parts of the code, but I am
> stuck on the implementation of the boundary values. Currently the
> boundary values are applied as follows:
>
> template <int dim>
> void ElasticityEquation<dim>::apply_boundary_conditions(const double time,
>                                                         const double velocity)
> {
>   // Begin timer
>   TimerOutput::Scope t(computing_timer, "Apply Boundary Condition");
>
>   boundary_values.clear();
>   VectorTools::interpolate_boundary_values(
>     dof_handler, 2, ZeroFunction<dim>(dim), boundary_values);
>   VectorTools::interpolate_boundary_values(
>     dof_handler, 3, BoundaryValues<dim>(time, velocity), boundary_values);
>
>   // The range of locally owned dofs
>   std::pair<unsigned int, unsigned int> range = solution.local_range();
>
>   // Set the boundary conditions
>   for (typename std::map<unsigned int, double>::const_iterator i =
>          boundary_values.begin();
>        i != boundary_values.end(); ++i)
>     {
>       if (i->first >= range.first && i->first < range.second)
>         completely_distributed_solution(i->first) = i->second;
>     }
>
>   completely_distributed_solution.compress(VectorOperation::insert);
>
>   hanging_constraints.distribute(completely_distributed_solution);
>
>   solution = completely_distributed_solution;
> }
>
>
> This used to work fine with PETSc but does not with Trilinos. The issue
> seems to be that Trilinos does not necessarily store the vector as
> contiguous ranges of elements (
> https://www.dealii.org/current/doxygen/deal.II/classTrilinosWrappers_1_1MPI_1_1Vector.html#abcc22742227d63950cb0995223e6ee32)
> . Is there a standard workaround?
>
> Thanks
> Vivek
>
> --
> The deal.II project is located at http://www.dealii.org/
> For mailing list/forum options, see
> https://groups.google.com/d/forum/dealii?hl=en
> ---
> You received this message because you are subscribed to the Google Groups
> "deal.II User Group" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/dealii/73ea3ef9-b5f8-461d-81c0-e43c0b4d8fbb%40googlegroups.com
> <https://groups.google.com/d/msgid/dealii/73ea3ef9-b5f8-461d-81c0-e43c0b4d8fbb%40googlegroups.com?utm_medium=email&utm_source=footer>
> .
> For more options, visit https://groups.google.com/d/optout.
>
