I am trying to assemble my residual/RHS by using mesh_loop:
  evaluation_point = present_solution;
  extension_vector = newton_update;
  extension_vector *= alpha;
  evaluation_point += extension_vector;

  residual.reinit(locally_owned_dofs, mpi_communicator);
  residual_result.reinit(locally_owned_dofs, mpi_communicator);
  residual        = 0;
  residual_result = 0;

  const QGauss<dim> quadrature_formula(fe.degree + gauss_size);

  using CellFilter =
    FilteredIterator<typename DoFHandler<2>::active_cell_iterator>;

  std::cout << "Looping over all cells\n";
  MeshWorker::mesh_loop(
    dof_handler.begin_active(),
    dof_handler.end(),
    std::bind(&MinimalSurfaceProblem<dim, EQU, ad_type_code>::local_assemble_residual,
              this,
              std::placeholders::_1,
              std::placeholders::_2,
              std::placeholders::_3),
    std::bind(&MinimalSurfaceProblem<dim, EQU, ad_type_code>::copy_local_to_global_residual,
              this,
              std::placeholders::_1),
    ResidualScratchData(fe,
                        quadrature_formula,
                        update_values | update_gradients |
                          update_quadrature_points | update_JxW_values,
                        I_val,
                        I_val_old),
    ResidualCopyData());

  std::cout << "Compressing residual\n";
  residual.compress(VectorOperation::add);
  std::cout << "Residual compressed\n";

  boundary_constraints.set_zero(residual);
  std::cout << "Boundary constraints set to zero\n";

  residual_result = residual;
with copy_local_to_global_residual defined as

  void MinimalSurfaceProblem<dim, EQU, ad_type_code>::
    copy_local_to_global_residual(const ResidualCopyData &data)
  {
    boundary_constraints.distribute_local_to_global(data.local_residual,
                                                    data.local_dof_indices,
                                                    residual);
  }
Unfortunately, my program fails after some iterations with the following error:
--------------------------------------------------------
An error occurred in line <624> of file <~/Downloads/git-files/dealii/source
/lac/trilinos_vector.cc> in function
void dealii::TrilinosWrappers::MPI::Vector::compress(dealii::
VectorOperation::values)
The violated condition was:
result.max - result.min < 1e-5
Additional information:
Not all processors agree whether the last operation on this vector was
an addition or a set operation. This will prevent the compress() operation
from succeeding.
Stacktrace:
-----------
#0 /opt/dealii/lib/libdeal_II.g.so.9.2.0-pre:
dealii::TrilinosWrappers::MPI::Vector::compress(dealii::VectorOperation::values)
#1 bin/TTM-equation: TTM_Calculation::MinimalSurfaceProblem<2,
Silicon_Verburg::physics_equations,
(dealii::Differentiation::AD::NumberTypes)3>::compute_residual(double,
dealii::TrilinosWrappers::MPI::Vector&)
#2 bin/TTM-equation: TTM_Calculation::MinimalSurfaceProblem<2,
Silicon_Verburg::physics_equations,
(dealii::Differentiation::AD::NumberTypes)3>::recalculate_step_length()
#3 bin/TTM-equation: TTM_Calculation::MinimalSurfaceProblem<2,
Silicon_Verburg::physics_equations,
(dealii::Differentiation::AD::NumberTypes)3>::update_newton_solution(double
const&, double const&)
#4 bin/TTM-equation: TTM_Calculation::MinimalSurfaceProblem<2,
Silicon_Verburg::physics_equations,
(dealii::Differentiation::AD::NumberTypes)3>::run()
#5 bin/TTM-equation: main
--------------------------------------------------------
Based on the output I assume that the distribute_local_to_global() function
is the problem: when running with 8 threads, I get 8 lines with "Looping
over all cells", but only 7 with "Compressing residual". The last operation
I apply to the vector residual (after the reinit() and setting it to 0) is
distribute_local_to_global(), so that is my current suspect. Or are there
other possibilities?
Note: I had a similar error earlier, but that time I had forgotten the final
compress() call before calling set_zero().
--
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see
https://groups.google.com/d/forum/dealii?hl=en