Dear Timo,
Sorry, I did not mean to write to your personal e-mail; I just pressed the
wrong reply button.
Thank you very much for your reply. I will check how I set the
mpi_communicator now. Hopefully I will find the problem.
I changed step-40 myself a couple of days ago to
On 09/25/2017 01:25 PM, 'Maxi Miller' via deal.II User Group wrote:
I do not understand why that happens. Is it a mathematical
problem, or rather a problem in my code? AFAIK the gradients should not
depend on the offset, so I do not know where to look for the problem here.
Did
I rewrote example 15 for an MPI environment using Trilinos, and solve it with
IndexSet solution_relevant_partitioning(dof_handler.n_dofs());
DoFTools::extract_locally_relevant_dofs(dof_handler,
                                        solution_relevant_partitioning);
LinearAlgebraTrilinos::MPI::Vector
Anna,
to compute values of the electric field at the receivers, I follow the strategy
of the ASPECT code, as you suggested.
To do this I sum the current_point_values across processors and divide by the
number of processors that contain point p, as follows:
// Reduce all collected values into local
And from your email that I got off-list (please try to use the mailing list):
> [0]PETSC ERROR: #1 PetscCommDuplicate() line 137 in
> /home/anna/petsc-3.6.4/src/sys/objects/tagm.c
> An error occurred in line <724> of file
> in function
> void dealii::PETScWrappers::SparseDirectMUMPS::solve(const
>