Re: [deal.II] PETScWrappers::SparseDirectMUMPS use in parallel version of step-22

2017-09-25 Thread Anna Avdeeva
Dear Timo, Sorry, I did not mean to write to your personal e-mail; I just pressed the wrong reply button by mistake. Thank you very much for your reply. I will check the way I set mpi_communicator now; hopefully I will find the problem. I changed step-40 a couple of days ago myself to
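
For reference, a minimal sketch of the setup under discussion, assuming the deal.II 8.x-era interface in which PETScWrappers::SparseDirectMUMPS accepts the communicator as a constructor argument; the function and variable names (solve_direct, system_matrix, and so on) are only illustrative:

  #include <deal.II/lac/petsc_parallel_sparse_matrix.h>
  #include <deal.II/lac/petsc_parallel_vector.h>
  #include <deal.II/lac/petsc_solver.h>
  #include <deal.II/lac/solver_control.h>

  using namespace dealii;

  // The matrix and vectors are assumed to have been built on the same
  // mpi_communicator (usually MPI_COMM_WORLD in step-40-style programs).
  void solve_direct(const PETScWrappers::MPI::SparseMatrix &system_matrix,
                    PETScWrappers::MPI::Vector             &solution,
                    const PETScWrappers::MPI::Vector       &system_rhs,
                    const MPI_Comm                          mpi_communicator)
  {
    SolverControl solver_control; // tolerances unused: MUMPS is a direct solver

    // Hand the solver the same communicator the matrix lives on; a
    // mismatch here is a typical cause of PetscCommDuplicate errors.
    PETScWrappers::SparseDirectMUMPS solver(solver_control, mpi_communicator);
    solver.solve(system_matrix, solution, system_rhs);
  }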

Re: [deal.II] AztecOO::Iterate error code -3: loss of precision for offset values in step-15

2017-09-25 Thread Wolfgang Bangerth
On 09/25/2017 01:25 PM, 'Maxi Miller' via deal.II User Group wrote: I do not understand why that happens. Is it a mathematical problem, or rather a problem in my code? AFAIK the gradients should not depend on the offset, so I do not know where to look for the problem here. Did

[deal.II] AztecOO::Iterate error code -3: loss of precision for offset values in step-15

2017-09-25 Thread 'Maxi Miller' via deal.II User Group
I rewrote example 15 for an MPI environment using Trilinos, and solve it with IndexSet solution_relevant_partitioning(dof_handler.n_dofs()); DoFTools::extract_locally_relevant_dofs(dof_handler, solution_relevant_partitioning); LinearAlgebraTrilinos::MPI::Vector
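
A minimal sketch of that setup, assuming the usual step-40-style pattern of a locally owned plus a locally relevant index set; the names ghosted_solution and newton_update are only illustrative:

  #include <deal.II/base/index_set.h>
  #include <deal.II/dofs/dof_handler.h>
  #include <deal.II/dofs/dof_tools.h>
  #include <deal.II/lac/generic_linear_algebra.h>

  using namespace dealii;

  void setup_vectors(const DoFHandler<2> &dof_handler,
                     const MPI_Comm       mpi_communicator)
  {
    const IndexSet locally_owned_dofs = dof_handler.locally_owned_dofs();
    IndexSet       locally_relevant_dofs;
    DoFTools::extract_locally_relevant_dofs(dof_handler, locally_relevant_dofs);

    // Read-only ghosted vector for evaluating the residual and gradients,
    // plus a writable, non-ghosted vector for the Newton update.
    LinearAlgebraTrilinos::MPI::Vector ghosted_solution(locally_owned_dofs,
                                                        locally_relevant_dofs,
                                                        mpi_communicator);
    LinearAlgebraTrilinos::MPI::Vector newton_update(locally_owned_dofs,
                                                     mpi_communicator);
  }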

Re: [deal.II] Parallel implementation

2017-09-25 Thread Wolfgang Bangerth
Anna, to compute values of the electric field at the receivers I follow the strategy of the ASPECT code, as you suggested. To do this I sum the current_point_values across processors and divide by the number of processors that contain point p, as follows: // Reduce all collected values into local
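
A minimal sketch of that reduction using Utilities::MPI::sum; the names current_point_values and point_found are assumptions based on the description above:

  #include <deal.II/base/mpi.h>

  #include <vector>

  using namespace dealii;

  // current_point_values[r] holds the value at receiver r computed on this
  // process (zero if point r is not inside a locally owned cell), and
  // point_found[r] is 1 or 0 accordingly.
  std::vector<double>
  reduce_point_values(const std::vector<double>       &current_point_values,
                      const std::vector<unsigned int> &point_found,
                      const MPI_Comm                   mpi_communicator)
  {
    std::vector<double> global_values(current_point_values.size(), 0.);

    for (unsigned int r = 0; r < current_point_values.size(); ++r)
      {
        // Reduce all collected values; a point on a face or vertex shared
        // by several processes is found more than once, so divide by the
        // number of processes that contributed.
        const double       sum   = Utilities::MPI::sum(current_point_values[r],
                                                       mpi_communicator);
        const unsigned int count = Utilities::MPI::sum(point_found[r],
                                                       mpi_communicator);
        if (count > 0)
          global_values[r] = sum / count;
      }

    return global_values;
  }

One reduction per receiver keeps the sketch short; a single vector-wide reduction over all receivers would be more efficient.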

Re: [deal.II] PETScWrappers::SparseDirectMUMPS use in parallel version of step-22

2017-09-25 Thread Timo Heister
and from the email I got from you off-list (please try to use the mailing list): > [0]PETSC ERROR: #1 PetscCommDuplicate() line 137 in > /home/anna/petsc-3.6.4/src/sys/objects/tagm.c > An error occurred in line <724> of file > in function > void dealii::PETScWrappers::SparseDirectMUMPS::solve(const >