Dear all,
I adapted step-29 to run in parallel, similar to step-40. With 1 MPI rank
it works; with more than 1 rank, however, I get a runtime error
(the full output is attached):
An error occurred in line <74> of file
</var/folders/8z/hlb6vc015qjggytkxn84m6_c0000gn/T/heltai/spack-stage/spack-stage-dealii-9.3.0-zy7k3uwnakcqjvrajvacy5l4jrl7eaex/spack-src/source/dofs/dof_tools_sparsity.cc>
in function
    void dealii::DoFTools::make_sparsity_pattern(const DoFHandler<dim,
    spacedim> &, SparsityPatternType &, const AffineConstraints<number> &,
    const bool, const types::subdomain_id) [dim = 2, spacedim = 2,
    SparsityPatternType = dealii::DynamicSparsityPattern, number = double]
The violated condition was:
    sparsity.n_rows() == n_dofs
Additional information:
    Dimension 26752 not equal to 51842
I found out that the problem is in the setup_system() function; the
problematic line is marked with the comment // THIS below. Could you please
help me figure out the issue?
template <int dim>
void UltrasoundProblem<dim>::setup_system()
{
  deallog << "Setting up system... ";
  deallog << "OK1... ";

  dof_handler.distribute_dofs(fe);

  locally_owned_dofs = dof_handler.locally_owned_dofs();
  DoFTools::extract_locally_relevant_dofs(dof_handler, locally_relevant_dofs);

  locally_relevant_solution.reinit(locally_owned_dofs,
                                   locally_relevant_dofs,
                                   mpi_communicator);
  system_rhs.reinit(locally_owned_dofs, mpi_communicator);

  constraints.clear();
  constraints.reinit(locally_relevant_dofs);
  DoFTools::make_hanging_node_constraints(dof_handler, constraints);
  VectorTools::interpolate_boundary_values(dof_handler,
                                           1,
                                           DirichletBoundaryValues<dim>(),
                                           constraints);
  constraints.close();

  DynamicSparsityPattern dsp(locally_relevant_dofs.n_elements(),
                             locally_relevant_dofs.n_elements());
  DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints, false); // THIS
  SparsityTools::distribute_sparsity_pattern(dsp,
                                             dof_handler.locally_owned_dofs(),
                                             mpi_communicator,
                                             locally_relevant_dofs);
  system_matrix.reinit(locally_owned_dofs, locally_owned_dofs, dsp,
                       mpi_communicator);
}
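For reference, this is how I understand the corresponding lines in step-40: the sparsity pattern is constructed from the locally_relevant_dofs IndexSet itself rather than from its n_elements(), so its number of rows equals the global number of DoFs. I am quoting from memory, so please correct me if this is not the relevant difference:

```cpp
// Sketch of the step-40 setup (from memory, may not match the
// tutorial verbatim). Passing the IndexSet sizes the pattern by
// IndexSet::size(), i.e. the global n_dofs, not the number of
// locally relevant entries:
DynamicSparsityPattern dsp(locally_relevant_dofs);
DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints, false);
SparsityTools::distribute_sparsity_pattern(dsp,
                                           dof_handler.locally_owned_dofs(),
                                           mpi_communicator,
                                           locally_relevant_dofs);
system_matrix.reinit(locally_owned_dofs, locally_owned_dofs, dsp,
                     mpi_communicator);
```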
Thank you very much
H.
--
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see
https://groups.google.com/d/forum/dealii?hl=en
---
You received this message because you are subscribed to the Google Groups
"deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email
to [email protected].
To view this discussion on the web visit
https://groups.google.com/d/msgid/dealii/5808c00f-8c0a-47e2-9dcd-43dc1ec8068cn%40googlegroups.com.
bash-3.2$ mpirun -np 2 ./step-29
DEAL::Running with Trilinos on 2 MPI rank(s)...
DEAL::Running with Trilinos on 2 MPI rank(s)...

--------------------------------------------------------
An error occurred in line <74> of file </var/folders/8z/hlb6vc015qjggytkxn84m6_c0000gn/T/heltai/spack-stage/spack-stage-dealii-9.3.0-zy7k3uwnakcqjvrajvacy5l4jrl7eaex/spack-src/source/dofs/dof_tools_sparsity.cc> in function
void dealii::DoFTools::make_sparsity_pattern(const DoFHandler<dim, spacedim> &, SparsityPatternType &, const AffineConstraints<number> &, const bool, const types::subdomain_id) [dim = 2, spacedim = 2, SparsityPatternType = dealii::DynamicSparsityPattern, number = double]
The violated condition was:
    sparsity.n_rows() == n_dofs
Additional information:
    Dimension 26752 not equal to 51842.

Stacktrace:
-----------
#0  libdeal_II.g.9.3.0.dylib  0x000000011657f83f  _ZN6dealii8DoFTools21make_sparsity_patternILi2ELi2ENS_22DynamicSparsityPatternEdEEvRKNS_10DoFHandlerIXT_EXT0_EEERT1_RKNS_17AffineConstraintsIT2_EEbj + 719
#1  step-29                   0x00000001032168a8  _ZN6Step2917UltrasoundProblemILi2EE12setup_systemEv + 456
#2  step-29                   0x0000000103201787  _ZN6Step2917UltrasoundProblemILi2EE3runEv + 167
#3  step-29                   0x00000001032014b3  main + 211
#4  libdyld.dylib             0x00007fff67eb1cc9  start + 1
--------------------------------------------------------

Calling MPI_Abort now.
To break execution in a GDB session, execute 'break MPI_Abort' before running. You can also put the following into your ~/.gdbinit:
    set breakpoint pending on
    break MPI_Abort
    set breakpoint pending auto

--------------------------------------------------------
An error occurred in line <74> of file </var/folders/8z/hlb6vc015qjggytkxn84m6_c0000gn/T/heltai/spack-stage/spack-stage-dealii-9.3.0-zy7k3uwnakcqjvrajvacy5l4jrl7eaex/spack-src/source/dofs/dof_tools_sparsity.cc> in function
void dealii::DoFTools::make_sparsity_pattern(const DoFHandler<dim, spacedim> &, SparsityPatternType &, const AffineConstraints<number> &, const bool, const types::subdomain_id) [dim = 2, spacedim = 2, SparsityPatternType = dealii::DynamicSparsityPattern, number = double]
The violated condition was:
    sparsity.n_rows() == n_dofs
Additional information:
    Dimension 26632 not equal to 51842.

Stacktrace:
-----------
#0  libdeal_II.g.9.3.0.dylib  0x000000011762083f  _ZN6dealii8DoFTools21make_sparsity_patternILi2ELi2ENS_22DynamicSparsityPatternEdEEvRKNS_10DoFHandlerIXT_EXT0_EEERT1_RKNS_17AffineConstraintsIT2_EEbj + 719
#1  step-29                   0x00000001058ea8a8  _ZN6Step2917UltrasoundProblemILi2EE12setup_systemEv + 456
#2  step-29                   0x00000001058d5787  _ZN6Step2917UltrasoundProblemILi2EE3runEv + 167
#3  step-29                   0x00000001058d54b3  main + 211
#4  libdyld.dylib             0x00007fff67eb1cc9  start + 1
--------------------------------------------------------

Calling MPI_Abort now.
To break execution in a GDB session, execute 'break MPI_Abort' before running. You can also put the following into your ~/.gdbinit:
    set breakpoint pending on
    break MPI_Abort
    set breakpoint pending auto
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 255.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[mbp-de-hermes.clients.wireless.dtu.dk:29650] PMIX ERROR: UNREACHABLE in file server/pmix_server.c at line 2193
[mbp-de-hermes.clients.wireless.dtu.dk:29650] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[mbp-de-hermes.clients.wireless.dtu.dk:29650] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages