Thank you, Praveen!

So, what should I do? Does this mean I cannot use this kind of mesh with MPI? Could you
please suggest another solution?
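
As a guess on my side (I do not know whether it is related to the failing
assertion at all): would it help to let deal.II reorder the hand-built cells
before create_triangulation(), along these lines?

#include <deal.II/grid/grid_reordering.h>

// ... inside make_domain_quater_crack(), after filling `vertices` and `cells`
// and before create_triangulation() -- this is only my guess, not something
// I have verified against this error:

// bring the hand-written cells into a consistent ordering before
// handing them to the distributed triangulation
GridReordering<3>::invert_all_cells_of_negative_grid(vertices, cells);
GridReordering<3>::reorder_cells(cells);

triangulation.create_triangulation(vertices, cells, SubCellData());

Or is the problem independent of the cell ordering?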

Best,
Ki-Tae

On Wednesday, November 7, 2018 at 12:26:20 PM UTC-5, Praveen C wrote:
>
> Running your code in debug mode gives an error; it says the error is in the 
> library, not in your code:
>
> $ mpirun -np 4 ./step-1
> Cycle 0:
>    Number of active cells:       24
>    Number of degrees of freedom: 889
> =====================================
>
> Cycle 1:
>    Number of active cells:       192
>    Number of degrees of freedom: 6097
> =====================================
>
>
> --------------------------------------------------------
> An error occurred in line <3610> of file 
> </Users/praveen/Applications/deal.II/git/source/dofs/dof_handler_policy.cc> 
> in function
>     auto dealii::internal::DoFHandlerImplementation::Policy::(anonymous 
> namespace)::communicate_dof_indices_on_marked_cells(const 
> dealii::DoFHandler<3, 3> &, const std::map<unsigned int, 
> std::set<dealii::types::subdomain_id> > &, const 
> std::vector<dealii::types::global_dof_index> &, const 
> std::vector<dealii::types::global_dof_index> &)::(anonymous 
> class)::operator()(const typename DoFHandlerType::active_cell_iterator &, 
> const std::vector<types::global_dof_index> &) const
> The violated condition was: 
>     (received_dof_indices[i] == numbers::invalid_dof_index) || 
> (received_dof_indices[i] == local_dof_indices[i])
> Additional information: 
>     This exception -- which is used in many places in the library -- 
> usually indicates that some condition which the author of the code thought 
> must be satisfied at a certain point in an algorithm, is not fulfilled. An 
> example would be that the first part of an algorithm sorts elements of an 
> array in ascending order, and a second part of the algorithm later 
> encounters an element that is not larger than the previous one.
>
> There is usually not very much you can do if you encounter such an 
> exception since it indicates an error in deal.II, not in your own program. 
> Try to come up with the smallest possible program that still demonstrates 
> the error and contact the deal.II mailing lists with it to obtain help.
>
> Stacktrace:
> -----------
> #0  2   libdeal_II.g.9.1.0-pre.dylib        0x000000010f0b80a3 _ZZN6dealii8internal24DoFHandlerImplementation6Policy12_GLOBAL__N_139communicate_dof_indices_on_marked_cellsINS_10DoFHandlerILi3ELi3EEEEEvRKT_RKNSt3__13mapIjNSA_3setIjNSA_4lessIjEENSA_9allocatorIjEEEESE_NSF_INSA_4pairIKjSH_EEEEEERKNSA_6vectorIjSG_EESS_ENKUlRKNS_18TriaActiveIteratorINS_15DoFCellAccessorIS6_Lb0EEEEESS_E_clESY_SS_ + 419
> #1  3   libdeal_II.g.9.1.0-pre.dylib        0x000000010f0b6bed _ZN6dealii9GridTools28exchange_cell_data_to_ghostsINSt3__16vectorIjNS2_9allocatorIjEEEENS_10DoFHandlerILi3ELi3EEEEEvRKT0_RKNS2_8functionIFN5boost8optionalIT_EERKNS9_20active_cell_iteratorEEEERKNSC_IFvSJ_RKSF_EEE + 1725
> #2  4   libdeal_II.g.9.1.0-pre.dylib        0x000000010f050c97 _ZN6dealii8internal24DoFHandlerImplementation6Policy12_GLOBAL__N_139communicate_dof_indices_on_marked_cellsINS_10DoFHandlerILi3ELi3EEEEEvRKT_RKNSt3__13mapIjNSA_3setIjNSA_4lessIjEENSA_9allocatorIjEEEESE_NSF_INSA_4pairIKjSH_EEEEEERKNSA_6vectorIjSG_EESS_ + 71
> #3  5   libdeal_II.g.9.1.0-pre.dylib        0x000000010f0504f8 _ZNK6dealii8internal24DoFHandlerImplementation6Policy19ParallelDistributedINS_10DoFHandlerILi3ELi3EEEE15distribute_dofsEv + 1640
> #4  6   libdeal_II.g.9.1.0-pre.dylib        0x000000010ecd9e73 _ZN6dealii10DoFHandlerILi3ELi3EE15distribute_dofsERKNS_13FiniteElementILi3ELi3EEE + 211
> #5  7   step-1                              0x000000010145c2b1 _ZN7Problem3runEv + 113
> #6  8   step-1                              0x000000010145c552 main + 98
> #7  9   libdyld.dylib                       0x00007fff6795f015 start + 1
> --------------------------------------------------------
>
> Calling MPI_Abort now.
> To break execution in a GDB session, execute 'break MPI_Abort' before 
> running. You can also put the following into your ~/.gdbinit:
>   set breakpoint pending on
>   break MPI_Abort
>   set breakpoint pending auto
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 255.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
> Best
> praveen
>
> On 07-Nov-2018, at 10:42 PM, Ki-Tae Kim <qls...@gmail.com> wrote:
>
> #include <iostream>
>
> #include <deal.II/distributed/tria.h>
> #include <deal.II/dofs/dof_handler.h>
> #include <deal.II/fe/fe_q.h>
>
> #include <deal.II/base/mpi.h>
> #include <deal.II/base/utilities.h>
> #include <deal.II/base/conditional_ostream.h>
>
> #include <deal.II/base/index_set.h>
>
> using namespace dealii;
>
> class Problem
> {
> public:
>     Problem();
>     ~Problem();
>
>     void run();
>
> private:
>     MPI_Comm mpi_communicator;
>
>     parallel::distributed::Triangulation<3> triangulation;
>
>     DoFHandler<3> dof_handler;
>     FE_Q<3> fe;
>
>     ConditionalOStream pcout;
>
>     void make_domain_quater_crack();
> };
>
> Problem::Problem() :
> mpi_communicator (MPI_COMM_WORLD),
> triangulation (mpi_communicator,
>                typename Triangulation<3>::MeshSmoothing
>                        (Triangulation<3>::smoothing_on_refinement |
>                         Triangulation<3>::smoothing_on_coarsening)),
> dof_handler(triangulation),
> fe(3),
> pcout (std::cout,
>        (Utilities::MPI::this_mpi_process(mpi_communicator) == 0))
> {}
>
> Problem::~Problem()
> {
>     dof_handler.clear();
> }
>
> void Problem::make_domain_quater_crack() {
>     std::vector<Point<3>> vertices ={
>             {0.0,0.0,0.0},
>             {1.0,0.0,0.0},
>             {2.0,0.0,0.0},
>             {0.0,0.0,1.0},
>             {1.0,0.0,1.0},
>             {2.0,0.0,1.0},
>             {0.0,1.0,0.0},
>             {0.0,2.0,0.0},
>             {0.0,1.0,1.0},
>             {0.0,2.0,1.0},
>             {1.0,1.0,0.0},
>             {2.0,2.0,0.0},
>             {1.0,1.0,1.0},
>             {2.0,2.0,1.0},
>     };
>
>     const std::vector<std::array<int, GeometryInfo<3>::vertices_per_cell>>
>     cell_vertices = {
>             {0,1,6,10,3,4,8,12},
>             {1,2,10,11,4,5,12,13},
>             {10,11,6,7,12,13,8,9},
>     };
>
>     const unsigned int n_cells = cell_vertices.size();
>
>     std::vector<CellData<3>> cells(n_cells, CellData<3>());
>     for (unsigned int i = 0; i < n_cells; ++i)
>     {
>         for (unsigned int j = 0; j < GeometryInfo<3>::vertices_per_cell;
>              ++j)
>             cells[i].vertices[j] = cell_vertices[i][j];
>         cells[i].material_id = 0;
>     }
>
>     triangulation.create_triangulation(vertices, cells, SubCellData());
> }
>
> void Problem::run() {
>     make_domain_quater_crack();
>
>     unsigned int max_cycle=4;
>     for (unsigned int cycle(0); cycle<max_cycle; ++cycle){
>         triangulation.refine_global(1);
>
>         dof_handler.distribute_dofs (fe);
>
>         if (Utilities::MPI::this_mpi_process(mpi_communicator) == 0) {
>             const unsigned int n_active_cells = 
> triangulation.n_global_active_cells();
>             const unsigned int n_dofs = dof_handler.n_dofs();
>             pcout << "Cycle " << cycle << ':'
>                   << std::endl
>                   << "   Number of active cells:       "
>                   << n_active_cells
>                   << std::endl
>                   << "   Number of degrees of freedom: "
>                   << n_dofs
>                   << std::endl;
>         }
>         pcout << "=====================================" << std::endl;
>         pcout << std::endl;
>     }
> }
>
> int main(int argc, char *argv[])
> {
>     try
>     {
>         Utilities::MPI::MPI_InitFinalize mpi_initialization(argc, argv, 1);
>         Problem test;
>         test.run ();
>     }
>     catch (std::exception &exc)
>     {
>         std::cerr << std::endl << std::endl
>                   << "----------------------------------------------------"
>                   << std::endl;
>         std::cerr << "Exception on processing: " << std::endl
>                   << exc.what() << std::endl
>                   << "Aborting!" << std::endl
>                   << "----------------------------------------------------"
>                   << std::endl;
>         return 1;
>     }
>     catch (...)
>     {
>         std::cerr << std::endl << std::endl
>                   << "----------------------------------------------------"
>                   << std::endl;
>         std::cerr << "Unknown exception!" << std::endl
>                   << "Aborting!" << std::endl
>                   << "----------------------------------------------------"
>                   << std::endl;
>         return 1;
>     }
>     return 0;
> }
>
>
>

