Dear Rajat,

Unfortunately, your question does not provide enough information for anyone to help you. Although the error is reported by the solver, its source could lie in one of several places. Please see this post <https://groups.google.com/forum/#!topic/dealii/GRZMUTLIm2I> for more information on this point. It would be best to provide a minimal working example that replicates the issue. Also note that you are using quite an old version of deal.II, and the root of the problem may already have been fixed in a later release.
Regards,
Jean-Paul

On Tuesday, July 12, 2016 at 3:38:54 AM UTC+2, RAJAT ARORA wrote:
>
> Hello all,
>
> I have recently been encountering an issue with the MUMPS solver.
>
> The error comes from the dealii::PETScWrappers::SparseDirectMUMPS
> class in these lines:
>
> dealii::PETScWrappers::SparseDirectMUMPS solver(solver_control,
>                                                 mpi_communicator);
>
> solver.solve(system_matrix,
>              locally_owned_solution,
>              system_rhs); // error while solving
>
> The error that is printed is:
>
> ERR: ERROR : NBROWS > NBROWF
> ERR: INODE = 15525
> ERR: NBROW= 1 NBROWF= 0
> ERR: ROW_LIST= 2
> application called MPI_Abort(MPI_COMM_WORLD, -99) - process 0
> [cli_0]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, -99) - process 0
> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
> [cli_0]: aborting job:
> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
>
> My code uses MPI-based parallelism, so it uses
> parallel::distributed::Triangulation.
>
> I am using deal.II 8.3.0, PETSc 3.6.0, and p4est 1.1.
>
> Any help will be appreciated.
> Thanks.

--
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see https://groups.google.com/d/forum/dealii?hl=en
---
You received this message because you are subscribed to the Google Groups "deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
For more options, visit https://groups.google.com/d/optout.
