Thanks to Jean-Paul and Luca. This works perfectly.

Please delete my duplicate post on this topic.

Best
praveen

On Fri, Jun 24, 2016 at 2:25 PM, luca.heltai <[email protected]> wrote:

> What J-P is suggesting is the following:
>
> Have a
>
> std_cxx11::shared_ptr<TrilinosWrappers::SolverDirect> solver
>
> member in your class, then
>
>
> if (!solver)
>   {
>     solver = std_cxx11::shared_ptr<TrilinosWrappers::SolverDirect>
>                (new TrilinosWrappers::SolverDirect (…));
>     solver->initialize (mass_matrix);
>   }
>
> This will be done only once, and the solver will be destroyed correctly,
> avoiding MPI issues on exit.
>
> L.
>
>
> > On 24 Jun 2016, at 9:10, Jean-Paul Pelteret <[email protected]>
> wrote:
> >
> > Praveen,
> >
> > I think that the problem is that you have an object containing MPI data
> (such as an MPI communicator) that is still in existence at the time that
> MPI_Finalize is called (generally by the Utilities::MPI::MPI_InitFinalize
> object one creates in the main function). You need to ensure that your
> solver object is destroyed before the MPI_InitFinalize object goes out of
> scope. What you could do is have a (smart) pointer to your solver in your
> main class, create and initialise it when (first_time == true), and then it
> gets destroyed when the class destructor is called.
> >
> > J-P
> >
> > On Friday, June 24, 2016 at 8:38:59 AM UTC+2, Praveen C wrote:
> > Dear all
> >
> > I am using the new TrilinosWrappers::SolverDirect, which allows a two-step
> solution: initialize and solve. Since I want to compute the LU decomposition
> only once, I do something like this
> >
> >       static int first_time = 1;
> >
> >       static TrilinosWrappers::SolverDirect::AdditionalData data (false,
> "Amesos_Mumps");
> >       static SolverControl solver_control (1, 0);
> >       static TrilinosWrappers::SolverDirect direct (solver_control,
> data);
> >
> >       // If it is first time, compute LU decomposition
> >       if(first_time)
> >       {
> >          pcout << "Performing LU decomposition\n";
> >          direct.initialize (mass_matrix);
> >          first_time = 0;
> >       }
> >
> >       // solve for ax
> >       {
> >          TrilinosWrappers::MPI::Vector tmp (locally_owned_dofs,
> mpi_communicator);
> >          direct.solve (tmp, rhs_ax);
> >          constraints.distribute (tmp);
> >          ax = tmp;
> >       }
> >
> > This works, but when the program finishes, I get this message.
> >
> >
> > *** The MPI_Comm_f2c() function was called after MPI_FINALIZE was
> invoked.
> > *** This is disallowed by the MPI standard.
> > *** Your MPI job will now abort.
> > [(null):4853] Local abort after MPI_FINALIZE completed successfully; not
> able to aggregate error messages, and not able to guarantee that all other
> processes were killed!
> > *** The MPI_Comm_f2c() function was called after MPI_FINALIZE was
> invoked.
> > *** This is disallowed by the MPI standard.
> > *** Your MPI job will now abort.
> > [(null):4854] Local abort after MPI_FINALIZE completed successfully; not
> able to aggregate error messages, and not able to guarantee that all other
> processes were killed!
> > -------------------------------------------------------
> > Primary job  terminated normally, but 1 process returned
> > a non-zero exit code.. Per user-direction, the job has been aborted.
> > -------------------------------------------------------
> >
> --------------------------------------------------------------------------
> > mpirun detected that one or more processes exited with non-zero status,
> thus causing
> > the job to be terminated. The first process to do so was:
> >
> >   Process name: [[14617,1],0]
> >   Exit code:    1
> > --------------------------------------------------------------------------
> >
> > Removing “static” for the solver and recomputing it every time gets rid
> of this error. So I am guessing it is the “static” qualifier that is
> causing the problem. How can I fix this issue?
> >
> > Thanks
> > praveen
> >
> >
> > --
> > The deal.II project is located at http://www.dealii.org/
> > For mailing list/forum options, see
> https://groups.google.com/d/forum/dealii?hl=en
> > ---
> > You received this message because you are subscribed to the Google
> Groups "deal.II User Group" group.
> > To unsubscribe from this group and stop receiving emails from it, send
> an email to [email protected].
> > For more options, visit https://groups.google.com/d/optout.
>
>
