Dear Bruno, thank you for your answer; it helped me solve the problem.

I had some issues with the Trilinos library, so I decided to implement step-29
in parallel (similar to step-40) using dealii::PETScWrappers. I get an
error in the solve() function, on the solver.solve(...) line:

template <int dim>
void UltrasoundProblem<dim>::solve()
{
  PETScWrappers::MPI::Vector completely_distributed_solution(locally_owned_dofs,
                                                             mpi_communicator);

  SolverControl cn;
  PETScWrappers::SparseDirectMUMPS solver(cn, mpi_communicator);

  solver.solve(system_matrix, completely_distributed_solution, system_rhs); // <-- error here

  constraints.distribute(completely_distributed_solution);
  locally_relevant_solution = completely_distributed_solution;
}

where the relevant members are declared as:

      dealii::PETScWrappers::SparseMatrix system_matrix;
      dealii::PETScWrappers::MPI::Vector  locally_relevant_solution;
      dealii::PETScWrappers::MPI::Vector  system_rhs;

[0]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------

[0]PETSC ERROR: Nonconforming object sizes

[0]PETSC ERROR: Preconditioner number of local rows 51842 does not equal
resulting vector number of rows 26114

[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
for trouble shooting.

[0]PETSC ERROR: Petsc Release Version 3.13.1, May 02, 2020

[0]PETSC ERROR: ./step-29 on a  named gbarlogin1 by hsllo Tue Sep 14
11:11:04 2021

[0]PETSC ERROR: Configure options
--prefix=/zhome/32/9/115503/dealii-candi/petsc-3.13.1 --with-debugging=0
--with-shared-libraries=1 --with-mpi=1 --with-x=0 --with-64-bit-indices=0
--download-hypre=1 CC=mpicc CXX=mpicxx FC=mpif90
--with-blaslapack-dir=/appl/OpenBLAS/0.3.17/XeonE5-2660v3/gcc-11.2.0/lib
--with-parmetis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3
--with-metis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3
--download-scalapack=1 --download-mumps=1

[0]PETSC ERROR: #1 PCApply() line 436 in
/zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/interface/precon.c

[0]PETSC ERROR: #2 KSP_PCApply() line 281 in
/zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/include/petsc/private/kspimpl.h

[0]PETSC ERROR: #3 KSPSolve_PREONLY() line 22 in
/zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/ksp/impls/preonly/preonly.c

[0]PETSC ERROR: #4 KSPSolve_Private() line 694 in
/zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/ksp/interface/itfunc.c

[0]PETSC ERROR: #5 KSPSolve() line 853 in
/zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/ksp/interface/itfunc.c



----------------------------------------------------

Exception on processing:


--------------------------------------------------------

An error occurred in line <807> of file
</zhome/32/9/115503/dealii-candi/tmp/unpack/deal.II-v9.3.1/source/lac/petsc_solver.cc>
in function

    void dealii::PETScWrappers::SparseDirectMUMPS::solve(const
dealii::PETScWrappers::MatrixBase&, dealii::PETScWrappers::VectorBase&,
const dealii::PETScWrappers::VectorBase&)

The violated condition was:

    ierr == 0

Additional information:

    deal.II encountered an error while calling a PETSc function.

    The description of the error provided by PETSc is "Nonconforming
    object sizes".

    The numerical value of the original error code is 60.

--------------------------------------------------------


Aborting!

---------------------




Should I use dealii::PETScWrappers::MPI::SparseMatrix for system_matrix
instead? If so, could you please help me with the reinit() function? I do
not fully understand how to call it.
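For what it is worth, my setup_system() (quoted below) already calls reinit()
with (locally_owned_dofs, locally_owned_dofs, dsp, mpi_communicator), so my
guess, which is only a sketch and untested, is that just the declaration needs
to change:

```cpp
// Sketch only -- my untested guess at the distributed declaration:
dealii::PETScWrappers::MPI::SparseMatrix system_matrix;

// reinit() as already called in setup_system():
system_matrix.reinit(locally_owned_dofs,   // rows owned by this rank
                     locally_owned_dofs,   // columns owned by this rank
                     dsp,                  // distributed sparsity pattern
                     mpi_communicator);
```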


Thank you

Regards,

H

On Fri, Sept 10, 2021 at 14:56, Bruno Turcksin (<[email protected]>)
wrote:

> Hermes,
>
> There is no recommended way to write a plain .txt file, it's left to the
> user to write their own function. The reason deal.II provides a function to
> write a .vtu file is that you need to use a very specific format. There is
> no format that you need to follow when writing a .txt file.
>
> Best,
>
> Bruno
>
> On Friday, September 10, 2021 at 5:58:45 AM UTC-4 [email protected]
> wrote:
>
>> Dear Bruno,
>>
>> Thank you very much for your help.
>>
>> I would kindly like to ask what the recommended way in deal.II is to write
>> the solution at a single point to a plain .txt file when using MPI. I solve
>> for different frequencies and would like to write the solution after each
>> iteration. In the non-parallel version it works as follows:
>>
>> ofstream myfile;
>> myfile.open("solution.txt");
>>
>> for (unsigned int freq = 0; freq < 10; ++freq)
>>   {
>>     setup_system();
>>     assemble_system(freq);
>>     solve();
>>     output_results(myfile);
>>   }
>>
>> myfile.close();
>>
>>
>>
>> The function output_results(myfile) takes care of writing the solution at
>> a single point to a plain .txt file for each frequency.
>>
>> Step-40 shows how to write files in .vtu format when using MPI, but I did
>> not find any function to write the solution as plain text.
>>
>> I would appreciate any suggestions on how to do that using deal.II.
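>> One pattern I am considering (just my own sketch, nothing from the deal.II
>> API itself; this_mpi_rank is a hypothetical stand-in for
>> Utilities::MPI::this_mpi_process(mpi_communicator)) is to let a single rank
>> do all the writing:

```cpp
#include <ostream>

// Sketch: a single rank (here rank 0) writes the point value to the
// stream; all other ranks do nothing, so each line is written only once.
void output_point_value(const unsigned int this_mpi_rank,
                        const double       freq,
                        const double       value,
                        std::ostream      &out)
{
  if (this_mpi_rank == 0)
    out << freq << ' ' << value << '\n';
}
```

>> In output_results() I would then pass the current rank and the evaluated
>> point value to such a helper.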
>>
>>
>> Thank you again.
>>
>> Regards,
>>
>> H.
>>
>>
>>
>> On Thu, Sept 9, 2021 at 14:36, Bruno Turcksin (<[email protected]>)
>> wrote:
>>
>>> Hermes,
>>>
>>> The following line is wrong:
>>> DynamicSparsityPattern dsp(locally_relevant_dofs.n_elements(),
>>>                            locally_relevant_dofs.n_elements());
>>>
>>> This constructor says that you want a sparsity pattern that has
>>> locally_relevant_dofs.n_elements() elements. This is not the case. You
>>> want a sparsity pattern that has as many elements as the total number of
>>> dofs. Also, since you want to use MPI, you need to provide a third
>>> argument, the locally_owned IndexSet.
>>>
>>> You can also simply use:
>>> DynamicSparsityPattern dsp(locally_relevant_dofs);
>>>
>>> Best,
>>>
>>> Bruno
>>>
>>> On Thursday, September 9, 2021 at 5:29:56 AM UTC-4 [email protected]
>>> wrote:
>>>
>>>> Dear all,
>>>>
>>>> I adapted step-29 to run in parallel, similar as step-40. With 1 MPI
>>>> rank it works, however, when using more than 1 rank I  get a running error
>>>> (attached is the full output)
>>>>
>>>> An error occurred in line <74> of file
>>>> </var/folders/8z/hlb6vc015qjggytkxn84m6_c0000gn/T/heltai/spack-stage/spack-stage-dealii-9.3.0-zy7k3uwnakcqjvrajvacy5l4jrl7eaex/spack-src/source/dofs/dof_tools_sparsity.cc>
>>>> in function
>>>>
>>>>     void dealii::DoFTools::make_sparsity_pattern(const DoFHandler<dim,
>>>> spacedim> &, SparsityPatternType &, const AffineConstraints<number> &,
>>>> const bool, const types::subdomain_id) [dim = 2, spacedim = 2,
>>>> SparsityPatternType = dealii::DynamicSparsityPattern, number = double]
>>>>
>>>> The violated condition was:
>>>>
>>>>     sparsity.n_rows() == n_dofs
>>>>
>>>> Additional information:
>>>>
>>>>     Dimension 26752 not equal to 51842
>>>>
>>>> I found out that the problem is in the setup_system() function. The
>>>> problematic line is marked with "// THIS" below. Could you please help
>>>> me figure out the issue?
>>>>
>>>>
>>>> template <int dim>
>>>> void UltrasoundProblem<dim>::setup_system()
>>>> {
>>>>   deallog << "Setting up system... ";
>>>>   deallog << "OK1... ";
>>>>
>>>>   dof_handler.distribute_dofs(fe);
>>>>
>>>>   locally_owned_dofs = dof_handler.locally_owned_dofs();
>>>>   DoFTools::extract_locally_relevant_dofs(dof_handler,
>>>>                                           locally_relevant_dofs);
>>>>
>>>>   locally_relevant_solution.reinit(locally_owned_dofs,
>>>>                                    locally_relevant_dofs,
>>>>                                    mpi_communicator);
>>>>   system_rhs.reinit(locally_owned_dofs, mpi_communicator);
>>>>
>>>>   constraints.clear();
>>>>   constraints.reinit(locally_relevant_dofs);
>>>>   DoFTools::make_hanging_node_constraints(dof_handler, constraints);
>>>>   VectorTools::interpolate_boundary_values(dof_handler,
>>>>                                            1,
>>>>                                            DirichletBoundaryValues<dim>(),
>>>>                                            constraints);
>>>>   constraints.close();
>>>>
>>>>   DynamicSparsityPattern dsp(locally_relevant_dofs.n_elements(),
>>>>                              locally_relevant_dofs.n_elements());
>>>>   DoFTools::make_sparsity_pattern(dof_handler, dsp, constraints, false); // THIS
>>>>
>>>>   SparsityTools::distribute_sparsity_pattern(dsp,
>>>>                                              dof_handler.locally_owned_dofs(),
>>>>                                              mpi_communicator,
>>>>                                              locally_relevant_dofs);
>>>>   system_matrix.reinit(locally_owned_dofs, locally_owned_dofs, dsp,
>>>>                        mpi_communicator);
>>>> }
>>>>
>>>>
>>>> Thank you very much
>>>>
>>>> H.
>>>>
>>>>
>>>>
>>>> --
>>> The deal.II project is located at http://www.dealii.org/
>>> For mailing list/forum options, see
>>> https://groups.google.com/d/forum/dealii?hl=en
>>> ---
>>> You received this message because you are subscribed to a topic in the
>>> Google Groups "deal.II User Group" group.
>>> To unsubscribe from this topic, visit
>>> https://groups.google.com/d/topic/dealii/r3NGr6TnxXs/unsubscribe.
>>> To unsubscribe from this group and all its topics, send an email to
>>> [email protected].
>>> To view this discussion on the web visit
>>> https://groups.google.com/d/msgid/dealii/f621e3ee-3bce-49b3-9151-1b5a3e4edc55n%40googlegroups.com
>>> <https://groups.google.com/d/msgid/dealii/f621e3ee-3bce-49b3-9151-1b5a3e4edc55n%40googlegroups.com?utm_medium=email&utm_source=footer>
>>> .
>>>
