Dear Bruno,

I have been reading the examples and documents you pointed out. I tried to 
use SolverGMRES with PreconditionILU. However, I am getting a runtime 
error that I cannot really understand when calling 
PETScWrappers::PreconditionILU preconditioner(system_matrix). It does, 
however, seem to work with PETScWrappers::PreconditionBlockJacobi 
<https://dealii.org/current/doxygen/deal.II/classPETScWrappers_1_1PreconditionBlockJacobi.html> 
preconditioner(system_matrix). The solver function and the error I get are 
the following:

void LaplaceProblem<dim>::solve()
{
  PETScWrappers::MPI::Vector completely_distributed_solution(
    locally_owned_dofs, mpi_communicator);

  SolverControl cn(completely_distributed_solution.size(),
                   1e-8 * system_rhs.l2_norm());

  PETScWrappers::SolverGMRES solver(cn, mpi_communicator);

  PETScWrappers::PreconditionILU preconditioner(system_matrix);

  solver.solve(system_matrix, completely_distributed_solution,
               system_rhs, preconditioner);

  constraints.distribute(completely_distributed_solution);

  locally_relevant_solution = completely_distributed_solution;
}


[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers
[0]PETSC ERROR: Could not locate a solver package for factorization type ILU and matrix type mpiaij.
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.13.1, May 02, 2020
[0]PETSC ERROR: ./waveLaplaceSolver on a  named gbarlogin1 by hsllo Fri Mar 11 11:05:23 2022
[0]PETSC ERROR: Configure options --prefix=/zhome/32/9/115503/dealii-candi/petsc-3.13.1 --with-debugging=0 --with-shared-libraries=1 --with-mpi=1 --with-x=0 --with-64-bit-indices=0 --download-hypre=1 CC=mpicc CXX=mpicxx FC=mpif90 --with-blaslapack-dir=/appl/OpenBLAS/0.3.17/XeonE5-2660v3/gcc-11.2.0/lib --with-parmetis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3 --with-metis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3 --download-scalapack=1 --download-mumps=1
[0]PETSC ERROR: #1 MatGetFactor() line 4492 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/mat/interface/matrix.c
[0]PETSC ERROR: #2 PCSetUp_ILU() line 133 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/impls/factor/ilu/ilu.c
[0]PETSC ERROR: #3 PCSetUp() line 894 in /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/interface/precon.c

----------------------------------------------------

Exception on processing:

--------------------------------------------------------

An error occurred in line <431> of file
</zhome/32/9/115503/dealii-candi/tmp/unpack/deal.II-v9.3.1/source/lac/petsc_precondition.cc>
in function

    void dealii::PETScWrappers::PreconditionILU::initialize(const
    dealii::PETScWrappers::MatrixBase&, const
    dealii::PETScWrappers::PreconditionILU::AdditionalData&)

The violated condition was:

    ierr == 0

Additional information:

    deal.II encountered an error while calling a PETSc function.

    The description of the error provided by PETSc is "See
    https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for
    possible LU and Cholesky solvers".

    The numerical value of the original error code is 92.
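
From the solver table linked in the error message, my guess (please correct 
me if I am wrong) is that PETSc's built-in ILU factorization only works on 
sequential (seqaij) matrices, so an mpiaij matrix would need an ILU from an 
external package. Since my configure options include --download-hypre=1, 
would runtime options along these lines be the right direction? (Untested 
on my side; as far as I understand, hypre's Euclid is a parallel ILU.)

```
# Untested guess: select hypre's parallel ILU (Euclid) at runtime
# instead of PETSc's built-in sequential ILU
-pc_type hypre
-pc_hypre_type euclid
```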





Could you please help me to understand what is happening?

Thank you again for your help.
Regards, 
H

On Friday, March 11, 2022 at 10:47:37 UTC+1, Hermes Sampedro wrote:

> Dear Bruno and Wolfgang, 
>
> thank you very much for your comments and help; they are very helpful. 
> Actually, I think that is what I am experiencing. When running my 
> current direct solver on a system with 15 elements per direction (4th 
> polynomial order, 0.5 million DoFs), the solver takes 50 seconds. 
> However, increasing to 30 elements per direction (3.5 million DoFs), the 
> solver takes 1.5 hours. I think this shows that it does not scale well in 
> terms of time, as you mentioned. I will definitely try the iterative 
> solver.
>
> Thank you again
> Regards, 
> H.
> On Thursday, March 10, 2022 at 17:17:47 UTC+1, Wolfgang Bangerth 
> wrote:
>
>> On 3/10/22 07:00, Hermes Sampedro wrote: 
>> > I am experiencing long computational times with the solver function. I am 
>> > trying to use DoFRenumbering::Cuthill_McKee(dof_handler) and 
>> > DoFRenumbering::boost::Cuthill_McKee(dof_handler, false, false), 
>> > but I get even higher computational times. Am I doing something wrong? 
>>
>> Renumbering makes an enormous difference for sparse direct solvers, and as 
>> a consequence all such solvers I know of do it internally (though they use 
>> variations of the "minimum degree" renumbering rather than Cuthill-McKee). 
>> Renumbering yourself therefore likely makes no difference. 
>>
>> But, as you discovered and as Bruno already pointed out, even with optimal 
>> ordering, direct solvers do not scale well, both in overall time and in 
>> parallelism. You may want to take a look at the several video lectures on 
>> solvers and preconditioners to see what you can do about your case. 
>>
>> Best 
>> W. 
>>
>> -- 
>> ------------------------------------------------------------------------ 
>> Wolfgang Bangerth email: [email protected] 
>> www: http://www.math.colostate.edu/~bangerth/ 
>>
>>

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en