Re: [deal.II] Error when compiling a deal.II app using Trilinos on Apple M1

2022-03-14 Thread Timo Heister
I worked on that. 

Interestingly, I didn't see this error in my testing of ASPECT. Are
you saying that a deal.II step like step-32 fails? I think we should
try updating Trilinos first.

Do you want to try 13.2 and see if that works? If not, I will update
the package in candi master when I get to it in a couple of days.
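
For context: std::bind1st was deprecated in C++11 and removed in C++17, so
once the compiler is run in C++17 mode (as a recent Apple clang will be when
building deal.II master) that Tpetra header no longer compiles; newer
Trilinos releases should not have this problem. If you would rather patch the
installed Tpetra_Import_def.hpp by hand than wait, the usual drop-in
replacement is a lambda. A rough sketch of the idiom only, not the exact
Tpetra code (the names below are made up):

  #include <algorithm>
  #include <functional>
  #include <vector>

  int main()
  {
    std::vector<int> proc_ids = {3, -1, 7};

    // Pre-C++17 style, as still used in Trilinos 12.18.1 (rejected by C++17):
    //   std::find_if(proc_ids.begin(), proc_ids.end(),
    //                std::bind1st(std::equal_to<int>(), -1));

    // C++17-friendly equivalent:
    const auto it = std::find_if(proc_ids.begin(), proc_ids.end(),
                                 [](const int p) { return p == -1; });

    return (it != proc_ids.end()) ? 0 : 1;
  }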


On Mon, Mar 14, 2022, 13:25 blais...@gmail.com  wrote:
>
> Dear all,
> Hope you are well.
> Using the candi master branch, I succeeded in compiling deal.II using candi. 
> Everything was smooth. I don't know who took care of that, but awesome work.
>
> When trying to compile an example that uses Trilinos (any), I get the
> following error, which arises from an include in Trilinos:
>
> /Users/blaisb/work/candi/trilinos-release-12-18-1/include/Tpetra_Import_def.hpp:1171:24:
> error: no member named 'bind1st' in namespace 'std'; did you mean
> 'boost::container::bind1st'?
>
>std::bind1st (std::equal_to (), -1));
>
>^
>
> /Users/blaisb/work/candi/deal.II-master/include/deal.II/bundled/boost/container/detail/algorithm.hpp:55:24:
>  note: 'boost::container::bind1st' declared here
>
> inline binder1st bind1st(const Func& func, const T& arg)
>
>
>
> Is there a way around this error? Should I try to fix it manually or switch
> to another Trilinos version?
>
> Thanks!
>
> Bruno



[deal.II] Error when compiling a deal.II app using Trilinos on Apple M1

2022-03-14 Thread blais...@gmail.com
Dear all,
Hope you are well.
Using the candi master branch, I succeeded in compiling deal.II using 
candi. Everything was smooth. I don't know who took care of that, but 
awesome work.

When trying to compile an example that uses Trilinos (any), I get the
following error, which arises from an include in Trilinos:

/Users/blaisb/work/candi/trilinos-release-12-18-1/include/Tpetra_Import_def.hpp:1171:24:
error: no member named 'bind1st' in namespace 'std'; did you mean
'boost::container::bind1st'?

   std::bind1st (std::equal_to (), -1));
   ^

/Users/blaisb/work/candi/deal.II-master/include/deal.II/bundled/boost/container/detail/algorithm.hpp:55:24:
note: 'boost::container::bind1st' declared here

inline binder1st bind1st(const Func& func, const T& arg)



Is there a way around this error? Should I try to fix it manually or switch
to another Trilinos version?

Thanks!

Bruno




Re: [deal.II] Re: Assemble function, long time

2022-03-14 Thread Bruno Turcksin
Hermes,

Sorry, I don't use PETSc. Maybe someone else can help you.

Best,

Bruno

On Mon, Mar 14, 2022 at 05:42, Hermes Sampedro wrote:

> Dear Bruno,
>
> I have been reading the examples and documents you pointed out. I tried to
> use SolverGMRES with PreconditionILU. However, I am getting a runtime
> error that I cannot really understand when calling
> PETScWrappers::PreconditionILU preconditioner(system_matrix), while it
> seems to work with PETScWrappers::PreconditionBlockJacobi
> preconditioner(system_matrix). The solver function and the error that I get
> are the following:
>
> void LaplaceProblem::solve()
> {
>   PETScWrappers::MPI::Vector completely_distributed_solution(
>     locally_owned_dofs, mpi_communicator);
>
>   SolverControl cn(completely_distributed_solution.size(),
>                    1e-8 * system_rhs.l2_norm());
>   PETScWrappers::SolverGMRES solver(cn, mpi_communicator);
>   PETScWrappers::PreconditionILU preconditioner(system_matrix);
>
>   solver.solve(system_matrix, completely_distributed_solution,
>                system_rhs, preconditioner);
>
>   constraints.distribute(completely_distributed_solution);
>   locally_relevant_solution = completely_distributed_solution;
> }
>
>
> [0]PETSC ERROR: - Error Message
> --
>
> [0]PETSC ERROR: See
> https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for
> possible LU and Cholesky solvers
>
> [0]PETSC ERROR: Could not locate a solver package for factorization type
> ILU and matrix type mpiaij.
>
> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
>
> [0]PETSC ERROR: Petsc Release Version 3.13.1, May 02, 2020
>
> [0]PETSC ERROR: ./waveLaplaceSolver on a  named gbarlogin1 by hsllo Fri
> Mar 11 11:05:23 2022
>
> [0]PETSC ERROR: Configure options
> --prefix=/zhome/32/9/115503/dealii-candi/petsc-3.13.1 --with-debugging=0
> --with-shared-libraries=1 --with-mpi=1 --with-x=0 --with-64-bit-indices=0
> --download-hypre=1 CC=mpicc CXX=mpicxx FC=mpif90
> --with-blaslapack-dir=/appl/OpenBLAS/0.3.17/XeonE5-2660v3/gcc-11.2.0/lib
> --with-parmetis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3
> --with-metis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3
> --download-scalapack=1 --download-mumps=1
>
> [0]PETSC ERROR: #1 MatGetFactor() line 4492 in
> /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/mat/interface/matrix.c
>
> [0]PETSC ERROR: #2 PCSetUp_ILU() line 133 in
> /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/impls/factor/ilu/ilu.c
>
> [0]PETSC ERROR: #3 PCSetUp() line 894 in
> /zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/interface/precon.c
>
> 
>
> Exception on processing:
>
> 
>
> An error occurred in line <431> of file
> 
> in function
>
> void dealii::PETScWrappers::PreconditionILU::initialize(const
> dealii::PETScWrappers::MatrixBase&, const
> dealii::PETScWrappers::PreconditionILU::AdditionalData&)
>
> The violated condition was:
>
> ierr == 0
>
> Additional information:
>
> deal.II encountered an error while calling a PETSc function.
>
> The description of the error provided by PETSc is "See
>
> https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for
>
> possible LU and Cholesky solvers".
>
> The numerical value of the original error code is 92.
>
>
>
>
>
> Could you please help me to understand what is happening?
>
> Thank you again for your help
> Regards,
> H
> On Friday, March 11, 2022 at 10:47:37 UTC+1, Hermes Sampedro wrote:
>
>> Dear Bruno and Wolfgang,
>>
>> Thank you very much for your comments and help; they are very helpful.
>> Actually, I think that is what I am experiencing. When running my current
>> direct solver on a system with 15 elements per direction (4th-order
>> polynomials, 0.5 million DoFs), the solver takes 50 seconds. However,
>> increasing to 30 elements per direction (3.5 million DoFs), the solver
>> takes 1.5 hours. I think this shows that it does not scale well in terms
>> of time, as you mentioned. I will definitely try the iterative solver.
>>
>> Thank you again
>> Regards,
>> H.
>> On Thursday, March 10, 2022 at 17:17:47 UTC+1, Wolfgang Bangerth wrote:
>>
>>> On 3/10/22 07:00, Hermes Sampedro wrote:
>>> > I am experiencing long computational times with the solver function.
>>> I am
>>> > trying to use DoFRenumbering::Cuthill_McKee(dof_handler),
>>> > DoFRenumbering::boost::Cuthill_McKee(dof_handler,false,false)
>>> > but I get even higher computational times. Am I doing something wrong?
>>>
>>> Renumbering makes an enormous difference for sparse direct solvers, and
>>> as a
>>> 

Re: [deal.II] Re: Assemble function, long time

2022-03-14 Thread Hermes Sampedro
 

Dear Bruno,

I have been reading the examples and documents you pointed out. I tried to
use SolverGMRES with PreconditionILU. However, I am getting a runtime
error that I cannot really understand when calling
PETScWrappers::PreconditionILU preconditioner(system_matrix), while it
seems to work with PETScWrappers::PreconditionBlockJacobi
preconditioner(system_matrix). The solver function and the error that I get
are the following:

void LaplaceProblem::solve()
{
  PETScWrappers::MPI::Vector completely_distributed_solution(
    locally_owned_dofs, mpi_communicator);

  SolverControl cn(completely_distributed_solution.size(),
                   1e-8 * system_rhs.l2_norm());
  PETScWrappers::SolverGMRES solver(cn, mpi_communicator);
  PETScWrappers::PreconditionILU preconditioner(system_matrix);

  solver.solve(system_matrix, completely_distributed_solution,
               system_rhs, preconditioner);

  constraints.distribute(completely_distributed_solution);
  locally_relevant_solution = completely_distributed_solution;
}


[0]PETSC ERROR: - Error Message 
--

[0]PETSC ERROR: See 
https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for 
possible LU and Cholesky solvers

[0]PETSC ERROR: Could not locate a solver package for factorization type 
ILU and matrix type mpiaij.

[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html 
for trouble shooting.

[0]PETSC ERROR: Petsc Release Version 3.13.1, May 02, 2020 

[0]PETSC ERROR: ./waveLaplaceSolver on a  named gbarlogin1 by hsllo Fri Mar 
11 11:05:23 2022

[0]PETSC ERROR: Configure options 
--prefix=/zhome/32/9/115503/dealii-candi/petsc-3.13.1 --with-debugging=0 
--with-shared-libraries=1 --with-mpi=1 --with-x=0 --with-64-bit-indices=0 
--download-hypre=1 CC=mpicc CXX=mpicxx FC=mpif90 
--with-blaslapack-dir=/appl/OpenBLAS/0.3.17/XeonE5-2660v3/gcc-11.2.0/lib 
--with-parmetis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3 
--with-metis-dir=/zhome/32/9/115503/dealii-candi/parmetis-4.0.3 
--download-scalapack=1 --download-mumps=1

[0]PETSC ERROR: #1 MatGetFactor() line 4492 in 
/zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/mat/interface/matrix.c

[0]PETSC ERROR: #2 PCSetUp_ILU() line 133 in 
/zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/impls/factor/ilu/ilu.c

[0]PETSC ERROR: #3 PCSetUp() line 894 in 
/zhome/32/9/115503/dealii-candi/tmp/build/petsc-lite-3.13.1/src/ksp/pc/interface/precon.c



Exception on processing: 



An error occurred in line <431> of file 

 
in function

void dealii::PETScWrappers::PreconditionILU::initialize(const 
dealii::PETScWrappers::MatrixBase&, const 
dealii::PETScWrappers::PreconditionILU::AdditionalData&)

The violated condition was: 

ierr == 0

Additional information: 

deal.II encountered an error while calling a PETSc function.

The description of the error provided by PETSc is "See

https://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for

possible LU and Cholesky solvers".

The numerical value of the original error code is 92.
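
Reading the linked linear-solver table, my guess is that PETSc's built-in
ILU factorization only works for sequential (seqaij) matrices and not for
the parallel mpiaij type, which would also explain why
PreconditionBlockJacobi works while PreconditionILU does not. If that is
right, a parallel-capable preconditioner such as hypre's BoomerAMG (my
PETSc was configured with --download-hypre=1) might be worth trying. An
untested sketch along the lines of step-40, with the rest of solve()
unchanged:

  PETScWrappers::PreconditionBoomerAMG::AdditionalData data;
  data.symmetric_operator = false; // assuming my operator is not symmetric
  PETScWrappers::PreconditionBoomerAMG preconditioner;
  preconditioner.initialize(system_matrix, data);

  solver.solve(system_matrix, completely_distributed_solution,
               system_rhs, preconditioner);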





Could you please help me to understand what is happening?

Thank you again for your help
Regards, 
H
On Friday, March 11, 2022 at 10:47:37 UTC+1, Hermes Sampedro wrote:

> Dear Bruno and Wolfgang, 
>
> Thank you very much for your comments and help; they are very helpful.
> Actually, I think that is what I am experiencing. When running my current
> direct solver on a system with 15 elements per direction (4th-order
> polynomials, 0.5 million DoFs), the solver takes 50 seconds. However,
> increasing to 30 elements per direction (3.5 million DoFs), the solver
> takes 1.5 hours. I think this shows that it does not scale well in terms
> of time, as you mentioned. I will definitely try the iterative solver.
>
> Thank you again
> Regards, 
> H.
> On Thursday, March 10, 2022 at 17:17:47 UTC+1, Wolfgang Bangerth wrote:
>
>> On 3/10/22 07:00, Hermes Sampedro wrote: 
>> > I am experiencing long computational times with the solver function.  I 
>> am 
>> > trying to use DoFRenumbering::Cuthill_McKee(dof_handler), 
>> > DoFRenumbering::boost::Cuthill_McKee(dof_handler,false,false) 
>> > but I get even higher computational times. Am I doing something wrong? 
>>
>> Renumbering makes an enormous difference for sparse direct solvers, and 
>> as a 
>> consequence all such solvers I know of do it internally (though they use 
>> variations of the "minimum degree" renumbering, rather than 
>> Cuthill-McKee). As 
>> a consequence, renumbering yourself likely makes no difference. 
>>
>> But, as you discover and as Bruno already pointed out,