Dear DuMux

Thanks a lot for your quick answer.
I would like to ask for your confirmation regarding SuperLU: I receive a warning 
each time I use the AMG backend, because it is being used without SuperLU.
We would like to install SuperLU; however, we found three versions (sequential, 
shared-memory parallel, and distributed-memory parallel).
As we perform parallel computations, we assume that we should use SuperLU_DIST.
Could you please confirm our assumption? Is SuperLU_DIST compatible with DuMux?

We wish you and the whole DuMux team a great, prosperous, healthy, and extremely 
happy New Year 2014.
Best Regards
Ali NOWAMOOZ


From: Bernd Flemisch [mailto:[email protected]]
Sent: Monday, 16 December 2013 02:03
To: [email protected]; NOWAMOOZ Ali
Subject: Re: [DuMuX] Error using Alu2dGrid and AMG solver

Hi Ali,

The AMGBackend only works for parallel grids, namely UG, Yasp, and 3d-Alu. It 
does not matter whether you actually run it sequentially or in parallel.

For all sequential runs, and particularly for all sequential grids, you can use 
SeqAMGBackend by changing the LinearSolver property in your problem file.
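For illustration, here is a minimal sketch of such a property change (the type 
tag name OnePTestProblem and the exact macro usage are assumptions based on the 
DuMux 2.x property system and the 1p test; adapt them to your problem):

    // in the problem file, e.g. 1ptestproblem.hh
    #include <dumux/linear/amgbackend.hh>

    namespace Dumux {
    namespace Properties {
    // switch the linear solver from the parallel AMGBackend
    // to the sequential variant
    SET_TYPE_PROP(OnePTestProblem, LinearSolver, Dumux::SeqAMGBackend<TypeTag>);
    } // namespace Properties
    } // namespace Dumux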

Kind regards
Bernd


[email protected]<mailto:[email protected]> schrieb:
Dear DuMux,

In the 1p test with AMG, I tried to use the 2D ALUGrid (Dune::ALUGrid<2, 2, 
Dune::cube, Dune::conforming> type;) with the AMG backend as linear solver, but 
I received an error message (please see below or the attached file for the 
complete terminal output). I had the same problem with typedef Dune::SGrid<2, 2> 
type;. I did not receive this error when I used typedef Dune::YaspGrid<2> type; 
or in 3D; the error occurs only with the 2D ALUGrid and SGrid.
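
For reference, a sketch of the grid settings involved (assuming the usual 
SET_PROP placement in the 1p test problem file; the type tag name 
OnePTestProblem is only an assumption based on the test):

    #include <dune/grid/alugrid.hh>
    #include <dune/grid/sgrid.hh>
    #include <dune/grid/yaspgrid.hh>

    // in 1ptestproblem.hh, inside the Grid property of the test's type tag
    SET_PROP(OnePTestProblem, Grid)
    {
        // fails to compile with the AMG backend:
        typedef Dune::ALUGrid<2, 2, Dune::cube, Dune::conforming> type;
        // also fails:  typedef Dune::SGrid<2, 2> type;
        // works:       typedef Dune::YaspGrid<2> type;
    };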

Could you please help me with this problem?

Best regards
Ali NOWAMOOZ

test_box1pwithamg.cc:69:58:   required from here
/rap/jda-332-aa/apps/dumux-2.4.0-2/dune-pdelab-1.1.0/dune/pdelab/backend/novlpistlsolverbackend.hh:1324:77:
 error: no matching function for call to 
'Dune::OwnerOverlapCopyCommunication<Dune::bigunsignedint<96>, 
int>::OwnerOverlapCopyCommunication(const CollectiveCommunication&, 
Dune::SolverCategory::Category)'
/rap/jda-332-aa/apps/dumux-2.4.0-2/dune-pdelab-1.1.0/dune/pdelab/backend/novlpistlsolverbackend.hh:1324:77:
 note: candidates are:
In file included from 
/rap/jda-332-aa/apps/dumux-2.4.0-2/dune-pdelab-1.1.0/dune/pdelab/backend/novlpistlsolverbackend.hh:15:0,
                 from ../../../dumux/linear/amgbackend.hh:30,
                 from 1ptestproblem.hh:47,
                 from test_box1pwithamg.cc:31:
/rap/jda-332-aa/apps/dumux-2.4.0-2/dune-istl-2.2.1/dune/istl/owneroverlapcopy.hh:668:5:
 note: Dune::OwnerOverlapCopyCommunication<T1, 
T2>::OwnerOverlapCopyCommunication(const 
Dune::OwnerOverlapCopyCommunication<T1, T2>&) [with GlobalIdType = 
Dune::bigunsignedint<96>; LocalIdType = int; 
Dune::OwnerOverlapCopyCommunication<T1, T2> = 
Dune::OwnerOverlapCopyCommunication<Dune::bigunsignedint<96>, int>]
/rap/jda-332-aa/apps/dumux-2.4.0-2/dune-istl-2.2.1/dune/istl/owneroverlapcopy.hh:668:5:
 note:   candidate expects 1 argument, 2 provided
/rap/jda-332-aa/apps/dumux-2.4.0-2/dune-istl-2.2.1/dune/istl/owneroverlapcopy.hh:589:2:
 note: Dune::OwnerOverlapCopyCommunication<T1, 
T2>::OwnerOverlapCopyCommunication(const Dune::IndexInfoFromGrid<GlobalIdType, 
LocalIdType>&, MPI_Comm, Dune::SolverCategory::Category, bool) [with 
GlobalIdType = Dune::bigunsignedint<96>; LocalIdType = int; MPI_Comm = 
ompi_communicator_t*]
/rap/jda-332-aa/apps/dumux-2.4.0-2/dune-istl-2.2.1/dune/istl/owneroverlapcopy.hh:589:2:
 note:   no known conversion for argument 1 from 'const CollectiveCommunication 
{aka const Dune::CollectiveCommunication<Dune::ALU2dGrid<2, 2, 
(ALU2DGrid::ElementType)1u> >}' to 'const 
Dune::IndexInfoFromGrid<Dune::bigunsignedint<96>, int>&'
/rap/jda-332-aa/apps/dumux-2.4.0-2/dune-istl-2.2.1/dune/istl/owneroverlapcopy.hh:575:5:
 note: Dune::OwnerOverlapCopyCommunication<T1, 
T2>::OwnerOverlapCopyCommunication(Dune::SolverCategory::Category) [with 
GlobalIdType = Dune::bigunsignedint<96>; LocalIdType = int]
/rap/jda-332-aa/apps/dumux-2.4.0-2/dune-istl-2.2.1/dune/istl/owneroverlapcopy.hh:575:5:
 note:   candidate expects 1 argument, 2 provided
/rap/jda-332-aa/apps/dumux-2.4.0-2/dune-istl-2.2.1/dune/istl/owneroverlapcopy.hh:557:5:
 note: Dune::OwnerOverlapCopyCommunication<T1, 
T2>::OwnerOverlapCopyCommunication(MPI_Comm, Dune::SolverCategory::Category, 
bool) [with GlobalIdType = Dune::bigunsignedint<96>; LocalIdType = int; 
MPI_Comm = ompi_communicator_t*]
/rap/jda-332-aa/apps/dumux-2.4.0-2/dune-istl-2.2.1/dune/istl/owneroverlapcopy.hh:557:5:
 note:   no known conversion for argument 1 from 'const CollectiveCommunication 
{aka const Dune::CollectiveCommunication<Dune::ALU2dGrid<2, 2, 
(ALU2DGrid::ElementType)1u> >}' to 'MPI_Comm {aka ompi_communicator_t*}'
make: *** [test_box1pwithamg.o] Error 1
