Hi Mona,
Dune 2.6 and DuMuX 3.0 are a bit outdated. Can you reproduce your issue with the latest stable versions of DuMuX, Dune 2.9 and SPGrid 2.9, too? I know, easier said than done.
Does this problem occur with YaspGrid and >7 cores?
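For a quick check, it should be enough to swap the Grid property in injection2pniproblem.hh back to YaspGrid (a sketch following the specialization from your snippet below, untested on my side):

#include <dune/grid/yaspgrid.hh>
[...]
// Set the grid type (YaspGrid instead of SPGrid, for comparison)
template<class TypeTag>
struct Grid<TypeTag, TTag::Injection2pNITypeTag>
{ using type = Dune::YaspGrid<3>; };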

I am not sure whether you are hitting an SPGrid issue or a DuMuX issue.
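To narrow this down, you could also run Dune::GlobalIndexSet on a parallel structured grid without any DuMuX code involved. Something along these lines (an untested sketch, dune-grid only, using YaspGrid with the 2x2x2 setup from your input file; the SPGrid variant would be analogous):

#include <config.h>
#include <array>
#include <iostream>
#include <dune/common/fvector.hh>
#include <dune/common/parallel/mpihelper.hh>
#include <dune/grid/yaspgrid.hh>
#include <dune/grid/common/rangegenerators.hh>
#include <dune/grid/utility/globalindexset.hh>

int main(int argc, char** argv)
{
    const auto& mpiHelper = Dune::MPIHelper::instance(argc, argv);

    // 10 x 10 x 10 domain with 2 x 2 x 2 cells, lower left corner at the origin
    using Grid = Dune::YaspGrid<3>;
    const Dune::FieldVector<double, 3> upperRight(10.0);
    const std::array<int, 3> cells{{2, 2, 2}};
    Grid grid(upperRight, cells);
    grid.loadBalance();

    const auto gridView = grid.leafGridView();
    Dune::GlobalIndexSet<Grid::LeafGridView> cellIdx(gridView, /*codim=*/0);

    // print the global cell index and cell center seen by each rank
    for (const auto& e : elements(gridView))
        std::cout << "rank " << mpiHelper.rank()
                  << " gIdx " << cellIdx.index(e)
                  << " center " << e.geometry().center() << std::endl;

    return 0;
}

If this plain-dune version already shows rank-dependent indices, the problem is not on the DuMuX side.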

Bye
Christoph


On 24.10.23 at 11:52, Giraud, Mona wrote:
Dear DuMux team,
I use DuMux 3.0 and Dune 2.6.
I encountered an issue when computing a problem on a 3D SPGrid with ‘too many’ cores: the cell ids returned by Dune::GlobalIndexSet appear to be incorrect when the core/cell ratio is too high. I set up a small example using an adapted version of the 1st exercise of the DuMux course (see the zip folder attached).


##############  changes made to 'exercise_basic_2pni_solution' (from DuMux 3.0):


In injection2pniproblem.hh:

#include <dune/grid/spgrid.hh>
[…]
// Set the grid type
template<class TypeTag>
struct Grid<TypeTag, TTag::Injection2pNITypeTag> { using type = Dune::SPGrid<GetPropType<TypeTag, Properties::Scalar>, 3>; };
####
####
In exercise_basic_2pni.cc:
// getDofIndices, getPointIndices, getCellIndices
#include <dune/grid/utility/globalindexset.hh>

[...]
int main(int argc, char** argv) try
{
     using namespace Dumux;

     // define the type tag for this problem
     using TypeTag = Properties::TTag::Injection2pNICC;

     // initialize MPI, finalize is done automatically on exit
     const auto& mpiHelper = Dune::MPIHelper::instance(argc, argv);

     // print dumux start message
     if (mpiHelper.rank() == 0)
         DumuxMessage::print(/*firstCall=*/true);

     // parse command line arguments and input file
     Parameters::init(argc, argv);

     // try to create a grid (from the given grid file or the input file)
     GridManager<GetPropType<TypeTag, Properties::Grid>> gridManager;
     gridManager.init();

     ////////////////////////////////////////////////////////////
     // run instationary non-linear problem on this grid
     ////////////////////////////////////////////////////////////

     // we compute on the leaf grid view
     const auto& leafGridView = gridManager.grid().leafGridView();

     // create the finite volume grid geometry
     using FVGridGeometry = GetPropType<TypeTag, Properties::FVGridGeometry>;
     using SoilGridType = GetPropType<TypeTag, Properties::Grid>;
     using GridView = typename SoilGridType::Traits::LeafGridView;
     enum { dimWorld = GridView::dimensionworld };

     auto fvGridGeometry = std::make_shared<FVGridGeometry>(leafGridView);
     fvGridGeometry->update();

     auto cellIdx = std::make_shared<Dune::GlobalIndexSet<GridView>>(gridManager.grid().leafGridView(), 0);

     std::cout << "getCellCenters rank:" << mpiHelper.rank() << " dimWorld:" << dimWorld << "\n";
     for (const auto& e : elements(fvGridGeometry->gridView()))
     {
         const auto p = e.geometry().center();
         const int gIdx = cellIdx->index(e); // get the cell global index
         std::cout << "gIdx " << gIdx;
         for (int i = 0; i < 3; ++i) // print cell center coordinates
             std::cout << ", " << i << ":" << p[i];
         std::cout << std::endl;
     }

     ////////////////////////////////////////////////////////////
     // finalize, print dumux message to say goodbye
     ////////////////////////////////////////////////////////////
     // print dumux end message
     if (mpiHelper.rank() == 0)
     {
         Parameters::print();
         DumuxMessage::print(/*firstCall=*/false);
     }

     return 0;
} // end main

####
####
In exercise_basic_2pni_solution.input:
[Grid]
LowerLeft = 0 0 0
UpperRight = 10 10 10
Cells = 2 2 2

############## Outputs:


# mpirun -n 1 ./exercise_basic_2pni_solution
getCellCenters rank:0 dimWorld:3
         gIdx 0, 0:2.5, 1:2.5, 2:2.5
         gIdx 1, 0:7.5, 1:2.5, 2:2.5
         gIdx 2, 0:2.5, 1:7.5, 2:2.5
         gIdx 3, 0:7.5, 1:7.5, 2:2.5
         gIdx 4, 0:2.5, 1:2.5, 2:7.5
         gIdx 5, 0:7.5, 1:2.5, 2:7.5
         gIdx 6, 0:2.5, 1:7.5, 2:7.5
         gIdx 7, 0:7.5, 1:7.5, 2:7.5

# mpirun -n 4 ./exercise_basic_2pni_solution
getCellCenters rank:1 dimWorld:3
         gIdx 0, 0:2.5, 1:2.5, 2:2.5
         gIdx 4, 0:7.5, 1:2.5, 2:2.5
         gIdx 2, 0:2.5, 1:7.5, 2:2.5
        gIdx 6, 0:7.5, 1:7.5, 2:2.5 // != gIdx 6, 0:2.5, 1:7.5, 2:7.5 obtained when using 1 core
         gIdx 1, 0:2.5, 1:2.5, 2:7.5
         gIdx 5, 0:7.5, 1:2.5, 2:7.5
         gIdx 3, 0:2.5, 1:7.5, 2:7.5
         gIdx 7, 0:7.5, 1:7.5, 2:7.5
getCellCenters rank:2 dimWorld:3
         gIdx 0, 0:2.5, 1:2.5, 2:2.5
         gIdx 4, 0:7.5, 1:2.5, 2:2.5
         gIdx 2, 0:2.5, 1:7.5, 2:2.5
         gIdx 6, 0:7.5, 1:7.5, 2:2.5
         gIdx 1, 0:2.5, 1:2.5, 2:7.5
         gIdx 5, 0:7.5, 1:2.5, 2:7.5
         gIdx 3, 0:2.5, 1:7.5, 2:7.5
         gIdx 7, 0:7.5, 1:7.5, 2:7.5
getCellCenters rank:3 dimWorld:3
         gIdx 0, 0:2.5, 1:2.5, 2:2.5
         gIdx 4, 0:7.5, 1:2.5, 2:2.5
         gIdx 2, 0:2.5, 1:7.5, 2:2.5
         gIdx 6, 0:7.5, 1:7.5, 2:2.5
         gIdx 1, 0:2.5, 1:2.5, 2:7.5
         gIdx 5, 0:7.5, 1:2.5, 2:7.5
         gIdx 3, 0:2.5, 1:7.5, 2:7.5
         gIdx 7, 0:7.5, 1:7.5, 2:7.5

##############


As can be seen, when 4 cores are used, the global cell indices do not follow the 'default' DuMux ordering (from the bottom left to the top right; see the small check further below for what I mean by that), although the local cell index still seems to follow it. The problem seems to disappear when the number of cells is higher, e.g. with:

[Grid]
LowerLeft = 0 0 0
UpperRight = 100 100 100
Cells = 5 5 20

it works again as long as the number of cores is <= 7.
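What I mean by the default ordering could be checked with something like the following dropped into the main() above (a rough sketch, hard-coded for the 2x2x2 example; nothing DuMuX-specific, just the values from the [Grid] section):

     // consistency check for the 2x2x2 example above
     // (domain 10 x 10 x 10 with 2 x 2 x 2 cells -> dx = dy = dz = 5)
     const int nx = 2, ny = 2;
     const double dx = 5.0, dy = 5.0, dz = 5.0;
     for (const auto& e : elements(fvGridGeometry->gridView()))
     {
         const auto c = e.geometry().center();
         const int ix = static_cast<int>(c[0] / dx);
         const int iy = static_cast<int>(c[1] / dy);
         const int iz = static_cast<int>(c[2] / dz);
         // expected 'default' index: x runs fastest, then y, then z
         const int expected = ix + nx * (iy + ny * iz);
         if (cellIdx->index(e) != expected)
             std::cout << "rank " << mpiHelper.rank()
                       << ": gIdx " << cellIdx->index(e)
                       << " but expected " << expected << std::endl;
     }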

Would you know how to fix this issue? In our simulations we need to use a larger number of cores (> 7) with a rather small soil grid (~500 cells).
Many thanks for your help!
Best,
Mona
_______________________________________________
DuMux mailing list
[email protected]
https://listserv.uni-stuttgart.de/mailman/listinfo/dumux
