Hi Mona,
this sounds like a grid problem, unrelated to DuMuX. You should try the Dune user list. If you can create a minimal example that shows the problem, it might also be worth filing a bug in the SPGrid GitLab project.

Bye
Christoph


On 25.10.23 at 11:46, Giraud, Mona wrote:
Dear Christoph,

I think the index problem might not be linked to the code in the grid module itself, but rather to the 'Dune::GlobalIndexSet' mapper.

When I take the global cell index from Grid::GlobalIdSet, I always seem to obtain the same index for a given cell centre location, no matter the number of MPI processes.

code snippet:

#####
    typedef typename GridView::Grid Grid;
    typedef typename Grid::GlobalIdSet GlobalIdSet;

    const typename GridView::IndexSet& indexSet = gridManager.grid().leafGridView().indexSet(); // local indices
    const GlobalIdSet& globalIdSet = gridManager.grid().globalIdSet(); // retrieve globally unique id set
    auto cellIdx = std::make_shared<Dune::GlobalIndexSet<GridView>>(gridManager.grid().leafGridView(), 0); // <= possibly incorrect mapper
    std::cout << "getCellCenters rank:" << mpiHelper.rank() << " dimWorld:" << dimWorld << "\n";

    for (const auto& e : elements(fvGridGeometry->gridView())) {
        auto p = e.geometry().center();
        auto gIdx2 = globalIdSet.id(e);  // correct global id
        auto lIdx  = indexSet.index(e);  // local index
        int gIdx   = cellIdx->index(e);  // not always the correct global id
        std::cout << "gIdx " << gIdx << " gIdx2 " << gIdx2 << " LIdx " << lIdx;
        for (int i = 0; i < 3; ++i) // print cell centre coordinates
            std::cout << ", " << i << ":" << p[i];
        std::cout << std::endl;
    }
#####

for:

[Grid]
LowerLeft = 0 0 0
UpperRight = 10 10 10
Cells = 2 2 2

run with SPGrid, we obtain the same "gIdx2" (from Grid::GlobalIdSet) but a different "gIdx" (from Dune::GlobalIndexSet) depending on the number of processes:

Outputs (example for a specific cell, using 1 or 2 processes):
One process:
rank 0 : gIdx 24 gIdx2 281 LIdx 24, 0:1.5, 1:7.5, 2:7.5
Two processes:
rank 0 : gIdx 8 gIdx2 281 LIdx 16, 0:1.5, 1:7.5, 2:7.5 // <= gIdx changes but gIdx2 remains the same
rank 1 : gIdx 8 gIdx2 281 LIdx 24, 0:1.5, 1:7.5, 2:7.5 // <= gIdx changes but gIdx2 remains the same

So, unless the issue with the 'Dune::GlobalIndexSet' mapper is linked to or will cause other problems in the code, the simplest solution is probably to create my own mapper that goes from 'Grid::GlobalIdSet' (ordered but not contiguous) to a contiguous set of indices?
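
For a structured grid like the one above, the simplest such mapper might even avoid the id set entirely and compute a lexicographic index from the cell centre and the [Grid] parameters. A rough, untested sketch of what I mean (lexicographicCellIndex is a made-up helper; lowerLeft, upperRight and cells are assumed to hold the values from the input file):

#####
#include <array>
#include <cmath>

// Rough sketch: contiguous, rank-independent cell index for a structured
// tensor grid, computed from the cell centre only ("bottom left" = 0,
// x runs fastest, then y, then z). lowerLeft, upperRight and cells are
// assumed to match the [Grid] group of the input file.
template<class GlobalPosition>
int lexicographicCellIndex(const GlobalPosition& center,
                           const GlobalPosition& lowerLeft,
                           const GlobalPosition& upperRight,
                           const std::array<int, 3>& cells)
{
    int index = 0;
    int stride = 1;
    for (int d = 0; d < 3; ++d)
    {
        const auto h = (upperRight[d] - lowerLeft[d]) / cells[d]; // cell size in direction d
        const int i = static_cast<int>(std::floor((center[d] - lowerLeft[d]) / h));
        index  += stride * i;
        stride *= cells[d];
    }
    return index;
}
#####

Inside the element loop above, lexicographicCellIndex(e.geometry().center(), ...) would then return the same index on every rank, independent of the partitioning.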
Best
Mona


------------------------------------------------------------------------
*From:* DuMux <dumux-boun...@listserv.uni-stuttgart.de> on behalf of Giraud, Mona <m.gir...@fz-juelich.de>
*Sent:* Wednesday, October 25, 2023 9:54:15 AM
*To:* dumux@listserv.uni-stuttgart.de
*Subject:* Re: [DuMux] global cell index for 3D SPgrid with MPI


Hello Christoph,
many thanks for your answer!
I did the tests with DuMuX 3.7 and Dune (including SPGrid) 2.9.

You can find attached once more the script of the exercise-basic (from DuMuX 3.7), adapted to write the global cell index and the cell centres.

exercise_basic_2p: YaspGrid
exercise_basic_2p2c: SPgrid

The code is almost the same as the one described in the previous email.


I. Tests:
1) 6 cells on 4 cores
2) 500 cells on various numbers of cores

II. Results/observations:
1) YaspGrid throws an error when the process/cell ratio is too high:
"what():  Dune::GridError [partition:/home/m.giraud/DUMUXlast2/dumux/dune-grid/dune/grid/yaspgrid/partitioning.hh:67]: Failed to find a suitable partition"
Indeed, it looks like DefaultPartitioning::partition() does check that the partitioning works.
2) YaspGrid can handle a higher process/cell ratio; e.g., the limit is somewhere between 40 and 50 processes for 500 cells.
3) SPGrid gives the same output as before: no error is thrown, but the ordering of the global cell index does not go from "bottom left" to "top right" of the domain when we have > 1 process for 6 cells or > 7 processes for 500 cells. Indeed, the equivalent function for SPGrid seems (?) to be SPPartitionPool::makePartition, and it has no such check.

III. Therefore my questions are:
1) I could not really understand the code in DefaultPartitioning::optimize_dims, but it seems like there is a clear limit on the number of processes that can be used for a given number of cells with YaspGrid. Is there a set/known rule of thumb for YaspGrid (and SPGrid)?
2) Is there no warning for SPGrid because, in theory, it should be able to handle any number of processes? I.e., if the issue with the global index ordering is fixed, then the computation should work fine?
3) For our simulation, we have one 3D domain run in parallel and many 1D domains distributed between the processes and run sequentially. Therefore, using many processes leads to a large decrease in compute time. Is there a way to not limit the number of processes because of the 3D soil? Would it be possible/make sense to adapt the partitioning functions (ideally for SPGrid) so that, if the process/cell ratio is too high, some processes carry 0 cells of the 3D domain? Or is it likely to lead to errors/deadlocks in other parts of the code? (A quick way to check the current cell distribution is sketched below.)
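
For reference, this is the kind of check I have in mind; only a sketch (printCellDistribution is a made-up helper, not part of the attached exercise) that counts the interior cells each rank owns on the leaf grid view:

#####
#include <iostream>
#include <dune/grid/common/partitionset.hh>
#include <dune/grid/common/rangegenerators.hh>

// Sketch: print how many interior cells each rank owns and the global total.
template<class GridView>
void printCellDistribution(const GridView& gridView)
{
    int nInterior = 0;
    for ([[maybe_unused]] const auto& element : elements(gridView, Dune::Partitions::interior))
        ++nInterior;

    std::cout << "rank " << gridView.comm().rank()
              << " owns " << nInterior
              << " of " << gridView.comm().sum(nInterior)
              << " interior cells" << std::endl;
}
#####

It would be called as printCellDistribution(gridManager.grid().leafGridView()) right after gridManager.init().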

These questions have to do with the Dune core code rather than the DuMuX code, so I suppose I should contact the Dune mailing list directly?

Regards,
Mona

------------------------------------------------------------------------
*From:* DuMux <dumux-boun...@listserv.uni-stuttgart.de> on behalf of Christoph Grüninger <f...@grueninger.de>
*Sent:* Tuesday, October 24, 2023 4:12:28 PM
*To:* dumux@listserv.uni-stuttgart.de
*Subject:* Re: [DuMux] global cell index for 3D SPgrid with MPI
Hi Mona,
Dune 2.6 and DuMuX 3.0 are a bit outdated. Can you reproduce your issue
with the latest stable versions of DuMuX, Dune 2.9 and SPGrid 2.9, too?
I know, easier said than done.
Does this problem occur with YaspGrid and >7 cores?

I am not sure whether you are hitting an SPGrid issue or a DuMuX issue.

Bye
Christoph


On 24.10.23 at 11:52, Giraud, Mona wrote:
Dear DuMux team,
I use DuMuX 3.0 and Dune 2.6.
I encountered an issue when computing a problem using a 3D SPGrid on ‘too many’ cores. Indeed, it looks like the cells’ ids according to Dune::GlobalIndexSet are incorrect when the core/cell ratio is too high. I set up a small example using an adapted version of the 1st exercise of the DuMuX course (see zip folder attached).


##############  changes made to 'exercise_basic_2pni_solution' (from DuMux 3.0):


In injection2pniproblem.hh:

#include <dune/grid/spgrid.hh>
[…]
// Set the grid type
template<class TypeTag>
struct Grid<TypeTag, TTag::Injection2pNITypeTag> { using type = Dune::SPGrid<GetPropType<TypeTag, Properties::Scalar>, 3>; };
####
####
In exercise_basic_2pni.cc:
// getDofIndices, getPointIndices, getCellIndices
#include <dune/grid/utility/globalindexset.hh>

[...]
int main(int argc, char** argv) try
{
      using namespace Dumux;

      // define the type tag for this problem
      using TypeTag = Properties::TTag::Injection2pNICC;

      // initialize MPI, finalize is done automatically on exit
      const auto& mpiHelper = Dune::MPIHelper::instance(argc, argv);

      // print dumux start message
      if (mpiHelper.rank() == 0)
          DumuxMessage::print(/*firstCall=*/true);

      // parse command line arguments and input file
      Parameters::init(argc, argv);

      // try to create a grid (from the given grid file or the input file)
      GridManager<GetPropType<TypeTag, Properties::Grid>> gridManager;
      gridManager.init();

      ////////////////////////////////////////////////////////////
      // run instationary non-linear problem on this grid
      ////////////////////////////////////////////////////////////

      // we compute on the leaf grid view
      const auto& leafGridView = gridManager.grid().leafGridView();

      // create the finite volume grid geometry
      using FVGridGeometry = GetPropType<TypeTag, Properties::FVGridGeometry>;
      using SoilGridType = GetPropType<TypeTag, Properties::Grid>;
      using GridView = typename SoilGridType::Traits::LeafGridView;
      enum { dimWorld = GridView::dimensionworld };

      auto fvGridGeometry = std::make_shared<FVGridGeometry>(leafGridView);
      fvGridGeometry->update();

      auto cellIdx = std::make_shared<Dune::GlobalIndexSet<GridView>>(gridManager.grid().leafGridView(), 0);
      std::cout << "getCellCenters rank:" << mpiHelper.rank() << " dimWorld:" << dimWorld << "\n";
      for (const auto& e : elements(fvGridGeometry->gridView())) {
          auto p = e.geometry().center();
          int gIdx = cellIdx->index(e); // get the cell global index
          std::cout << "gIdx " << gIdx;
          for (int i = 0; i < 3; ++i) // print cell centre coordinates
              std::cout << ", " << i << ":" << p[i];
          std::cout << std::endl;
      }

      ////////////////////////////////////////////////////////////
      // finalize, print dumux message to say goodbye
      ////////////////////////////////////////////////////////////
      // print dumux end message
      if (mpiHelper.rank() == 0)
      {
          Parameters::print();
          DumuxMessage::print(/*firstCall=*/false);
      }

      return 0;
} // end main

####
####
   in exercise_basic_2pni_solution.input:
[Grid]
LowerLeft = 0 0 0
UpperRight = 10 10 10
Cells = 2 2 2

############## Outputs:


# mpirun -n 1 ./exercise_basic_2pni_solution
getCellCenters rank:0 dimWorld:3
          gIdx 0, 0:2.5, 1:2.5, 2:2.5
          gIdx 1, 0:7.5, 1:2.5, 2:2.5
          gIdx 2, 0:2.5, 1:7.5, 2:2.5
          gIdx 3, 0:7.5, 1:7.5, 2:2.5
          gIdx 4, 0:2.5, 1:2.5, 2:7.5
          gIdx 5, 0:7.5, 1:2.5, 2:7.5
          gIdx 6, 0:2.5, 1:7.5, 2:7.5
          gIdx 7, 0:7.5, 1:7.5, 2:7.5

# mpirun -n 4 ./exercise_basic_2pni_solution
getCellCenters rank:1 dimWorld:3
          gIdx 0, 0:2.5, 1:2.5, 2:2.5
          gIdx 4, 0:7.5, 1:2.5, 2:2.5
          gIdx 2, 0:2.5, 1:7.5, 2:2.5
          gIdx 6, 0:7.5, 1:7.5, 2:2.5 // != gIdx 6, 0:2.5, 1:7.5, 2:7.5 obtained when using 1 core
          gIdx 1, 0:2.5, 1:2.5, 2:7.5
          gIdx 5, 0:7.5, 1:2.5, 2:7.5
          gIdx 3, 0:2.5, 1:7.5, 2:7.5
          gIdx 7, 0:7.5, 1:7.5, 2:7.5
getCellCenters rank:2 dimWorld:3
          gIdx 0, 0:2.5, 1:2.5, 2:2.5
          gIdx 4, 0:7.5, 1:2.5, 2:2.5
          gIdx 2, 0:2.5, 1:7.5, 2:2.5
          gIdx 6, 0:7.5, 1:7.5, 2:2.5
          gIdx 1, 0:2.5, 1:2.5, 2:7.5
          gIdx 5, 0:7.5, 1:2.5, 2:7.5
          gIdx 3, 0:2.5, 1:7.5, 2:7.5
          gIdx 7, 0:7.5, 1:7.5, 2:7.5
getCellCenters rank:3 dimWorld:3
          gIdx 0, 0:2.5, 1:2.5, 2:2.5
          gIdx 4, 0:7.5, 1:2.5, 2:2.5
          gIdx 2, 0:2.5, 1:7.5, 2:2.5
          gIdx 6, 0:7.5, 1:7.5, 2:2.5
          gIdx 1, 0:2.5, 1:2.5, 2:7.5
          gIdx 5, 0:7.5, 1:2.5, 2:7.5
          gIdx 3, 0:2.5, 1:7.5, 2:7.5
          gIdx 7, 0:7.5, 1:7.5, 2:7.5

##############


As can be seen, when 4 cores are used the global cell index does not follow the 'default' DuMuX ordering (from the bottom left to the top right of the domain), although the local cell index still seems to follow it. The problem seems to disappear when the number of cells is higher, e.g. with:

[Grid]
LowerLeft = 0 0 0
UpperRight = 100 100 100
Cells = 5 5 20

it works again for a number of cores <= 7.

Would you know how to fix this issue? Indeed, in our simulation we would need to use a higher number of cores (> 7) with a rather small soil grid (~ 500 cells).
Many thanks for your help!
Best,
Mona

