Hi Tri Dat,

Welcome to the Dumux mailing list!

If it is possible for you, I would recommend providing a full DGF file that contains the permeability values. These should then be distributed automatically in a parallel setting. You can find an example in dumux/test/implicit/co2; in particular, have a look at the grid file and the spatial parameters file there. More information on the DGF format can be found in the Dune documentation at
http://www.dune-project.org/doc/doxygen/dune-grid-html/group___dune_grid_format_parser.html
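
For element parameters, such a file could look roughly like this (a minimal sketch from memory; the tiny one-element grid and the value are placeholders, and the exact block syntax is described in the documentation linked above):

###################
DGF
Vertex
0 0
1 0
0 1
1 1
#
Cube
parameters 1        % one parameter per element, here the permeability
0 1 2 3  1e-12
#
###################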

You can associate parameters with vertices and elements. If you have element parameters and want to run in parallel, you have to use ALUGrid in 3d. If you have only vertex parameters, you can also use UGGrid in 2d and 3d.
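
After the grid has been read with the DGF parser, these parameters can be queried from the GridPtr object. A rough, untested sketch (readElementPermeability is just an illustrative name, not an existing Dumux function):

###################
#include <vector>
#include <dune/grid/io/file/dgfparser.hh>

// collect the first DGF element parameter in an element-indexed vector
template <class Grid>
std::vector<double> readElementPermeability(Dune::GridPtr<Grid>& gridPtr)
{
    typedef typename Grid::LeafGridView GridView;
    GridView gridView = gridPtr->leafGridView();

    std::vector<double> elementPerm(gridView.size(0), 0.0);
    for (typename GridView::template Codim<0>::Iterator eIt = gridView.template begin<0>();
         eIt != gridView.template end<0>(); ++eIt)
        elementPerm[gridView.indexSet().index(*eIt)] = gridPtr.parameters(*eIt)[0];

    return elementPerm;
}
###################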

If you are able to adapt your workflow to this, parallelism basically comes for free. The only thing you have to be careful about is the linear solver. In principle, we recommend the AMGBackend; see test/implicit/1p for an example.
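
Switching to it should essentially amount to setting one property in your problem file; a sketch, with TutorialProblem standing in for your problem's type tag:

###################
#include <dumux/linear/amgbackend.hh>

namespace Dumux {
namespace Properties {
// use the parallel AMG backend as the linear solver
SET_TYPE_PROP(TutorialProblem, LinearSolver, Dumux::AMGBackend<TypeTag>);
}
}
###################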

Otherwise, you could manually invoke the loadBalance method of the grid with a data handle. You can have a look at dune-grid/dune/grid/test/test-parallel-ug.cc for an example, but this approach is more cumbersome.
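
To give you an idea, a data handle that ships one permeability value per element during load balancing could be sketched as follows (untested; PermeabilityDataHandle and the resize logic are my own illustration, only the Dune::CommDataHandleIF interface is given by dune-grid):

###################
#include <cstddef>
#include <vector>
#include <dune/grid/common/datahandleif.hh>

template <class GridView>
class PermeabilityDataHandle
    : public Dune::CommDataHandleIF<PermeabilityDataHandle<GridView>, double>
{
public:
    PermeabilityDataHandle(const GridView& gridView, std::vector<double>& perm)
        : indexSet_(gridView.indexSet()), perm_(perm)
    {}

    // we only communicate element (codim 0) data
    bool contains(int dim, int codim) const
    { return codim == 0; }

    // every element sends the same number of values
    bool fixedsize(int dim, int codim) const
    { return true; }

    // one permeability value per element
    template <class Entity>
    std::size_t size(const Entity& entity) const
    { return 1; }

    // pack the value on the sending process
    template <class Buffer, class Entity>
    void gather(Buffer& buffer, const Entity& entity) const
    { buffer.write(perm_[indexSet_.index(entity)]); }

    // unpack the value on the receiving process
    template <class Buffer, class Entity>
    void scatter(Buffer& buffer, const Entity& entity, std::size_t n)
    {
        const std::size_t idx = indexSet_.index(entity);
        if (idx >= perm_.size())
            perm_.resize(idx + 1); // the local grid may have grown
        buffer.read(perm_[idx]);
    }

private:
    const typename GridView::IndexSet& indexSet_;
    std::vector<double>& perm_;
};
###################

Such a handle would then be passed to the grid via something like grid.loadBalance(handle).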

Kind regards
Bernd

On 12/03/2014 06:10 PM, Tri Dat NGO wrote:

Dear Dumux developers,

I have a problem with random permeability simulations in parallel mode.
I’ve performed a test case similar to "tutorialproblem" with a random permeability medium. The intrinsic permeability field is provided by a data file (here permeab.dat).


It works correctly for sequential simulations. My domain (2D, isotropic) consists of 128000 cells (160 cells in the X direction and 800 cells in the Y direction), corresponding to 128961 vertices.

Here is the code to distribute the permeability for cells:

###################

    // returns the intrinsic permeability tensor for a sub-control volume
    const Dune::FieldMatrix<Scalar, dim, dim> intrinsicPermeability(const Element &element,
                                                                    const FVElementGeometry &fvGeometry,
                                                                    const int scvIdx) const
    {
        Dune::FieldMatrix<Scalar, dim, dim> permLoc_;
        permLoc_ = 0.0; // zero-initialize so the off-diagonal entries are well defined

        // look up the permeability via the (process-local) vertex index
        Scalar permE = perm_[indexSet_.index(*(element.template subEntity<dim>(scvIdx)))];
        for (int i = 0; i < dim; i++)
            permLoc_[i][i] = permE;

        return permLoc_;
    }
###################

where perm_ (initialized by perm_(gridView.size(dim), 0.0)) is read from the 2 x 128000 array in the file permeab.dat.

Now I am trying to test it in parallel mode on 4 processors.
The code uses local indices and the "local grid" (gridView.size returns the size of the grid part owned by each processor) to read the data file, so only about the first 1/4 * 128000 = 32000 values of permeab.dat are read. As a consequence, the 4 sub-domains get the same permeability field, which is not correct because it leads to a wrong description of the permeability field of the whole domain. I guess that I must use a ParallelIndexSet, but since I am a new DUNE/Dumux user, it is difficult for me to find the solution.

Do you have any documentation or examples concerning this problem?

Please find attached the tutorialspatialparams file.
Thank you in advance for your help,

Regards,
Tri Dat



--
_______________________________________________________________

Bernd Flemisch                         phone: +49 711 685 69162
IWS, Universität Stuttgart             fax:   +49 711 685 60430
Pfaffenwaldring 61            email: be...@iws.uni-stuttgart.de
D-70569 Stuttgart            url: www.hydrosys.uni-stuttgart.de
_______________________________________________________________

_______________________________________________
Dumux mailing list
Dumux@listserv.uni-stuttgart.de
https://listserv.uni-stuttgart.de/mailman/listinfo/dumux
