Hi Tri Dat,
1/ Is manually invoking the loadBalance method only necessary for
UGGrid/SPGrid? When using ALUGrid for 3D simulations, we can choose the
graph partitioning method by changing the options in the alugrid.cfg
file. Please let me know if this is not right.
My comment holds for both UGGrid and ALUGrid. Load balancing is called
automatically by the Dumux GridCreators. Alternatively, you would have
to read in the sequential grid and invoke it manually. In both cases,
ALUGrid looks into the .cfg file.
2/ Concerning the linear solver, I have also tested the AMG backend with
SuperLU as coarse grid solver on different grid types (UGGrid,
YaspGrid and SPGrid for 2D simulations, ALUGrid for the 3D cases) in
parallel mode. My problem is classical: two-phase immiscible and
incompressible flow in a heterogeneous medium (as in the attached
image) with impervious horizontal barriers. I used the 2p model and kept
the densities/viscosities of the fluids constant. A lighter fluid is
injected at the bottom of the medium and rises. It works properly in
sequential mode, but in parallel mode I ran into convergence problems:
"breakdown in BiCGSTAB" and "Coarse solver did not converge". Please
have a look at the output file. I don't know how to fix it.
A first try would be to unset those two features which have not been
extensively tested in parallel so far:
SET_BOOL_PROP(TutorialProblemCoupled, EnablePartialReassemble, false);
SET_BOOL_PROP(TutorialProblemCoupled, EnableJacobianRecycling, false);
Kind regards
Bernd
Any hints, tips and advice would be greatly appreciated!
I apologize if my questions are too detailed.
Kind regards
Tri Dat
2014-12-04 9:48 GMT+01:00 Bernd Flemisch <[email protected]>:
Hi Tri Dat,
welcome to the Dumux mailing list!
If it is possible for you, I would recommend that you provide a
full DGF file that contains the permeability values. These should
automatically be distributed in a parallel setting. You can find
an example in dumux/test/implicit/co2, especially have a look at
the grid file and the spatial parameters file there. More
information on the DGF format is in the Dune docu at
http://www.dune-project.org/doc/doxygen/dune-grid-html/group___dune_grid_format_parser.html
You can associate parameters with vertices and elements. If you
have element parameters and want to run in parallel, you have to
use ALUGrid and 3d. If you have only vertex parameters, you can
also use UGGrid in 2d and 3d.
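Just to illustrate the code side, here is a minimal sketch (the file
name, the 'Grid' typedef and the meaning of param[0] are placeholders,
please compare with the co2 test for the real setup):
###################
#include <vector>
#include <dune/grid/io/file/dgfparser/dgfparser.hh>

// 'Grid' stands for your grid type, e.g. an ALUGrid or UGGrid typedef
Dune::GridPtr<Grid> gridPtr("grid_with_perm.dgf");

// calling loadBalance on the GridPtr (not only on the grid) should also
// move the DGF parameters to the receiving processes
gridPtr.loadBalance();

auto gridView = gridPtr->leafGridView();
for (auto it = gridView.begin<0>(); it != gridView.end<0>(); ++it)
{
    // parameters attached to this element in the DGF file
    const std::vector<double>& param = gridPtr.parameters(*it);
    // param[0] would then hold, e.g., the permeability of this element
}
###################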
If you are able to adapt your workflow to this, parallelism
basically comes for free. The only thing that you have to be
careful about is the linear solver. In principle, we recommend the
AMGBackend, see test/implicit/1p for an example.
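For the tutorial problem, switching to it would roughly look like the
following (header path and type name are from my memory of the 1p test,
so please double-check there):
###################
#include <dumux/linear/amgbackend.hh>

// select the AMG backend as linear solver for the tutorial problem
SET_TYPE_PROP(TutorialProblemCoupled, LinearSolver, Dumux::AMGBackend<TypeTag>);
###################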
Otherwise, there is the possibility to manually invoke the
loadBalance method of the grid with a data handle. You can have a
look at dune-grid/dune/grid/test/test-parallel-ug.cc for an
example. This would be more cumbersome.
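To give you an idea of the interface involved, a rough sketch of such a
data handle could look like this (class and member names are just
placeholders, and the index bookkeeping after the redistribution is
glossed over, so please follow test-parallel-ug.cc for the complete
pattern):
###################
#include <vector>
#include <dune/grid/common/datahandleif.hh>

// packs/unpacks one double per element while the grid is redistributed
template<class GridView>
class PermeabilityHandle
    : public Dune::CommDataHandleIF<PermeabilityHandle<GridView>, double>
{
public:
    PermeabilityHandle(const GridView& gridView, std::vector<double>& perm)
        : gridView_(gridView), perm_(perm) {}

    // we only communicate element (codim 0) data
    bool contains(int dim, int codim) const { return codim == 0; }
    // every element sends the same number of values
    bool fixedsize(int dim, int codim) const { return true; }

    template<class Entity>
    std::size_t size(const Entity&) const { return 1; }

    template<class Buffer, class Entity>
    void gather(Buffer& buffer, const Entity& entity) const
    { buffer.write(perm_[gridView_.indexSet().index(entity)]); }

    template<class Buffer, class Entity>
    void scatter(Buffer& buffer, const Entity& entity, std::size_t)
    {
        double value;
        buffer.read(value);
        // note: after load balancing the index set has changed, so the
        // container has to be resized/re-mapped accordingly
        const std::size_t idx = gridView_.indexSet().index(entity);
        if (perm_.size() <= idx)
            perm_.resize(idx + 1);
        perm_[idx] = value;
    }

private:
    const GridView& gridView_;
    std::vector<double>& perm_;
};

// usage, roughly: PermeabilityHandle<GridView> handle(gridView, perm);
//                 grid.loadBalance(handle);
###################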
Kind regards
Bernd
On 12/03/2014 06:10 PM, Tri Dat NGO wrote:
Dear Dumux developers,
I have a problem with random permeability simulations in parallel
mode.
I've performed a test case similar to the "tutorialproblem" with a
random permeability medium. The intrinsic permeability field is
provided by a data file (here permeab.dat).
It works correctly for sequential simulations. My domain
(2D, isotropic) consists of 128000 cells (160 cells in X and 800
cells in Y direction) corresponding to 128961 vertices.
Here is the code that assigns the permeability to the cells:
###################
const Dune::FieldMatrix<Scalar, dim, dim>
intrinsicPermeability(const Element &element,
                      const FVElementGeometry &fvGeometry,
                      const int scvIdx) const
{
    // initialize with zeros so that the off-diagonal entries are well defined
    Dune::FieldMatrix<Scalar, dim, dim> permLoc(0.0);

    // look up the permeability stored for the vertex of this sub-control volume
    const Scalar permE =
        perm_[indexSet_.index(*(element.template subEntity<dim>(scvIdx)))];

    // isotropic permeability: fill the diagonal
    for (int i = 0; i < dim; ++i)
        permLoc[i][i] = permE;

    return permLoc;
}
###################
where perm_ (initialized via perm_(gridView.size(dim), 0.0)) is read
from a 2x128000 array in the file permeab.dat.
Now, I try to test it in parallel mode with 4 processors.
The code uses local indices and the "local grid" (gridView.size
returns the size of the grid part owned by each processor) to read
data from the data file, so only about the first 1/4*128000 = 32000
values of permeab.dat are read.
Therefore, the 4 sub-domains have the same permeability field, which
is not correct because it leads to a wrong description of the
permeability field of the whole domain.
I guess that I must use ParallelIndexSet, but since I am a new
DUNE/Dumux user, it is difficult for me to find the solution.
Do you have any documentation or examples concerning this problem?
Please find attached the tutorialspatialparams file.
Thank you in advance for your help,
Regards,
Tri Dat
--
_______________________________________________________________
Bernd Flemisch phone: +49 711 685 69162
IWS, Universität Stuttgart fax: +49 711 685 60430
Pfaffenwaldring 61 email: [email protected]
D-70569 Stuttgart url: www.hydrosys.uni-stuttgart.de
_______________________________________________________________
_______________________________________________
Dumux mailing list
[email protected]
https://listserv.uni-stuttgart.de/mailman/listinfo/dumux