Re: [DuMuX] Parallel run of 2p2c decoupled model

2015-05-27 Thread Bernd Flemisch

Hi Tri Dat,

it is much easier than I thought. I just forgot to specify an overlap in
the dgf file. This is necessary for YaspGrid if the decoupled models are
supposed to run properly in parallel. With the overlap in place, I seem
to get the correct result for parallel runs.
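
For illustration, a minimal dgf file with such an overlap might look like
the following sketch (the domain extent and cell counts here are
placeholders, not the values from the attached file):

DGF
Interval
0 0 0      % lower-left corner of the domain
10 10 10   % upper-right corner of the domain
20 20 20   % number of cells per direction
#

GridParameter
overlap 1  % one layer of overlap cells at each process boundary
#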


You can try it yourself by adapting the problem and input file such that
the DgfGridCreator is used; see the attached patch, together with an
appropriate dgf file, which is also attached.


Meanwhile I will fix the CubeGridCreator for YaspGrid so that it also 
works in parallel.


Kind regards
Bernd

On 05/22/2015 02:15 PM, Tri Dat NGO wrote:

Hi Martin and Bernd,

Please find attached the grid file I have been using for 3d2p
decoupled adaptive + parallel.
I can confirm that test_3d2p using the mimetic method works fine in
parallel.


Since I would like to run my 2p2c decoupled test cases in parallel, I
will be very happy to hear about any progress. Please keep me
informed.

Thank you once again for your help.

Kind regards,
Tri Dat


Re: [DuMuX] Parallel run of 2p2c decoupled model

2015-05-27 Thread Tri Dat NGO
Hi Bernd,

Thank you for your help. It works for me. I think I will just take the
DgfGridCreator for parallel YaspGrid runs.

I also have another question, about parallel runs with the implicit
cell-centered method of the 2p model. When I try to run
/dumux/test/implicit/2p/test_cc2p (CubeGrid) in parallel, I obtain this
error after the first time step:

[obelix2:30189] *** Process received signal ***
[obelix2:30189] Signal code:  (128)
[obelix2:30188] Failing at address: (nil)
[obelix2:30189] Signal: Segmentation fault (11)


In contrast, ./test_cc2p with SimplexGrid and ./test_box2p work fine in
parallel.

Kind regards,
Tri Dat


Re: [DuMuX] Parallel run of 2p2c decoupled model

2015-05-27 Thread Bernd Flemisch
In fact, I will not fix the CubeGridCreator for parallel YaspGrid. Up to
and including Dune 2.3, the YaspGrid specialization of Dune's
StructuredGridFactory only creates a sequential YaspGrid. This is fixed
in Dune 2.4, where the sequential and parallel YaspGrid constructors
have been unified. Since Dune 2.4 is on its way and we will drop
Dune 2.3 support afterwards, it is too much hassle now to do it properly
for both Dune 2.3 and 2.4.


If you like, you can already move to the release branch of Dune 2.4 (at
the expense of receiving a lot of deprecation warnings, which we will
fix after the release). Or you can just take the DgfGridCreator for
parallel YaspGrid runs.


Bernd


Re: [DuMuX] Parallel run of 2p2c decoupled model

2015-05-22 Thread Tri Dat NGO
Hi Bernd,

Thank you so much for your help.
Please let me know if you have any progress on the decoupled 2p2c in
parallel.

Concerning 2p decoupled adaptive + parallel simulations, your comments
led me to run *test_3d2p* in *dumux/test/decoupled/2p* in parallel, and
I obtained the following error message:

##
No model type specified
Default to finite volume MPFA l-method model
Dune reported error: Dune::NotImplemented
[storeBoundaryInteractionVolume:../../../dumux/decoupled/2p/diffusion/fvmpfa/lmethod/fvmpfal3dinteractionvolumecontainer.hh:2031]:
Boundary shape not implemented
##

It seems that there is a problem when storing the boundary interaction
volumes in the *mpfa-lmethod*. My test domain is 10x10x10 [m x m x m]
with a 20x20x20 grid; all boundaries have id 1. I haven't yet tested
decoupled 2p 3d parallel + adaptive using the *mpfa-omethod/tpfa method*.
Please let me know if you have any additional suggestions.

Kind regards,
Tri Dat



Re: [DuMuX] Parallel run of 2p2c decoupled model

2015-05-22 Thread Bernd Flemisch
Hi Tri Dat,

I had a closer look at decoupled 2p2c in parallel. Two issues have to be solved:

1. Apparently, our CubeGridCreator doesn't create a parallel YaspGrid. This can
be fixed easily. Until then, one can use the default DgfGridCreator for
YaspGrid and parallel runs.

2. In the decoupled 2p2c model, information is not transported across the
process boundaries. Since decoupled 2p and 2p2c share quite a bit of the same
infrastructure and 2p is parallel, this should also be feasible in the near
future.

Concerning decoupled 2p, I also did not succeed in running MPFA-L in 3d and
parallel. The FV/TPFA method works fine, also in the adaptive regime. This
needs to be investigated further.

Kind regards
Bernd



Re: [DuMuX] Parallel run of 2p2c decoupled model

2015-05-22 Thread Martin Schneider

Hi Tri Dat,

probably the MPFA L-method has problems with the grid you are using.
There are some grid restrictions for the implemented MPFA methods.

Could you please send us your grid file?

You could try to run the test with the Mimetic discretization instead,
by executing

test_3d2p -ModelType Mimetic

Kind regards,
Martin



[DuMuX] Parallel run of 2p2c decoupled model

2015-05-21 Thread Tri Dat NGO
Dear DuMuX,

I would like to know whether there is any test case of the 2p2c decoupled
model which works correctly in parallel mode.
I tried to run parallel simulations of all the examples in
/dumux_v2.6/test/decoupled/2p2c with mpirun, but I always obtained the
results of the sequential simulations.
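
For example, with an invocation along these lines (the process count here
is arbitrary):

mpirun -np 4 ./test_dec2p2c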

Another question, also on parallel simulation but concerning adaptive
grid refinement: can we use the adaptive grid method with the 2p/2p2c
models in parallel mode?

Thank you in advance for your reply.

Kind regards,
Tri Dat


Re: [DuMuX] Parallel run of 2p2c decoupled model

2015-05-21 Thread Bernd Flemisch

Hi Tri Dat,

I just tried to run test_dec2p2c in parallel, and it seems that at least
the output is wrong. While the pvd-file contains pointers to correct
parallel pvtu-file names, only sequential vtu-files are written. I will
investigate this further.


In any case, to run in parallel, you need to switch the LinearSolver to
the AMGBackend in your problem file by adding

#include <dumux/linear/amgbackend.hh>

and adding/changing something like

SET_TYPE_PROP(TestDecTwoPTwoCProblem, LinearSolver,
              Dumux::AMGBackend<TypeTag>);
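
For context, a minimal sketch of how these lines would sit in the
property section of the problem file, following the usual DuMuX 2.x
property-system conventions (the type tag name is the one from
test_dec2p2c):

// At the top of the problem header:
#include <dumux/linear/amgbackend.hh>

namespace Dumux
{
namespace Properties
{
// Replace the default sequential linear solver by the parallel
// AMG backend for this problem's type tag.
SET_TYPE_PROP(TestDecTwoPTwoCProblem, LinearSolver,
              Dumux::AMGBackend<TypeTag>);
}
}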



Decoupled 2p adaptive and parallel is possible as far as I know. However,
the 2p adaptive stuff only works with ALUGrid, which means that one has
to use a 3d test case, because 2d ALUGrid is not parallel. I will try to
set up a corresponding case.


I assume that decoupled 2p2c adaptive and parallel is a larger task.
Since we would also like to have it, we can put it on our to-do list,
but it is hard to estimate when we can actually get to it.


Kind regards
Bernd





--
___

Bernd Flemisch              phone: +49 711 685 69162
IWS, Universität Stuttgart  fax:   +49 711 685 60430
Pfaffenwaldring 61          email: be...@iws.uni-stuttgart.de
D-70569 Stuttgart           url:   www.hydrosys.uni-stuttgart.de
___
