Hi Mustapha,
if I run pamgtest
(dune-istl/<build-cmake>/dune/istl/paamg/test/pamgtest) with valgrind, I
get very similar memory leaks. I cannot confirm, though, that they
differ depending on the smoother (ILU0, SSOR)
of the AMG preconditioner. The test is a good starting point for a
minimal example.
If you can confirm the leaks there, please also report them on the Dune
mailing list (d...@dune-project.org).
Note that some of the leaks could also be false positives from
valgrind (http://valgrind.org/docs/manual/mc-manual.html#mc-manual.mpiwrap)
when debugging MPI-parallel programs.
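For reference, an invocation with the valgrind MPI wrapper from the manual linked above might look like the following sketch. The libmpiwrap file name and the pamgtest path are assumptions here; both depend on your valgrind installation and build directory.

```shell
# Sketch only: adjust the libmpiwrap platform suffix and the test path
# to your installation. The LD_PRELOAD wrapper intercepts MPI calls and
# suppresses the MPI-related false positives mentioned above.
LD_PRELOAD=/usr/lib/valgrind/libmpiwrap-amd64-linux.so \
  mpirun -np 2 valgrind --leak-check=full --track-origins=yes \
  ./dune/istl/paamg/test/pamgtest
```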
Kind regards
Timo
On 08.02.2017 13:06, Bernd Flemisch wrote:
Hi Mustapha,
as Christoph says, we should try to make the example as minimal as
possible.
On 02/08/2017 09:56 AM, Mustapha El Ossmani wrote:
We have run a simple test (the CO2 model in DuMuX) for some iterations
with valgrind and obtained the messages that can be found in the
enclosed file, which confirm that there is a memory leak.
Is this the result of a sequential run? If not, can you also post that?
Is it the result of an unmodified test from
test/porousmediumflow/co2/implicit? If not, do you see the same problem
for such an unmodified test? If again not, can you share the
modifications you made that trigger the behavior?
Kind regards
Bernd
Have you ever encountered this issue?
Best regards
M. El Ossmani
On 07/02/2017 at 06:22, Christoph Grüninger wrote:
Hi Mustapha,
I have never seen such problems but I am no solver expert. What you
can do:
* Do you see this problem only in parallel, or also in sequential code?
Only with AMG, or also with GMRES and ILU?
* Reduce the problem as much as you can. The best would be a minimal
piece of software that depends only on dune-istl, reads in your
matrix, and repeatedly solves your linear system, showing the
undesired memory growth. I am not sure whether this is possible, or
whether such a minimal setup would still exhibit the problem at all.
Reducing your current problem might also make it vanish, which in
itself can help to find the cause of your issue.
* Analyze the problem with Valgrind or AddressSanitizer. Having it
reduced might be beneficial.
* Turning on all compiler warnings and carefully evaluating them
might help; expect some false positives, though.
* Maybe it's worth repeating your question on the Dune mailing list
(d...@dune-project.org), as there are more users and developers of
istl there.
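A minimal reproducer along those lines could be sketched as follows. This is an untested sequential sketch: the matrix/right-hand-side file names and the block size are assumptions, and it applies SeqILU0 directly as the preconditioner instead of the full AMG + BlockPreconditioner setup, just to keep it short. Dump the system from your actual run (e.g. with Dune::storeMatrixMarket) to feed it.

```cpp
#include <dune/common/fmatrix.hh>
#include <dune/common/fvector.hh>
#include <dune/istl/bcrsmatrix.hh>
#include <dune/istl/bvector.hh>
#include <dune/istl/matrixmarket.hh>
#include <dune/istl/operators.hh>
#include <dune/istl/preconditioners.hh>
#include <dune/istl/solvers.hh>

int main()
{
  // Block size 2 is an assumption; adjust to your model.
  constexpr int bs = 2;
  using MType = Dune::BCRSMatrix<Dune::FieldMatrix<double, bs, bs> >;
  using VType = Dune::BlockVector<Dune::FieldVector<double, bs> >;

  // "matrix.mm" / "rhs.mm" are hypothetical file names: dump these
  // from your real run with Dune::storeMatrixMarket.
  MType A;
  VType b;
  Dune::loadMatrixMarket(A, "matrix.mm");
  Dune::loadMatrixMarket(b, "rhs.mm");

  Dune::MatrixAdapter<MType, VType, VType> op(A);
  Dune::SeqILU0<MType, VType, VType> prec(A, 1.0);
  Dune::BiCGSTABSolver<VType> solver(op, prec, 1e-8, 500, /*verbose=*/1);

  // Solve the same system repeatedly: if resident memory grows from
  // iteration to iteration, the leak is in the solver/preconditioner
  // stack and can be analyzed in isolation with valgrind.
  for (int i = 0; i < 100; ++i)
  {
    VType x(b.size());
    x = 0.0;
    VType rhs(b);  // the solver may modify the right-hand side
    Dune::InverseOperatorResult res;
    solver.apply(x, rhs, res);
  }
  return 0;
}
```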
Bye,
Christoph
On 03.02.2017 at 11:14, Mustapha El Ossmani
<m.elossm...@ensam.umi.ac.ma> wrote:
Dear DuMu^X developers,
We are performing parallel computations with the AMG solver. Due to
convergence problems in Newton's method, we changed the preconditioner
in amgproperties.hh from Dune::SeqSSOR to Dune::SeqILU0:
// typedef Dune::BlockPreconditioner<VType, VType, Comm,
//                                   Dune::SeqSSOR<MType, VType, VType> > Smoother;
typedef Dune::BlockPreconditioner<VType, VType, Comm,
                                  Dune::SeqILU0<MType, VType, VType> > Smoother;
It seems that there is a memory leak with the ILU0 preconditioner.
Indeed, we can see that the memory consumption increases continually,
until the computation stops with the following error message:
Solve: M deltax^k = r
slurmstepd: Job 936902 exceeded memory limit (41146808 > 41058304), being killed
slurmstepd: Exceeded job memory limit
We notice that this problem does not occur with SSOR as the
preconditioner.
Have you ever encountered this issue?
Best regards
M. El Ossmani
University of Pau
_______________________________________________
Dumux mailing list
Dumux@listserv.uni-stuttgart.de
https://listserv.uni-stuttgart.de/mailman/listinfo/dumux
--
_______________________________________________________________
Bernd Flemisch phone: +49 711 685 69162
IWS, Universität Stuttgart fax: +49 711 685 60430
Pfaffenwaldring 61 email: be...@iws.uni-stuttgart.de
D-70569 Stuttgart url: www.hydrosys.uni-stuttgart.de
_______________________________________________________________
--
____________________________________________________________________
Timo Koch phone: +49 711 685 64676
IWS, Universität Stuttgart fax: +49 711 685 60430
Pfaffenwaldring 61 email: timo.k...@iws.uni-stuttgart.de
D-70569 Stuttgart url: www.hydrosys.uni-stuttgart.de
____________________________________________________________________