Re: [deal.II] Memory loss in system solver

2020-07-28 Thread Alberto Salvadori
Dear Wolfgang,

thank you for your guidance; I did what you suggested. Basically, all calls
in the system matrix construction have been removed except for the
integration of a unit value on each element, with a zero right-hand side. To
make the system solvable, (non-zero, but this should be irrelevant) Dirichlet
boundary conditions have been imposed on the whole boundary. I checked the
memory consumption through top->RES without solving the linear system, and no
issues arose. Again, as soon as I invoke

BiCG.solve (this->system_matrix, distributed_incremental_displacement, 
this->system_rhs, preconditioner);

top->RES shows memory leaks. Guided by Richard's remark, I implemented the
Trilinos solver as well; in that case there are no leaks at all, even when
building the system matrix in its complete form.
My guess is that there must be either some conflict or an installation-related
issue with PETSc. Note that the system solution was always fine; the concern
was only about the memory loss.
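
For reference, a rough sketch of such a Trilinos-based solve (assuming the
matrix and right-hand side members are the TrilinosWrappers counterparts and
the same member names as in the PETSc code quoted further down; this is not
my exact code) would look roughly like:

// Sketch only: assumes <deal.II/lac/trilinos_vector.h>,
// <deal.II/lac/trilinos_solver.h> and <deal.II/lac/trilinos_precondition.h>
// are included, and that system_matrix / system_rhs are the Trilinos types.
TrilinosWrappers::MPI::Vector distributed_incremental_displacement(
  this->locally_owned_dofs, this->mpi_communicator);
distributed_incremental_displacement = incremental_displacement;

SolverControl solver_control(bicgstab_max_iterations,
                             1e-10 * this->system_rhs.l2_norm());

// Jacobi preconditioner and BiCGStab from the Trilinos wrappers.
TrilinosWrappers::PreconditionJacobi preconditioner;
preconditioner.initialize(this->system_matrix);

TrilinosWrappers::SolverBicgstab BiCG(solver_control);
BiCG.solve(this->system_matrix,
           distributed_incremental_displacement,
           this->system_rhs,
           preconditioner);

incremental_displacement = distributed_incremental_displacement;

With this variant, top->RES stays flat over repeated solves on my machines.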

I did check that no issues arise when running step-18 from the library
examples.

In conclusion, the problem must lie there; I am sorry I was not able to
identify it more precisely. Maybe Richard can be more precise, and I am at
your disposal, of course.

Hope this helps. 

Alberto


On Saturday, July 25, 2020 at 05:46:08 UTC+2, Wolfgang Bangerth wrote:

> On 7/24/20 3:32 AM, Alberto Salvadori wrote:
> >
> > It turns out that this code produces a memory loss, quite significant
> > since I am solving my system thousands of times, eventually causing the
> > run to fail. I am not sure what is causing this issue or how to solve it;
> > maybe more experienced users than myself can catch the problem with a
> > snap of the fingers.
> >
> > I have verified the issue on my Mac (Catalina) as well as on Linux
> > Ubuntu (4.15.0), using deal.II 9.1.1.
> > Apparently the issue reveals itself only when MPI is invoked with more
> > than one processor, whereas it does not emerge when running in serial or
> > with mpirun -np 1.
>
> Alberto -- I've taken a look at the SolverBicgstab class and don't see
> anything glaringly obvious that would suggest where the memory is lost.
> It's also funny that that would only happen with more than one processor,
> because the memory handling of PETSc vectors shouldn't be any different
> for one or more processors.
>
> Do you think you could come up with a simple test case that illustrates
> the problem? In your case, I'd start with the code you have and remove
> basically everything you do: replace the assembly by a function that just
> fills the matrix with the identity matrix (or something similarly simple),
> remove everything that does anything useful with the solution, remove
> graphical output, etc. The only thing that should remain is the loop that
> repeatedly solves a linear system and illustrates the memory leak, but the
> program no longer has to do anything useful (in fact, it probably
> shouldn't -- it should only exercise the one part you suspect of causing
> the memory leak).
>
> I think that would make finding the root cause substantially simpler!
>
> Best
> W.
>
> --
>
> Wolfgang Bangerth email: bang...@colostate.edu
> www: http://www.math.colostate.edu/~bangerth/
>


[deal.II] Memory loss in system solver

2020-07-25 Thread Richard Schussnig
Hi Alberto,
I might be having a similar or even the same problem with PETSc! In my case,
the memory accumulated is proportional to the number of iterations performed
in the SolverFGMRES solver. Also, when using Trilinos (for how to switch
between PETSc and Trilinos, see step-40, I believe), this does not(!) happen!
Please do report back if you find anything - I have not looked into it yet,
but will do so in a week or so.
Regards & good luck,
Richard
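
For what it's worth, the step-40-style switch boils down to a small namespace
alias; a minimal sketch along the lines of the tutorial (not copied from it,
and using USE_PETSC_LA as a compile-time switch) is:

#include <deal.II/lac/generic_linear_algebra.h>

// Pick the linear-algebra backend once; the rest of the program then only
// refers to LA::MPI::SparseMatrix, LA::MPI::Vector, LA::SolverCG, ... and
// never names PETSc or Trilinos directly.
namespace LA
{
#ifdef USE_PETSC_LA
  using namespace dealii::LinearAlgebraPETSc;
#else
  using namespace dealii::LinearAlgebraTrilinos;
#endif
}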



Re: [deal.II] Memory loss in system solver

2020-07-24 Thread Wolfgang Bangerth

On 7/24/20 3:32 AM, Alberto Salvadori wrote:


It turns out that this code produces a memory loss, quite significant since I
am solving my system thousands of times, eventually causing the run to fail.
I am not sure what is causing this issue or how to solve it; maybe more
experienced users than myself can catch the problem with a snap of the
fingers.

I have verified the issue on my Mac (Catalina) as well as on Linux Ubuntu
(4.15.0), using deal.II 9.1.1.
Apparently the issue reveals itself only when MPI is invoked with more than
one processor, whereas it does not emerge when running in serial or with
mpirun -np 1.


Alberto -- I've taken a look at the SolverBicgstab class and don't see 
anything glaringly obvious that would suggest where the memory is lost. It's 
also funny that that would only happen with more than one processor because 
the memory handling of PETSc vectors shouldn't be any different for one or 
more processors.


Do you think you could come up with a simple test case that illustrates the 
problem? In your case, I'd start with the code you have and remove basically 
everything you do: replace the assembly by a function that just fills the 
matrix with the identity matrix (or something similarly simple), remove 
everything that does anything useful with the solution, remove graphical 
output, etc. The only thing that should remain is the loop that repeatedly 
solves a linear system and illustrates the memory leak, but the program no 
longer has to do anything useful (in fact, it probably shouldn't -- it should 
only exercise the one part you suspect of causing the memory leak).
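
Something as minimal as the following would do (a sketch only; system_matrix,
system_rhs, locally_owned_dofs and the function name are placeholders, and the
usual deal.II PETSc wrapper headers are assumed to be included):

// Stand-in for the real assembly: put a 1 on every locally owned diagonal
// entry and zero the right-hand side, so the only remaining work is the
// repeated linear solve. Diagonal entries are assumed to be present in the
// sparsity pattern, as they are for the usual FE sparsity patterns.
void fill_with_identity(PETScWrappers::MPI::SparseMatrix &system_matrix,
                        PETScWrappers::MPI::Vector       &system_rhs,
                        const IndexSet                   &locally_owned_dofs)
{
  system_matrix = 0;
  system_rhs    = 0;

  for (const auto i : locally_owned_dofs)
    system_matrix.set(i, i, 1.0);

  system_matrix.compress(VectorOperation::insert);
}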


I think that would make finding the root cause substantially simpler!

Best
 W.

--

Wolfgang Bangerth  email: bange...@colostate.edu
   www: http://www.math.colostate.edu/~bangerth/



[deal.II] Memory loss in system solver

2020-07-24 Thread Alberto Salvadori
Dear community

I have written the simple code below for solving a system using PETSc,
having defined 

Vector<double> incremental_displacement;
Vector<double> accumulated_displacement;

in the class LargeStrainMechanicalProblem_OneField.

It turns out that this code produces a memory loss, quite significant since I
am solving my system thousands of times, eventually causing the run to fail.
I am not sure what is causing this issue or how to solve it; maybe more
experienced users than myself can catch the problem with a snap of the
fingers.

I have verified the issue on my Mac (Catalina) as well as on Linux Ubuntu
(4.15.0), using deal.II 9.1.1.
Apparently the issue reveals itself only when MPI is invoked with more than
one processor, whereas it does not emerge when running in serial or with
mpirun -np 1.

Thanks in advance

Alberto

=




template <int dim>
unsigned int
LargeStrainMechanicalProblem_OneField<dim>::solve(
  const unsigned penaltyAmplification)

//
// this simplified version of solve has been written to find out
// the source of the memory leak in parallel
//

{

  // Copy the incremental displacement into a PETSc vector distributed
  // according to the locally owned dofs.
  PETScWrappers::MPI::Vector distributed_incremental_displacement(
    this->locally_owned_dofs, this->mpi_communicator);

  distributed_incremental_displacement = incremental_displacement;

  const size_t bicgstab_max_iterations = 2;

  const double tolerance = 1e-10 * this->system_rhs.l2_norm();

  unsigned solver_control_last_step;

  SolverControl bicgstab_solver_control(bicgstab_max_iterations, tolerance);

  PETScWrappers::PreconditionJacobi preconditioner(this->system_matrix);

  this->pcout << " Bicgstab " << std::flush;

  PETScWrappers::SolverBicgstab BiCG(bicgstab_solver_control,
                                     this->mpi_communicator);

  // This is the call after which top->RES shows the memory growth.
  BiCG.solve(this->system_matrix,
             distributed_incremental_displacement,
             this->system_rhs,
             preconditioner);

  solver_control_last_step = bicgstab_solver_control.last_step();

  // Copy the solution back, accumulate it, and distribute the constraints.
  incremental_displacement = distributed_incremental_displacement;
  accumulated_displacement += incremental_displacement;
  this->hanging_node_constraints.distribute(accumulated_displacement);

  return solver_control_last_step;

}
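
For completeness, the memory growth can also be measured from inside the
program rather than through top (a sketch using deal.II's
Utilities::System::get_memory_stats from <deal.II/base/utilities.h>; solve(0)
calls the function above, n_solves is a placeholder, and pcout is the usual
ConditionalOStream):

// Record the resident set size (VmRSS, reported in kB on Linux) before and
// after a batch of solves to see whether it grows with the number of solves.
Utilities::System::MemoryStats stats_before, stats_after;
Utilities::System::get_memory_stats(stats_before);

for (unsigned int i = 0; i < n_solves; ++i)
  solve(0);

Utilities::System::get_memory_stats(stats_after);
this->pcout << "VmRSS: " << stats_before.VmRSS << " kB -> "
            << stats_after.VmRSS << " kB" << std::endl;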
