Re: AW: [deal.II] step-42 clarification

2018-01-26 Thread Alberto Salvadori
Thank you, Timo. Your remarks have been very useful.
It turned out that I had made a mistake in the way the mesh was prepared;
specifically, some hanging nodes were not properly dealt with.
This also caused a related issue that I shared here some time ago (Nov.
25).

This leads to another question, which I take the opportunity to ask. Suppose
that a run is too long for the time slot allocated on a large-scale computer.
In such a case, one wants to restart the computation from a given time. To
this end, the history of the computation up to that time must be stored: all
data held in cell->user_pointer(), the data of the loads, and the mesh. How
can one store the latter correctly, including the partition and the hanging
nodes? I understand that saving the mesh in a "ucd" or similar format may
not be the right strategy.
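
For reference, parallel::distributed::Triangulation can serialize itself
together with attached data, preserving the refinement hierarchy (and
therefore the hanging nodes) and restoring a valid partition on reload.
Below is a minimal sketch of such a checkpoint/restart cycle; the function
names and file name are illustrative, the
prepare_for_serialization()/deserialize() interface of
parallel::distributed::SolutionTransfer is assumed (older releases spell
the former prepare_serialization()), and history data held behind
cell->user_pointer() would additionally have to be packed, for example via
Triangulation::register_data_attach():

  #include <deal.II/distributed/tria.h>
  #include <deal.II/distributed/solution_transfer.h>
  #include <deal.II/dofs/dof_handler.h>
  #include <deal.II/fe/fe.h>
  #include <deal.II/lac/petsc_parallel_vector.h>

  using namespace dealii;

  template <int dim>
  void checkpoint(parallel::distributed::Triangulation<dim> &triangulation,
                  const DoFHandler<dim>                     &dof_handler,
                  const PETScWrappers::MPI::Vector          &ghosted_solution)
  {
    // Attach the (ghosted) solution so it is written alongside the mesh.
    parallel::distributed::SolutionTransfer<dim, PETScWrappers::MPI::Vector>
      solution_transfer(dof_handler);
    solution_transfer.prepare_for_serialization(ghosted_solution);

    // Collectively writes mesh, refinement, partition, and attached data.
    triangulation.save("restart.checkpoint");
  }

  template <int dim>
  void restart(parallel::distributed::Triangulation<dim> &triangulation,
               DoFHandler<dim>                           &dof_handler,
               const FiniteElement<dim>                  &fe,
               PETScWrappers::MPI::Vector                &solution,
               MPI_Comm                                   mpi_communicator)
  {
    // The coarse mesh must have been recreated exactly as before;
    // load() then restores the refinement and repartitions the cells.
    triangulation.load("restart.checkpoint");
    dof_handler.distribute_dofs(fe);

    // deserialize() expects a non-ghosted vector sized for the new
    // DoF distribution.
    solution.reinit(dof_handler.locally_owned_dofs(), mpi_communicator);

    parallel::distributed::SolutionTransfer<dim, PETScWrappers::MPI::Vector>
      solution_transfer(dof_handler);
    solution_transfer.deserialize(solution);
  }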

Thank you very much.
Alberto

*Alberto Salvadori* Dipartimento di Ingegneria Civile, Architettura,
Territorio, Ambiente e di Matematica (DICATAM)
 Università di Brescia, via Branze 43, 25123 Brescia
 Italy
 tel 030 3711239
 fax 030 3711312

e-mail:
 alberto.salvad...@unibs.it
web-pages:
 http://m4lab.unibs.it/faculty.html
 http://dicata.ing.unibs.it/salvadori

On Fri, Jan 19, 2018 at 3:39 PM, Timo Heister  wrote:

> > in the code and re-implemented it. In the serial version, all works fine
> > so far. However, when running in parallel, I am seeing an issue in the
> > method PlasticityContactProblem::update_solution_and_constraints.
> >
> > In particular, it turns out that the value of
> >
> >   const unsigned int index_z = dof_indices[q_point];
> >
> > might be out of the range of
>
> If you do a loop over all locally owned and locally relevant cells,
> then all DoF values of a ghosted vector should exist. If you see an
> error, something else must be incorrect (like the IndexSets).
>
> >   PETScWrappers::MPI::Vector lambda(this->locally_relevant_dofs,
> >                                     this->mpi_communicator);
>
> This looks suspicious. Does this really create a ghosted vector in
> PETSc? I thought this would fail (at least in debug mode).
>
> Finally, it looks like you modified it to only look at locally owned
> cells to build constraints. The problem with this is that processors
> also need to know about constraints on ghost cells, not only locally
> owned cells. You no longer compute them, which means the solution
> might become incorrect around processor boundaries. It probably
> (hopefully?) works without adaptivity because each locally owned DoF
> is within at least one locally owned cell, but imagine a case where a
> DoF on a ghost cell is constrained and interacts with a hanging node
> the current processor owns. You will not handle this case correctly.
>
> I don't quite remember if there is an easy way to do this, but I
> remember writing a debug function that checks if a ConstraintMatrix is
> consistent in parallel. This was a while back, but I can try to find
> it.
>
> --
> Timo Heister
> http://www.math.clemson.edu/~heister/
>
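
To illustrate the point about the suspicious constructor call above:
passing only the locally relevant IndexSet makes PETSc treat that
(overlapping) set as the locally owned range, rather than creating a
ghosted vector. A minimal sketch of the usual construction, assuming a
dof_handler and mpi_communicator as in step-42; the ConstraintMatrix
setup covering ghost cells, as discussed above, is included as well:

  #include <deal.II/dofs/dof_tools.h>
  #include <deal.II/lac/constraint_matrix.h>
  #include <deal.II/lac/petsc_parallel_vector.h>

  IndexSet locally_owned_dofs = dof_handler.locally_owned_dofs();
  IndexSet locally_relevant_dofs;
  DoFTools::extract_locally_relevant_dofs(dof_handler,
                                          locally_relevant_dofs);

  // Non-ghosted vector: each process stores exactly the entries it owns.
  PETScWrappers::MPI::Vector distributed(locally_owned_dofs,
                                         mpi_communicator);

  // Ghosted vector: owned entries plus read-only copies of the ghost
  // entries, so loops over locally owned and ghost cells can read all
  // of their DoF values.
  PETScWrappers::MPI::Vector ghosted(locally_owned_dofs,
                                     locally_relevant_dofs,
                                     mpi_communicator);

  // Constraints should cover all locally relevant DoFs, including those
  // on ghost cells, not just the locally owned ones.
  ConstraintMatrix constraints;
  constraints.reinit(locally_relevant_dofs);
  DoFTools::make_hanging_node_constraints(dof_handler, constraints);
  constraints.close();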




Re: [deal.II] Re: installation error

2018-01-26 Thread Wolfgang Bangerth

On 01/26/2018 06:59 AM, Juan Carlos Araujo Cabarcas wrote:


From the deal.II detailed.log I see that:

MPI_VERSION = 2.1
OMPI_VERSION = 1.6.5

From the terminal:

$ mpicc --version
gcc (Ubuntu 4.8.4-2ubuntu1~14.04.3) 4.8.4

So it seems I need to upgrade to MPI 3.0.
Thanks for having a look at this.


That's one option. You could also try to modify the place in deal.II where we 
use MPI_Comm_create_group to use MPI_Comm_create instead. The difference 
between the two functions is minor, and it may be useful to keep our interface 
compatible with MPI 2.x. Any patch would of course be appreciated!
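
For reference, a sketch of the difference between the two calls; the
variable names are adapted from the build error quoted elsewhere in this
thread and are hypothetical stand-ins for the surrounding code:

  #include <mpi.h>

  MPI_Group group_union;         // some previously created process group
  MPI_Comm  group_communicator;  // hypothetical name for the result

  // MPI 3.0: collective only over the processes contained in
  // group_union; the tag (here 5) disambiguates concurrent creations.
  int ierr = MPI_Comm_create_group(MPI_COMM_WORLD, group_union, 5,
                                   &group_communicator);

  // MPI 2.x alternative: produces the same communicator, but the call
  // is collective over all of MPI_COMM_WORLD, so every process must
  // reach it, not just those in group_union. Processes outside
  // group_union receive MPI_COMM_NULL.
  ierr = MPI_Comm_create(MPI_COMM_WORLD, group_union,
                         &group_communicator);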


Best
 W.

--

Wolfgang Bangerth  email: bange...@colostate.edu
                   www: http://www.math.colostate.edu/~bangerth/



Re: [deal.II] Re: installation error

2018-01-26 Thread Juan Carlos Araujo Cabarcas
Hi, when I install PETSc I let the configuration process download the 
libraries it needs: 

./config/configure.py --with-shared=1 --with-x=0 --with-mpi=1 
--with-scalar-type=complex --download-mumps --download-metis 
--download-parmetis --download-superlu_dist --download-blacs 
--download-scalapack

From the deal.II detailed.log I see that:

MPI_VERSION = 2.1
OMPI_VERSION = 1.6.5

From the terminal:

$ mpicc --version
gcc (Ubuntu 4.8.4-2ubuntu1~14.04.3) 4.8.4

So it seems I need to upgrade to MPI 3.0.
Thanks for having a look at this.
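
In case it is useful: the MPI standard version supported by an
installation can be checked directly, either via the MPI_VERSION /
MPI_SUBVERSION preprocessor macros or at run time. A small
self-contained check (compile with mpicxx):

  #include <mpi.h>
  #include <iostream>

  int main(int argc, char **argv)
  {
    MPI_Init(&argc, &argv);

    int major = 0, minor = 0;
    MPI_Get_version(&major, &minor);  // e.g. 2.1 for the OpenMPI 1.6.5 above
    std::cout << "Supports MPI standard " << major << '.' << minor
              << std::endl;

    MPI_Finalize();
    return 0;
  }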

On Thursday, January 25, 2018 at 12:29:53 (UTC-5), Wolfgang Bangerth 
wrote:
>
> On 01/24/2018 07:34 AM, Juan Carlos Araujo Cabarcas wrote: 
> > Please find the file: detailed.log attached. 
> > 
> > On Tuesday, January 23, 2018 at 17:02:14 (UTC-5), Wolfgang Bangerth 
> > wrote: 
> > 
> >     On 01/23/2018 02:13 PM, Bruno Turcksin wrote: 
> >     > mypath/dealii/source/lac/scalapack.cc:243:91: error: there are no 
> >     > arguments to ‘MPI_Comm_create_group’ that depend on a template 
> >     > parameter, so a declaration of ‘MPI_Comm_create_group’ must be 
> >     > available [-fpermissive] 
> >     >   ierr = MPI_Comm_create_group(MPI_COMM_WORLD, group_union, 5, 
> >     >          _communicator_union); 
>
> It confused me that you are compiling the scalapack file but get an 
> error message that a particular MPI function was not found. This would 
> ordinarily suggest that either scalapack.cc is missing an #include of 
> <mpi.h> (which is not the case) or that your installation is not 
> configured with MPI (which is not the case for you). So the error did 
> not make sense to me at first. 
>
> But it turns out that the call in question, MPI_Comm_create_group, is a 
> function that was only added to MPI in version 3.0. Apparently all the 
> rest of us use an MPI installation that is sufficiently up to date. What 
> is the version you use? 
>
> Best 
>   W. 
>
> -- 
> Wolfgang Bangerth  email: bang...@colostate.edu 
>                    www: http://www.math.colostate.edu/~bangerth/ 
>



Re: [deal.II] Re: installation error

2018-01-26 Thread Juan Carlos Araujo Cabarcas
Hi, when I install PETSc I let the configuration process download the 
libraries it needs: 

./config/configure.py --with-shared=1 --with-x=0 --with-mpi=1 
--with-scalar-type=complex --download-mumps --download-metis 
--download-parmetis --download-superlu_dist --download-blacs 
--download-scalapack

From the deal.II detailed.log I see that:

MPI_VERSION = 2.1
OMPI_VERSION = 1.6.5

From the terminal:

$ mpicc --version
gcc (Ubuntu 4.8.4-2ubuntu1~14.04.3) 4.8.4

Shall I update my system libraries?

On Thursday, January 25, 2018 at 12:29:53 (UTC-5), Wolfgang Bangerth 
wrote:
>
> On 01/24/2018 07:34 AM, Juan Carlos Araujo Cabarcas wrote: 
> > Please find the file: detailed.log attached. 
> > 
> > On Tuesday, January 23, 2018 at 17:02:14 (UTC-5), Wolfgang Bangerth 
> > wrote: 
> > 
> >     On 01/23/2018 02:13 PM, Bruno Turcksin wrote: 
> >     > mypath/dealii/source/lac/scalapack.cc:243:91: error: there are no 
> >     > arguments to ‘MPI_Comm_create_group’ that depend on a template 
> >     > parameter, so a declaration of ‘MPI_Comm_create_group’ must be 
> >     > available [-fpermissive] 
> >     >   ierr = MPI_Comm_create_group(MPI_COMM_WORLD, group_union, 5, 
> >     >          _communicator_union); 
>
> It confused me that you are compiling the scalapack file but get an 
> error message that a particular MPI function was not found. This would 
> ordinarily suggest that either scalapack.cc is missing an #include of 
> <mpi.h> (which is not the case) or that your installation is not 
> configured with MPI (which is not the case for you). So the error did 
> not make sense to me at first. 
>
> But it turns out that the call in question, MPI_Comm_create_group, is a 
> function that was only added to MPI in version 3.0. Apparently all the 
> rest of us use an MPI installation that is sufficiently up to date. What 
> is the version you use? 
>
> Best 
>   W. 
>
> -- 
> Wolfgang Bangerth  email: bang...@colostate.edu 
>                    www: http://www.math.colostate.edu/~bangerth/ 
>
