Re: [deal.II] Re: mapping_collection and step-27

2017-12-05 Thread Wolfgang Bangerth

On 12/05/2017 05:47 PM, Juan Carlos Araujo Cabarcas wrote:


On my topic, I have done as usual:
 git clone https://github.com/dealii/dealii.git dealii
and proceeded with the configuration/installation of the library, only to 
notice that I still get an error:


When did you clone the repository? The patch was merged earlier today. You can 
check by running

  git log

and looking at which patches were merged most recently. Does #5579 show up?
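
(For example, assuming GitHub's usual "Merge pull request #5579 from ..." 
merge commit message, something like

  git log --oneline | grep 5579

will show that commit if it is present.)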

It's of course also possible that I fixed one issue but not all.

Best
 W.


--

Wolfgang Bangerth  email: bange...@colostate.edu
   www: http://www.math.colostate.edu/~bangerth/



Re: [deal.II] Re: mapping_collection and step-27

2017-12-05 Thread Juan Carlos Araujo Cabarcas
Thanks for your email and the encouragement; I will keep reporting these 
issues for sure!
This thread was a nice lesson on templates!

On my topic, I have done as usual:
git clone https://github.com/dealii/dealii.git dealii
and proceeded with the configuration/installation of the library, only to 
notice that I still get an error:

Linking CXX executable pFEM
/mypatch/pFEM.cc:370: error: undefined reference to 'void 
dealii::VectorTools::interpolate_boundary_values<2, 2, std::complex<double> >(
dealii::hp::MappingCollection<2, 2> const&, dealii::hp::DoFHandler<2, 2> const&, 
std::map<unsigned int, dealii::Function<2, std::complex<double> > const*, 
std::less<unsigned int>, std::allocator<std::pair<unsigned int const, 
dealii::Function<2, std::complex<double> > const*> > > const&, 
std::map<unsigned int, std::complex<double>, std::less<unsigned int>, 
std::allocator<std::pair<unsigned int const, std::complex<double> > > >&, 
dealii::ComponentMask const&)'
collect2: error: ld returned 1 exit status
make[2]: *** [pFEM] Error 1
make[1]: *** [CMakeFiles/pFEM.dir/all] Error 2
make: *** [all] Error 2

As if nothing had changed. So I have a few questions:
1) Is it the same error?
2) If 1) is true, then isn't https://github.com/dealii/dealii/pull/5579 
merged into https://github.com/dealii/dealii.git?
3) If 2) is false, I would like to kindly ask how to apply the patch. I 
tried a few things from a git tutorial, but they did not seem to work.
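
For reference, one way to try out a not-yet-merged pull request is to fetch 
it from GitHub into a local branch and build that -- a sketch, assuming the 
clone's default remote name "origin":

  git fetch origin pull/5579/head:pr-5579
  git checkout pr-5579

followed by reconfiguring and rebuilding the library.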

Thanks in advance,
Juan Carlos Araújo, Umeå Universitet


On Monday, December 4, 2017 at 13:09:28 (UTC-5), Wolfgang Bangerth 
wrote:
>
> On 12/01/2017 08:36 PM, Juan Carlos Araujo Cabarcas wrote: 
> > 
> > path/petsc_eigs/pFEM.cc:370: error: undefined reference to 'void 
> > dealii::VectorTools::interpolate_boundary_values<2, 2, std::complex<double> >(
> > dealii::hp::MappingCollection<2, 2> const&, dealii::hp::DoFHandler<2, 2> const&, 
> > std::map<unsigned int, dealii::Function<2, std::complex<double> > const*, 
> > std::less<unsigned int>, std::allocator<std::pair<unsigned int const, 
> > dealii::Function<2, std::complex<double> > const*> > > const&, 
> > std::map<unsigned int, std::complex<double>, std::less<unsigned int>, 
> > std::allocator<std::pair<unsigned int const, std::complex<double> > > >&, 
> > dealii::ComponentMask const&)' 
>
> Now that the compiler is happy with your code, this is the first real 
> bug -- we don't instantiate this function. The patch is here: 
> https://github.com/dealii/dealii/pull/5579 
>
> I appreciate you trying all of these things with complex-valued vectors. 
> It's hard to find all of the relevant places that need to be changed and 
> fixed. Keep these issues coming! 
>
> Best 
>   W. 
>
> -- 
>  
> Wolfgang Bangerth  email: bang...@colostate.edu 
>  
> www: http://www.math.colostate.edu/~bangerth/ 
>
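
As an aside, a minimal illustration of the failure mode described in the 
quoted reply -- a template whose definition lives in a .cc file and is 
explicitly instantiated only for some types (hypothetical names, not 
deal.II's actual sources):

  // header.h -- declaration only; the definition lives in a .cc file:
  template <typename Number> void interpolate(Number value);

  // source.cc -- definition plus explicit instantiations:
  #include "header.h"
  #include <complex>
  template <typename Number> void interpolate(Number value) { /* ... */ }
  template void interpolate<double>(double);  // instantiated; links fine
  // There is no instantiation for std::complex<double>, so a call with a
  // std::complex<double> argument in another translation unit compiles but
  // fails at link time with exactly this kind of "undefined reference".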



[deal.II] Re: Some cells have negative volume error

2017-12-05 Thread Bruno Turcksin
Narendra,

On Tuesday, December 5, 2017 at 11:27:15 AM UTC-5, Narendra Nanal wrote:
>
>
> When the input file is imported in Abaqus it is working fine. Please see 
> the attached pictures. I am not sure if something is wrong with the mesh. 
> Kindly guide me on how to resolve this error. Thanks in advance.
>
You need to simplify the mesh as much as possible, i.e., decrease the number 
of cells, to understand what is going on. Right now, there are way too many 
cells to find out which one is wrong.
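
If a reduced version of the mesh can be read at all, a loop like the 
following may also help to locate bad cells directly (a sketch; 
"reduced_mesh.inp" is a hypothetical file name):

  #include <deal.II/grid/grid_in.h>
  #include <deal.II/grid/tria.h>
  #include <fstream>
  #include <iostream>

  int main()
  {
    dealii::Triangulation<3> triangulation;
    dealii::GridIn<3>        grid_in;
    grid_in.attach_triangulation(triangulation);
    std::ifstream input("reduced_mesh.inp");
    grid_in.read_abaqus(input);

    // Report every cell whose measure is not positive:
    for (const auto &cell : triangulation.active_cell_iterators())
      if (cell->measure() <= 0.)
        std::cout << "Suspicious cell centered at " << cell->center()
                  << std::endl;
  }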

Best,

Bruno



Re: [deal.II] Re: Crack propagation

2017-12-05 Thread Timo Heister
> If I use your user element, do I have to use Open MPI? Now I have some issues
> with Open MPI in deal.II

What do you mean by "user element"? The example code in question
requires deal.II to be configured with MPI. Which MPI implementation you use
(OpenMPI, MPICH, ...) is up to you.

-- 
Timo Heister
http://www.math.clemson.edu/~heister/



Re: [deal.II] error in SolverFGMRES related destructor

2017-12-05 Thread SebG
Dear Timo, dear Wolfgang,

thank you for your replies. Lesson learned: ILU with the original 
sparsity pattern is not a good idea for the convection-diffusion block. I 
added the grad-div term so that I can just use the pressure mass matrix 
for the Schur complement.

Further, I modified my code so that the solver for the convection-diffusion 
block can be switched between SparseDirectUMFPACK and a GMRES solver with 
AMG preconditioning. Right now I use the class PreconditionAMG from the 
TrilinosWrappers namespace with a W-cycle and a Gauss-Seidel smoother (10 
steps), as sketched below.
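
For concreteness, here is roughly what that setup looks like (the field 
names are those of TrilinosWrappers::PreconditionAMG::AdditionalData; 
velocity_matrix stands in for the assembled convection-diffusion block):

  #include <deal.II/lac/trilinos_precondition.h>
  #include <deal.II/lac/trilinos_sparse_matrix.h>

  using namespace dealii;

  void setup_amg(const TrilinosWrappers::SparseMatrix &velocity_matrix,
                 TrilinosWrappers::PreconditionAMG    &amg_preconditioner)
  {
    TrilinosWrappers::PreconditionAMG::AdditionalData amg_data;
    amg_data.w_cycle         = true;            // W-cycle instead of a V-cycle
    amg_data.smoother_type   = "Gauss-Seidel";  // ML smoother name
    amg_data.smoother_sweeps = 10;              // ten smoothing steps
    amg_preconditioner.initialize(velocity_matrix, amg_data);
  }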

It works quite well for very low Reynolds number flows. However, if the 
Reynolds number is increased, the GMRES solver requires more and more 
iterations; see the files attached. My guess is that the reason is the 
lack of a stabilization such as streamline diffusion that would prevent 
oscillations on coarse grids (see, e.g., chapters 6 and 7 of the book by 
Elman, Silvester and Wathen).

Can you confirm my guess from your experience, or do I just need to tune 
the settings of the AMG preconditioner?

Best wishes,
Sebastian

On Monday, November 27, 2017 at 17:41:37 UTC+1, Wolfgang Bangerth wrote:
>
>
> Seb, 
>
> > of course I am willing to share my code. You can find it in the file 
> > attached. The parameter file is configured such that the 
> > GrowingVectorMemory error occurs. 
>
> Thanks. I've tried this with the current development version of deal.II 
> and the error disappears. I'm pretty sure I know why this is so -- the 
> solver does not converge, so it throws an exception, and that leads to a 
> vector that had been allocated not being freed (because we bypass the 
> memory_pool.free(...) call due to the exception). In the next step, the 
> memory pool object is destroyed and complains that a vector that had 
> been allocated was not freed, and that's why you get that error 
> before anything else. (Anything else = the convergence error.) 
>
> I fixed this a while back in the development version, though. Probably 
> here: 
> https://github.com/dealii/dealii/pull/4953 
>
> So with the current version, I only get to see the error about 
> non-convergence: 
>
> An error occurred in line <1052> of file 
>  
> in function 
>  void dealii::SolverGMRES<VectorType>::solve(const MatrixType&, 
> VectorType&, const VectorType&, const PreconditionerType&) [with 
> MatrixType = dealii::SparseMatrix<double>; PreconditionerType = 
> dealii::SparseILU<double>; VectorType = dealii::Vector<double>] 
> The violated condition was: 
>  iteration_state == SolverControl::success 
> Additional information: 
> Iterative method reported convergence failure in step 1000. The residual 
> in the last step was 0.0018243. 
>
> [...] 
>
>
> > I think the error is due to a 
> > convergence failure of SolverGMRES inside the method 
> > BlockSchurPreconditioner::vmult. In this method the convection-diffusion 
> > system ((0,0)-block) is solved with GMRES and ILU-preconditioning. 
> > 
> > I investigated the behaviour of the preconditioner further. If the 
> > Reynolds number is decreased to, say, 100, the iterative solver for the 
> > convection-diffusion system converges. I am not an expert, but does 
> > ILU-preconditioning not work for larger Reynolds numbers? I thought ILU 
> > is robust (and expensive) and should be a good first choice. 
>
> No -- at least not with the default settings. For high-Re cases, you 
> need to fill more off-diagonal entries in the ILU for it to be good. 
> There is a recent discussion on exactly this issue on the mailing list. 
>
>
> > As a second approach, I used a direct solver (SparseDirectUMFPACK) for 
> > the convection-diffusion matrix like in step-57. In this case, the issue 
> > with GMRES and ILU does not occur. Then, the FGMRES method converges for 
> > moderate Reynolds numbers of Re=200. However, for Re=400 convergence is 
> > not achieved anymore. I guess this is due to the bad approximation of 
> > the Schur complement by the pressure mass matrix. In step-57, using the 
> > pressure mass matrix somehow also works when solving the system for 
> > higher Reynolds numbers. Is this due to the Augmented Lagrange approach? 
>
> Probably. 
>
> Best 
>   W. 
>
> -- 
>  
> Wolfgang Bangerth  email: bang...@colostate.edu 
>  
> www: http://www.math.colostate.edu/~bangerth/ 
>
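
As an aside, the non-convergence exception mentioned in the quoted reply can 
also be caught explicitly, so that an inner solver failure does not abort 
the program. A sketch -- solver, matrix, solution, rhs, and preconditioner 
stand in for the corresponding objects in the code under discussion:

  #include <deal.II/lac/solver_control.h>
  #include <iostream>

  try
    {
      solver.solve(matrix, solution, rhs, preconditioner);
    }
  catch (const dealii::SolverControl::NoConvergence &exc)
    {
      // exc.last_step and exc.last_residual report where the iteration
      // stopped, so the failure can be logged or handled gracefully.
      std::cerr << "Inner solver failed after " << exc.last_step
                << " steps; last residual " << exc.last_residual << std::endl;
    }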

[Excerpt from the attached log:]
Cycle 0:
   Number of active cells: 1024
   Number of degrees of freedom: 9539
   Number of velocity degrees of freedom: 8450
   Number of pressure degrees of freedom: 1089

[deal.II] Re: Moving vertices of parallel triangulation breaks periodic face pair match

2017-12-05 Thread Sambit Das
Dear Dr. Arndt,

I am using GridTools::collect_periodic_faces(..) as a sanity check 
after moving the triangulation; I do not set the periodicity constraints 
again. The documentation also mentions that "it is possible to call this 
function several times with different boundary ids to generate a vector 
with all periodic pairs". Moreover, in my actual application I never do 
refinement, as I read a pre-generated mesh file, and a similar error 
occurs there after moving the triangulation.

I did two more checks in the minimal example:
1) When not calling GridTools::collect_periodic_faces(..) after refinement, 
I do not get any error messages. But is there a way to check whether the 
periodic match still holds in the moved triangulation without calling 
GridTools::collect_periodic_faces(..)?
2) When placing the GridTools::collect_periodic_faces(..) call before 
moving the triangulation but after refinement, it worked fine in serial 
and in parallel, which suggests that something breaks after the movement.

Best,
Sambit

>
>
> your minimal example fails because you are calling 
>   GridTools::collect_periodic_faces(triangulation, /*b_id1*/ 2*i+1, 
> /*b_id2*/ 2*i+2, /*direction*/ i, periodicity_vector);
> again after 
>   triangulation.refine_global(2);
> As explained in the documentation, this is not unexpected.
>
> Does your issue persist after making sure to call
>   GridTools::collect_periodic_faces(triangulation, /*b_id1*/ 2*i+1, 
> /*b_id2*/ 2*i+2,/*direction*/ i, periodicity_vector);
> only before mesh refinement?
>
> Best,
> Daniel
>



Re: [deal.II] Re: Cmake-MPI

2017-12-05 Thread Wolfgang Bangerth

On 12/05/2017 08:05 AM, feap...@gmail.com wrote:


I would like to ask: is it possible to delete all OpenMPI installations in 
Ubuntu and then install a new one?


That depends on how you installed them. If you used a package manager, you can 
uninstall packages through the package manager.
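
For example, on Ubuntu something like

  dpkg -l | grep -i openmpi

lists the installed OpenMPI packages, and

  sudo apt-get remove --purge openmpi-bin libopenmpi-dev

removes them (a sketch; the exact package names vary between releases and 
depend on what you installed).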



The worst solution may be to reinstall the Ubuntu system as before, since I 
have installed several versions of OpenMPI.


In the worst case yes. It's a pretty bad idea to have multiple versions of MPI 
on a system, as has been remarked many times on the mailing list.


Best
 W.

--

Wolfgang Bangerth  email: bange...@colostate.edu
   www: http://www.math.colostate.edu/~bangerth/



Re: [deal.II] Re: Cmake-MPI

2017-12-05 Thread feapman
Dear Prof. Bangerth,

Many thanks for your help!

I would like to ask: is it possible to delete all OpenMPI installations in 
Ubuntu and then install a new one?

The worst solution may be to reinstall the Ubuntu system as before, since I 
have installed several versions of OpenMPI.

Kind regards,
Yaakov



On Tuesday, December 5, 2017 at 2:20:56 AM UTC+1, Wolfgang Bangerth wrote:
>
> On 12/04/2017 06:13 PM, fea...@gmail.com  wrote: 
> > 
> > /usr/include/petsc/petscsys.h:165:6: error: #error "PETSc was configured 
> > with OpenMPI but now appears to be compiling using a non-OpenMPI mpi.h" 
> >   #error "PETSc was configured with OpenMPI but now appears to be 
> > compiling using a non-OpenMPI mpi.h" 
> >^ 
> > source/numerics/CMakeFiles/obj_numerics_release.dir/build.make:62: 
> > recipe for target 
> > 'source/numerics/CMakeFiles/obj_numerics_release.dir/data_out.cc.o' 
> failed 
> > make[2]: *** 
> > [source/numerics/CMakeFiles/obj_numerics_release.dir/data_out.cc.o] 
> Error 1 
> > CMakeFiles/Makefile2:2989: recipe for target 
> > 'source/numerics/CMakeFiles/obj_numerics_release.dir/all' failed 
> > make[1]: *** [source/numerics/CMakeFiles/obj_numerics_release.dir/all] 
> > Error 2 
> > Makefile:129: recipe for target 'all' failed 
> > make: *** [all] Error 2 
> > 
> > I think I don't use the PETSc option for deal.II ... 
> > 
> > Would you let me know how I can solve this new bug? 
>
> But clearly it is installed on your system, and using your previous 
> installation of MPI. 
>
> It is generally a poor idea to have multiple versions of MPI on the same 
> system. In your case, get rid of all of them, and then of all of the 
> packages you have built on top of these MPI installations. Then start 
> from scratch, pick one MPI package, and start rebuilding all of the 
> packages that need it. 
>
> Best 
>   W. 
>
> -- 
>  
> Wolfgang Bangerth  email: bang...@colostate.edu 
>  
> www: http://www.math.colostate.edu/~bangerth/ 
>



[deal.II] Re: Moving vertices of parallel triangulation breaks periodic face pair match

2017-12-05 Thread Daniel Arndt
Sambit,

your minimal example fails because you are calling 
  GridTools::collect_periodic_faces(triangulation, /*b_id1*/ 2*i+1, 
/*b_id2*/ 2*i+2, /*direction*/ i, periodicity_vector);
again after 
  triangulation.refine_global(2);
As explained in the documentation, this is not unexpected.

Does your issue persist after making sure to call
  GridTools::collect_periodic_faces(triangulation, /*b_id1*/ 2*i+1, 
/*b_id2*/ 2*i+2,/*direction*/ i, periodicity_vector);
only before mesh refinement?
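
That is, collect the periodic face pairs and register them on the coarse 
mesh, and refine only afterwards. A sketch, assuming a 3d 
parallel::distributed::Triangulation named triangulation with the boundary 
ids used above:

  std::vector<GridTools::PeriodicFacePair<
    parallel::distributed::Triangulation<3>::cell_iterator>> periodicity_vector;

  for (unsigned int i = 0; i < 3; ++i)
    GridTools::collect_periodic_faces(triangulation,
                                      /*b_id1*/ 2 * i + 1,
                                      /*b_id2*/ 2 * i + 2,
                                      /*direction*/ i,
                                      periodicity_vector);

  triangulation.add_periodicity(periodicity_vector);
  triangulation.refine_global(2);  // refine only after registering periodicity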

Best,
Daniel
