Re: [deal.II] Re: Crack propagation

2018-02-19 Thread Thomas Wick
Dear Yaakov, which article do you mean? Please give the exact reference including author names. I do not know a priori whether they have different material parameters, another stress-strain splitting, etc. The reason for different results can be anything. One needs to do a careful 1-by-1

Re: [deal.II] Re: Error in installing deal.ii with spack on linux openSuse

2018-02-19 Thread Denis Davydov
Hi Alberto, Just a quick explanation of why this happens: this is a current limitation in how Spack tries to find a configuration of packages that satisfies the input request. At the point when you type: spack install dealii it does not yet know which version to use (you may have your custom setti

Re: [deal.II] Re: step-22 partial boundary conditions

2018-02-19 Thread Wolfgang Bangerth
Jane, Firstly, I decided not to use the normal vector for now. Since the normal vector is 2D, I wasn't sure how to implement the rest so that it is a 'double', since my g (Neumann condition vector) has 3 components while the normal vector will only have 2? I'm not sure I understand this -- in

Re: [deal.II] Re: Error in installing deal.ii with spack on linux openSuse

2018-02-19 Thread Alberto Salvadori
Thank you! Alberto Salvadori, Associate Professor, DICATAM, University of Brescia, Italy > On 19 Feb 2018, at 16:06, Bruno Turcksin wrote: > > Alberto, >> On Monday, February 19, 2018 at 9:56:25 AM UTC-5, Alberto Salvadori wrote: >> dealii requires cmake version :3.9.99, but spec asked for 3

Re: [deal.II] Issue with convergence of iterative linear solver for system matrix in modified step-57 with no-normal flux constraints

2018-02-19 Thread Bruno Blais
Hello, Sorry, I feel I have not explained myself correctly. Here is a drawing of the case, where Ux may be a profile or a constant. Initially I had set up that cas

[deal.II] Re: GridTools::collect_periodic_faces has problem with meshes generated by Gmsh or Abaqus when running in parallel.

2018-02-19 Thread Hamed Babaei
Dear Dr. Arndt, Thank you very much for your help. The problem was resolved by applying your comment: I was setting the indicators while looping over the locally owned cells only, not over all the cells. Best regards, Hamed On Saturday, February 17, 2018 at 11:20:32 AM UTC-6, Daniel Arndt wrote: > > Hamed, > > for c
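
A minimal sketch of the fix described above, assuming a parallel::distributed::Triangulation and hypothetical boundary ids 2 and 3 on the faces x = 0 and x = 1: the indicators are set on all cells, with no is_locally_owned() filter, before GridTools::collect_periodic_faces is called.

```cpp
#include <deal.II/base/geometry_info.h>
#include <deal.II/distributed/tria.h>
#include <deal.II/grid/grid_tools.h>

#include <cmath>
#include <vector>

using namespace dealii;

template <int dim>
void setup_periodicity(parallel::distributed::Triangulation<dim> &triangulation)
{
  // Set the boundary indicators on *all* cells, not only the locally owned
  // ones, so that ghost and artificial cells know about them as well.
  for (const auto &cell : triangulation.active_cell_iterators())
    for (unsigned int f = 0; f < GeometryInfo<dim>::faces_per_cell; ++f)
      if (cell->face(f)->at_boundary())
        {
          const double x = cell->face(f)->center()[0];
          if (std::abs(x - 0.0) < 1e-12)      // hypothetical left boundary
            cell->face(f)->set_boundary_id(2);
          else if (std::abs(x - 1.0) < 1e-12) // hypothetical right boundary
            cell->face(f)->set_boundary_id(3);
        }

  // Match the two periodic boundaries in the x-direction and hand the pairs
  // over to the triangulation.
  std::vector<GridTools::PeriodicFacePair<
    typename parallel::distributed::Triangulation<dim>::cell_iterator>>
    matched_pairs;
  GridTools::collect_periodic_faces(triangulation, 2, 3, /*direction=*/0,
                                    matched_pairs);
  triangulation.add_periodicity(matched_pairs);
}
```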

Re: [deal.II] Issue with convergence of iterative linear solver for system matrix in modified step-57 with no-normal flux constraints

2018-02-19 Thread Timo Heister
> Does the order in which I apply the nonzero and zero constraints matter? These are two independent objects, so no. > Currently I apply the inlet and then the no-slip in the nonzero_constraints, > thus the bottom and top wall appear after the inlet. Afterward the cylinder > is put in the zero co
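
A sketch of the point above, that the two containers are filled independently and can be set up in any order, using the ConstraintMatrix class current at the time; the boundary ids, the inlet-velocity function, and the exact mix of conditions are hypothetical placeholders rather than the poster's actual setup.

```cpp
#include <deal.II/base/function.h>
#include <deal.II/dofs/dof_handler.h>
#include <deal.II/dofs/dof_tools.h>
#include <deal.II/fe/fe_values.h>
#include <deal.II/lac/constraint_matrix.h>
#include <deal.II/numerics/vector_tools.h>

#include <set>

using namespace dealii;

template <int dim>
void make_constraints(const DoFHandler<dim> &dof_handler,
                      const Function<dim>   &inlet_velocity, // hypothetical profile
                      ConstraintMatrix      &nonzero_constraints,
                      ConstraintMatrix      &zero_constraints)
{
  const FEValuesExtractors::Vector velocities(0);
  const ComponentMask velocity_mask =
    dof_handler.get_fe().component_mask(velocities);
  const types::boundary_id inlet_id = 1, wall_id = 2, cylinder_id = 3; // hypothetical

  // Constraints carrying the actual (nonzero) boundary values, used for the
  // initial Newton iterate:
  nonzero_constraints.clear();
  DoFTools::make_hanging_node_constraints(dof_handler, nonzero_constraints);
  VectorTools::interpolate_boundary_values(
    dof_handler, inlet_id, inlet_velocity, nonzero_constraints, velocity_mask);
  VectorTools::interpolate_boundary_values(
    dof_handler, cylinder_id, ZeroFunction<dim>(dim + 1), nonzero_constraints,
    velocity_mask);
  VectorTools::compute_no_normal_flux_constraints(
    dof_handler, 0, std::set<types::boundary_id>{wall_id}, nonzero_constraints);
  nonzero_constraints.close();

  // Homogeneous constraints for the Newton updates. This object is built
  // completely independently of the one above, so the order in which the
  // boundary ids are added (or in which the two objects are filled) does not
  // matter.
  zero_constraints.clear();
  DoFTools::make_hanging_node_constraints(dof_handler, zero_constraints);
  for (const types::boundary_id id : {inlet_id, cylinder_id})
    VectorTools::interpolate_boundary_values(
      dof_handler, id, ZeroFunction<dim>(dim + 1), zero_constraints,
      velocity_mask);
  VectorTools::compute_no_normal_flux_constraints(
    dof_handler, 0, std::set<types::boundary_id>{wall_id}, zero_constraints);
  zero_constraints.close();
}
```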

Re: [deal.II] Usage of MPI_Bcast with deal.II

2018-02-19 Thread Timo Heister
Maxi, there is nothing special about MPI communication "with deal.II", it is just using MPI. I don't see enough information from your code snippet, so you have to investigate yourself. Some pointers: - make sure all processors participate - make sure module.data_size is the number of doubles, not
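
A minimal sketch of those two pointers, with a std::vector&lt;double&gt; as a hypothetical stand-in for the poster's module.data / module.data_size members: every rank must execute the MPI_Bcast calls, and the count argument is the number of MPI_DOUBLE entries, not a byte count.

```cpp
#include <mpi.h>

#include <vector>

void broadcast_pulse_data(std::vector<double> &data, const MPI_Comm mpi_communicator)
{
  // Broadcast the size first, so that the non-root ranks can allocate the
  // receive buffer ...
  unsigned long n_values = data.size();
  MPI_Bcast(&n_values, 1, MPI_UNSIGNED_LONG, /*root=*/0, mpi_communicator);
  data.resize(n_values);

  // ... then broadcast the values themselves. Both calls are executed by
  // every rank in the communicator, and the count is the number of doubles.
  MPI_Bcast(data.data(), static_cast<int>(n_values), MPI_DOUBLE, /*root=*/0,
            mpi_communicator);
}
```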

[deal.II] Re: Error in installing deal.ii with spack on linux openSuse

2018-02-19 Thread Bruno Turcksin
Alberto, On Monday, February 19, 2018 at 9:56:25 AM UTC-5, Alberto Salvadori wrote: > > dealii requires cmake version :3.9.99, but spec asked for 3.10.1 > deal.II 8.5 does not support cmake 3.10, so you need to force spack to use a lower version. Something like this should work: spack dealii ^cmake

[deal.II] Error in installing deal.ii with spack on linux openSuse

2018-02-19 Thread Alberto Salvadori
Dear community, I am trying to install deal.II on a Linux machine equipped with openSUSE (openSUSE 12.3 (x86_64), VERSION = 12.3, CODENAME = Dartmouth) via spack. I am getting the following error: spack install dealii ==> Error: An unsatisfiable version constraint has been detected for spe

Re: [deal.II] Automatic refinement with minimal level of refinement

2018-02-19 Thread Wolfgang Bangerth
On 02/19/2018 05:37 AM, luca.heltai wrote: After you have created your coarse mesh, you could GridTools::flatten_triangulation it. This will create a brand new coarse triangulation, containing only all your active cells. Alternatively, between the call to GridRefinement::refine_...() and Tri
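
The alternative is cut off above; a common pattern in the deal.II tutorials for enforcing a minimal refinement level is to adjust the refinement flags between GridRefinement::refine_and_coarsen_...() and Triangulation::execute_coarsening_and_refinement(). A sketch under that assumption, with a hypothetical minimal level of 1:

```cpp
#include <deal.II/grid/grid_refinement.h>
#include <deal.II/grid/tria.h>
#include <deal.II/lac/vector.h>

using namespace dealii;

template <int dim>
void refine_with_minimal_level(Triangulation<dim>  &triangulation,
                               const Vector<float> &estimated_error_per_cell,
                               const int            min_level = 1) // hypothetical
{
  GridRefinement::refine_and_coarsen_fixed_number(triangulation,
                                                  estimated_error_per_cell,
                                                  0.3, 0.03);

  // Veto coarsening of any cell sitting at (or below) the minimal level, so
  // the mesh never becomes coarser than that level.
  for (const auto &cell : triangulation.active_cell_iterators())
    if (cell->level() <= min_level)
      cell->clear_coarsen_flag();

  triangulation.execute_coarsening_and_refinement();
}
```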

Re: [deal.II] Automatic refinement with minimal level of refinement

2018-02-19 Thread Lucas Campos
Perfect! Thank you! On Monday, 19 February 2018 13:37:38 UTC+1, Luca Heltai wrote: > > After you have created your coarse mesh, you could > GridTools::flatten_triangulation it. This will create a brand new coarse > triangulation, containing only all your active cells. > > L. > > > On 19 Feb 20

Re: [deal.II] Automatic refinement with minimal level of refinement

2018-02-19 Thread luca.heltai
After you have created your coarse mesh, you could GridTools::flatten_triangulation it. This will create a brand new coarse triangulation, containing only all your active cells. L. > On 19 Feb 2018, at 13:35, Lucas Campos wrote: > > Dear all, > > I am creating a mesh using GridGenerator::hy
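
A minimal sketch of this suggestion, assuming the input is the once-refined shell from the question: every active cell of the refined mesh becomes a coarse cell of the new triangulation.

```cpp
#include <deal.II/grid/grid_tools.h>
#include <deal.II/grid/tria.h>

using namespace dealii;

template <int dim>
void flatten(const Triangulation<dim> &refined_shell,
             Triangulation<dim>       &flat_shell)
{
  // The new triangulation has no refinement hierarchy: each active cell of
  // refined_shell becomes a level-0 cell of flat_shell.
  GridTools::flatten_triangulation(refined_shell, flat_shell);
}
```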

[deal.II] Automatic refinement with minimal level of refinement

2018-02-19 Thread Lucas Campos
Dear all, I am creating a mesh using GridGenerator::hyper_shell. I want to divide this mesh into two distinct subdomains, radially. In order to do that, I need to refine the mesh at least once. Each of these cells carries an internal state that depends on its kind. The issue I ha
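
A sketch of the setup described here, with hypothetical radii and the material id used to mark which of the two radial layers a cell belongs to:

```cpp
#include <deal.II/base/point.h>
#include <deal.II/grid/grid_generator.h>
#include <deal.II/grid/tria.h>

using namespace dealii;

template <int dim>
void make_two_layer_shell(Triangulation<dim> &tria)
{
  const double inner_radius = 0.5, outer_radius = 1.0; // hypothetical radii
  GridGenerator::hyper_shell(tria, Point<dim>(), inner_radius, outer_radius);
  tria.refine_global(1); // at least one refinement, so each layer gets its own cells

  // Record the radial subdomain in the material id; the per-cell internal
  // state can then be keyed on it.
  const double mid_radius = 0.5 * (inner_radius + outer_radius);
  for (const auto &cell : tria.active_cell_iterators())
    cell->set_material_id(cell->center().norm() < mid_radius ? 0 : 1);
}
```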

[deal.II] Usage of MPI_Bcast with deal.II

2018-02-19 Thread 'Maxi Miller' via deal.II User Group
Hei, I was wondering what the correct way is to use MPI_Bcast in a deal.II program. Currently I am using it in the following way: print_status_update(pcout, std::string("Transmitting pulse data of size ") + std::to_string(module.data_size) + std::string("\n"), true); //MPI_Bcast(m
