[deal.II] Re: Hanging node constraints and periodic constraints together causing an issue

2018-01-23 Thread Sambit Das
Hello Dr. Arndt, The above fix resolved the issue in the minimal example. Thanks a lot for providing the fix. Best, Sambit On Tuesday, January 23, 2018 at 6:16:02 AM UTC-6, Daniel Arndt wrote: > > Sambit, > > Please check whether https://github.com/dealii/dealii/pull/5779 fixes the issue > for you.

Re: [deal.II] Re: Parallel DoFRenumbering does not work on mac

2018-01-23 Thread Jie Cheng
Hi Wolfgang, Here is the program. I did not change anything except adding a DoFRenumbering::Cuthill_McKee(dof_handler) call at line 251. I tried to debug with lldb but did not gain any useful information. I will check the video again to make sure I was doing it right. Thank you very much! Jie On Tue, Jan 23, 2018 at

Re: [deal.II] Re: Parallel DoFRenumbering does not work on mac

2018-01-23 Thread Wolfgang Bangerth
On 01/23/2018 01:16 PM, Jie Cheng wrote: These are just warnings -- what happens if you run the executable? If I do not modify step-40.cc, it runs fine both in serial and parallel. After I added DoFRenumbering in it, it crashes in parallel. Does DoFRenumbering have any dependency? Not
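The thread above concerns deal.II's DoFRenumbering::Cuthill_McKee. For readers unfamiliar with what that renumbering does, the same algorithm (in its reversed form) is available outside deal.II, for example in SciPy, where its bandwidth-reducing effect is easy to see. This is an illustrative sketch only, not deal.II code:

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

def bandwidth(a):
    """Maximum |i - j| over the nonzero entries of a dense matrix."""
    i, j = np.nonzero(a)
    return int(np.max(np.abs(i - j)))

# A small symmetric sparsity pattern whose off-diagonal nonzeros
# sit far from the diagonal.
n = 8
a = np.eye(n)
for i, j in [(0, 7), (1, 6), (2, 5)]:
    a[i, j] = a[j, i] = 1.0

# Reverse Cuthill-McKee returns a permutation of the row/column indices.
perm = reverse_cuthill_mckee(csr_matrix(a), symmetric_mode=True)
b = a[np.ix_(perm, perm)]

print("bandwidth before:", bandwidth(a))  # 7
print("bandwidth after: ", bandwidth(b))
```

Applying the permutation to both rows and columns clusters the nonzeros near the diagonal, which is exactly why such renumberings help banded and incomplete-factorization solvers.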

Re: [deal.II] Re: installation error

2018-01-23 Thread Wolfgang Bangerth
On 01/23/2018 02:13 PM, Bruno Turcksin wrote: mypath/dealii/source/lac/scalapack.cc:243:91: error: there are no arguments to ‘MPI_Comm_create_group’ that depend on a template parameter, so a declaration of ‘MPI_Comm_create_group’ must be available [-fpermissive] ierr =

Re: [deal.II] Re: Run function on one node, make result accessible for all nodes

2018-01-23 Thread Wolfgang Bangerth
On 01/23/2018 01:59 PM, 'Maxi Miller' via deal.II User Group wrote: Assuming I want to use either MPI_Bcast or a function from Boost::MPI, what do I have to do to initialize them, and what is already done by deal.II? If you initialize the MPI system as we do, for example, in the main() function

[deal.II] Re: installation error

2018-01-23 Thread Bruno Turcksin
Juan Carlos On Tuesday, January 23, 2018 at 3:12:20 PM UTC-5, Juan Carlos Araujo Cabarcas wrote: > > [ 50%] Building CXX object > source/fe/CMakeFiles/obj_fe_debug.dir/fe_poly.cc.o > mypath/dealii/source/lac/scalapack.cc: In member function ‘void >

Re: [deal.II] Re: Run function on one node, make result accessible for all nodes

2018-01-23 Thread 'Maxi Miller' via deal.II User Group
Assuming I want to use either MPI_Bcast or a function from Boost::MPI, what do I have to do to initialize them, and what is already done by deal.II? Thanks! On Tuesday, January 23, 2018 at 19:18:27 UTC+1, Wolfgang Bangerth wrote: > > On 01/23/2018 06:40 AM, 'Maxi Miller' via deal.II User Group

Re: [deal.II] Re: Parallel DoFRenumbering does not work on mac

2018-01-23 Thread Jie Cheng
Hi Wolfgang > These are just warnings -- what happens if you run the executable? > If I do not modify step-40.cc, it runs fine both in serial and parallel. After I added DoFRenumbering in it, it crashes in parallel. Does DoFRenumbering have any dependency? As I posted in previous messages,

[deal.II] installation error

2018-01-23 Thread Juan Carlos Araujo Cabarcas
Dear all, I am trying to install deal.II from the Git repository with the following features: petsc_ver='3.6.0'; trilinos_ver='12.4.2'; git clone https://github.com/dealii/dealii.git dealii cmake \ -DTRILINOS_DIR=${install_dir}/trilinos-${trilinos_ver}
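The configure command in the message is cut off. For context, a typical deal.II configure against PETSc and Trilinos looks roughly like the sketch below (paths under `${install_dir}` follow the poster's convention and are placeholders; the `DEAL_II_WITH_*` flags are standard deal.II CMake options, but verify them against the installation instructions for your version):

```shell
# Sketch of a deal.II configure with PETSc and Trilinos support.
# ${install_dir} is assumed to point at the directory holding the
# PETSc and Trilinos installations, as in the original message.
mkdir -p build && cd build
cmake \
  -DCMAKE_INSTALL_PREFIX=${install_dir}/dealii \
  -DDEAL_II_WITH_MPI=ON \
  -DDEAL_II_WITH_PETSC=ON  -DPETSC_DIR=${install_dir}/petsc-3.6.0 \
  -DDEAL_II_WITH_TRILINOS=ON -DTRILINOS_DIR=${install_dir}/trilinos-12.4.2 \
  ../dealii
make -j4 && make install
```

The `cmake` summary printed at the end of configuration lists which features were actually found; checking it is the quickest way to confirm PETSc and Trilinos were picked up.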

Re: [deal.II] Re: Usage of the laplace-matrix in example 23

2018-01-23 Thread Wolfgang Bangerth
On 01/23/2018 10:35 AM, Dulcimer0909 wrote: If I do go ahead and replace code, so that it does a cell by cell assembly, I am a bit lost on how I would store the old_solution (U^(n-1)) for each cell and retrieve it during the assembly for the Rhs. Dulcimer -- can you elaborate? It's not

Re: [deal.II] Deal.ii installation problem:- Step 40 runtime error

2018-01-23 Thread RAJAT ARORA
Hello Professor, Thanks for the reply. I have been struggling with this issue for 5 days now. I raised a ticket on the XSEDE forum on 18th Jan. The technical team said everything was fine on their side and advised me to reinstall with some different modules loaded. They were also

Re: [deal.II] Re: Parallel DoFRenumbering does not work on mac

2018-01-23 Thread Wolfgang Bangerth
On 01/22/2018 09:17 PM, Jie Cheng wrote: I've reinstalled MPICH, and did a clean build of p4est, petsc and dealii, this problem still exists. At the linking stage of building dealii, I got warnings: [526/579] Linking CXX shared library lib/libdeal_II.9.0.0-pre.dylib ld: warning: could not

Re: [deal.II] Re: Run function on one node, make result accessible for all nodes

2018-01-23 Thread Wolfgang Bangerth
On 01/23/2018 06:40 AM, 'Maxi Miller' via deal.II User Group wrote: So, a solution would be calling MPI_Bcast() after every call in the if()-loop in the run()-function? Thanks! Yes. After each if-statement, process 0 has to broadcast the information it has computed to all of the processors
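The pattern Wolfgang describes — rank 0 computes something inside an if-statement, then broadcasts it — can be sketched in a few lines of plain MPI, independent of deal.II. The value 42.0 below is a stand-in for the real computation; running this requires an MPI environment (e.g. `mpicxx` and `mpirun`):

```cpp
#include <mpi.h>
#include <cstdio>

int main(int argc, char **argv)
{
  MPI_Init(&argc, &argv);
  int rank;
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);

  double result = 0.0;
  if (rank == 0)
    result = 42.0; // stand-in for the computation done only on rank 0

  // MPI_Bcast is collective: every rank must call it. Rank 0 (the
  // root, fourth argument) sends; all other ranks receive into `result`.
  MPI_Bcast(&result, 1, MPI_DOUBLE, 0, MPI_COMM_WORLD);

  std::printf("rank %d has result %g\n", rank, result);
  MPI_Finalize();
  return 0;
}
```

The common bug is calling MPI_Bcast only on rank 0: because the call is collective, every process in the communicator has to reach it, or the program deadlocks.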

[deal.II] Re: Usage of the laplace-matrix in example 23

2018-01-23 Thread Dulcimer0909
Hello all, An additional question regarding this thread: If I go ahead and replace the code so that it does a cell-by-cell assembly, I am a bit lost on how I would store the old solution (U^(n-1)) for each cell and retrieve it during the assembly of the Rhs. I would be grateful if anyone could help.

[deal.II] Using compute_nonzero_normal_flux_constraints and local refinement

2018-01-23 Thread markus . loewenstein1990
Hello, my name is Markus. Last week I started my first project with deal.II. I used step-8 (linear elasticity) as a starting point, built my own grid_geometry, and it worked fine. Then I added new boundary conditions: hanging_node_constraints.condense (system_matrix);
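The message is truncated, but a common pitfall when combining hanging-node constraints from local refinement with boundary constraints such as normal-flux conditions in deal.II is keeping them in separate objects and condensing twice. A minimal sketch of the usual approach — collecting everything in one constraints object before closing it — is below; `dof_handler` is assumed to exist, and the exact signature of `VectorTools::compute_nonzero_normal_flux_constraints` should be checked against your deal.II version:

```cpp
// Sketch only: gather all constraints in a single ConstraintMatrix,
// then close() it once, instead of condensing with separate objects.
ConstraintMatrix constraints;
constraints.clear();

// Hanging-node constraints from local (adaptive) refinement.
DoFTools::make_hanging_node_constraints(dof_handler, constraints);

// Add the normal-flux boundary constraints into the *same* object here,
// e.g. via VectorTools::compute_nonzero_normal_flux_constraints(...).

constraints.close();
```

With all constraints in one closed object, assembly can use `constraints.distribute_local_to_global(...)`, which avoids the separate `condense()` calls on the assembled matrix and right-hand side.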

[deal.II] Re: Run function on one node, make result accessible for all nodes

2018-01-23 Thread 'Maxi Miller' via deal.II User Group
So, a solution would be calling MPI_Bcast() after every call in the if()-loop in the run()-function? Thanks! On Tuesday, January 23, 2018 at 14:31:58 UTC+1, Bruno Turcksin wrote: > > Hi, > > On Tuesday, January 23, 2018 at 7:53:16 AM UTC-5, Maxi Miller wrote: > >> But now it looks as if only

[deal.II] Re: Run function on one node, make result accessible for all nodes

2018-01-23 Thread Bruno Turcksin
Hi, On Tuesday, January 23, 2018 at 7:53:16 AM UTC-5, Maxi Miller wrote: > But now it looks as if only the first node gets the result of the > calculations, but the others do not, instead defaulting to the default > values of the calculation function when not initialized. Is there a way I

[deal.II] Re: Hanging node constraints and periodic constraints together causing an issue

2018-01-23 Thread Daniel Arndt
Sambit, Please check whether https://github.com/dealii/dealii/pull/5779 fixes the issue for you. Best, Daniel On Tuesday, January 16, 2018 at 22:06:55 UTC+1, Sambit Das wrote: > > Thank you, Dr. Arndt. > > Best, > Sambit > > On Tuesday, January 16, 2018 at 11:16:08 AM UTC-6, Daniel Arndt wrote: >> >>

[deal.II] Compile trilinos with Intel MKL

2018-01-23 Thread Mark Ma
Hello everyone, For this discussion, I just want to post a working setup on some clusters that uses Intel MKL instead of the reference BLAS and LAPACK. Here is the code, "thrilino_setup.sh": mkdir build cd build cmake\
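The script in the message is cut off after `cmake`. For context, pointing Trilinos at MKL is normally done through its TPL options rather than by replacing BLAS/LAPACK system-wide; a hedged sketch is below (`MKL_ROOT` and the library-name triple are placeholders that depend on your MKL version and threading layout — consult the Trilinos build reference and the MKL link-line advisor):

```shell
# Sketch of a Trilinos configure against Intel MKL.
# MKL_ROOT is assumed to point at the MKL installation root.
mkdir -p build && cd build
cmake \
  -DCMAKE_INSTALL_PREFIX=${install_dir}/trilinos \
  -DTPL_ENABLE_BLAS=ON \
  -DTPL_ENABLE_LAPACK=ON \
  -DBLAS_LIBRARY_DIRS="${MKL_ROOT}/lib/intel64" \
  -DBLAS_LIBRARY_NAMES="mkl_intel_lp64;mkl_sequential;mkl_core" \
  -DLAPACK_LIBRARY_DIRS="${MKL_ROOT}/lib/intel64" \
  -DLAPACK_LIBRARY_NAMES="mkl_intel_lp64;mkl_sequential;mkl_core" \
  ..
```

MKL provides both the BLAS and LAPACK interfaces in the same libraries, which is why the same name list appears for both TPLs.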