Re: [deal.II] dealii installation issue on NERSC Cori

2020-10-23 Thread Aaditya Lakshmanan
Hi Wolfgang,
   I tried compiling, installing and running the deal.II tests 
interactively on a KNL compute node. Both compilation and installation were 
seemingly successful, but all the tests failed. I have attached the output 
from make test in the file *test_output.txt*. An example error looks like 
the following:
2/11 Test  #1: step.debug ...***Failed  265.99 sec
gmake[6]: *** write jobserver: Invalid argument.  Stop.
gmake[6]: *** Waiting for unfinished jobs
gmake[6]: *** write jobserver: Invalid argument.  Stop.
gmake[5]: *** [CMakeFiles/Makefile2:10829: 
tests/quick_tests/CMakeFiles/step.debug.run.dir/rule] Error 2
gmake[4]: *** [Makefile:3700: step.debug.run] Error 2
Test step.debug: BUILD
===   OUTPUT BEGIN   ===
step.debug: BUILD failed. Output:
[  0%] Built target obj_sundials_inst
[ 21%] Built target obj_physics_elasticity_inst
[ 30%] Built target obj_dofs_inst
[ 25%] Built target obj_matrix_free_inst
[ 23%] Built target obj_amd_global_debug
[ 34%] Built target obj_umfpack_DI_TRIPLET_MAP_NOX_debug
[ 30%] Built target obj_differentiation_ad_inst
[ 34%] Built target obj_multigrid_inst
[ 59%] Built target obj_physics_inst
[ 59%] Built target obj_non_matching_inst
[ 59%] Built target obj_gmsh_inst
[ 34%] Built target obj_distributed_inst
[ 34%] Built target kill-step.debug-OK
[ 34%] Built target obj_muparser_debug
[ 34%] Built target obj_umfpack_DL_SOLVE_debug
[ 59%] Built target obj_numerics_inst
[ 59%] Built target obj_boost_system_debug
[ 59%] Built target obj_umfpack_DL_STORE_debug
[ 59%] Built target obj_umfpack_DL_TRIPLET_MAP_NOX_debug
[ 59%] Built target obj_umfpack_DL_TRIPLET_MAP_X_debug
[ 59%] Built target obj_umfpack_DI_TSOLVE_debug
[ 78%] Built target obj_base_inst
[ 78%] Built target obj_grid_inst
[ 59%] Built target obj_umfpack_DI_SOLVE_debug
[ 34%] Built target obj_amd_int_debug
[ 59%] Built target obj_algorithms_inst
[ 59%] Built target obj_meshworker_inst
[ 59%] Built target obj_hp_inst
[ 78%] Built target obj_particle_inst
[ 59%] Built target obj_differentiation_sd_inst
[ 59%] Built target obj_opencascade_inst
[ 78%] Built target obj_lac_inst
[ 59%] Built target obj_amd_long_debug
[ 59%] Built target obj_umfpack_DL_TSOLVE_debug
[ 59%] Built target obj_umfpack_DI_STORE_debug
[ 59%] Built target obj_umfpack_DL_TRIPLET_NOMAP_X_debug
[ 59%] Built target obj_umfpack_DI_TRIPLET_MAP_X_debug
[ 59%] Built target obj_umfpack_DI_TRIPLET_NOMAP_X_debug
[ 59%] Built target obj_umfpack_GENERIC_debug
[ 78%] Built target obj_umfpack_DL_ASSEMBLE_debug
[ 59%] Built target obj_umfpack_DI_ASSEMBLE_debug
[ 84%] Built target obj_fe_inst


step.debug: **BUILD failed***

===   OUTPUT END   ===
Expected stage PASSED - aborting
CMake Error at /global/project/projectdirs/m2360/packagesCPFE/dealii/cmake/scripts/run_test.cmake:140 (MESSAGE):
  *** abort


Best,
Aaditya

On Friday, October 23, 2020 at 2:00:00 AM UTC-4 Aaditya Lakshmanan wrote:

> Hi Wolfgang,
>    Thanks for your suggestion. I just checked that the Cori login nodes
> are Haswell processors, while the compute nodes on which I eventually wish
> to run simulations are KNL processors (certain modules were also loaded
> with that intention). I will use idev to access the compute nodes
> interactively and try the build, installation and testing phases again.
>
> Best,
> Aaditya
>
> On Thursday, October 22, 2020 at 11:29:38 PM UTC-4 Wolfgang Bangerth wrote:
>
>>
>> Aaditya, 
>> is NERSC Cori a machine where the front-end node runs a different
>> processor/system than the compute nodes? If so, you're building a library
>> for the compute nodes, and the tests are also built for the compute nodes.
>> But you're trying to run the test executables on the front-end node --
>> which might help explain the error you see.
>>
>> Best 
>> W. 
>>
>>
>> On 10/22/20 7:52 PM, Aaditya Lakshmanan wrote:
>> > Hi Everyone,
>> >    I have been trying to install deal.ii on the NERSC Cori System and
>> > after a seemingly successful cmake and make install, make test yields
>> > failure for all the tests set up. I briefly detail the procedure I
>> > followed and the outputs and errors obtained.
>> >
>> > As a pre-requisite I installed p4est using the setup script (with some
>> > modifications) and petsc manually with the configure.py script and
>> > appropriate settings. The following modules were loaded:
>> >
>> > Currently Loaded Modulefiles:
>> >   1) modules/3.2.11.4   5) craype-network-aries   9) PrgEnv-intel/6.0.5    13) valgrind/3.15.0
>> >   2) altd/2.0           6) craype/2.6.2          10) craype-mic-knl        14) zlib/1.2.11
>> >   3) darshan/3.1.7      7) pmi/5.0.14            11) cray-mpich/7.7.14     15) cmake/3.14.4
>> >   4) intel/19.0.3.199   8) atp/2.1.3             12) craype-hugepages2M    16) cray-hdf5-parallel/1.10.5.2

Re: [deal.II] Re: Pressure-correction scheme: Behaviour of the H1 norm on the Taylor-Green vortex

2020-10-23 Thread Timo Heister
> On another note, I remember having a discussion about this with Timo Heister 
> at the deal.II workshop in 2019. Maybe Timo has ideas on this? I know he is 
> quite the expert on algorithms to solve the Stokes / Navier-Stokes equations 
> (e.g. his paper on the grad-div scheme, etc.)

I am happy to give further comments, but -- like Wolfgang -- I don't
quite understand what the precise question is. That said:

1. With projection schemes you will need to be careful about pressure
boundary layers. A good starting point might be the Elman, Silvester,
Wathen book.
2. Specific numerical test setups can be more or less sensitive to
this fact (size of pressure error vs velocity error, smoothness of
solutions in time, specific behavior on the boundary, ...)


-- 
Timo Heister
http://www.math.clemson.edu/~heister/

-- 
The deal.II project is located at http://www.dealii.org/
For mailing list/forum options, see 
https://groups.google.com/d/forum/dealii?hl=en
--- 
You received this message because you are subscribed to the Google Groups 
"deal.II User Group" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to dealii+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/dealii/CAMRj59H4J85VXiCWKYyj%3Df%3D%3DQFRH3T%3DCNDk%2Bw3XCjQT0ACj%3DuA%40mail.gmail.com.


Re: [deal.II] Periodic BC in parallel distributed homogenization problem

2020-10-23 Thread 'maurice rohracker' via deal.II User Group
Dear Daniel,

Thanks for your reply. 

We have some offsets at the periodic boundary and therefore set the 
periodic BC ourselves.

Adding the periodicity with 
`triangulation.add_periodicity(periodicFacePairVector)` before distributing 
the dofs as in step-45 helped.

We use DoFTools::make_periodicity_constraints() later on and merge the 
new constraints into the existing ones.
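The order of operations described above can be sketched as follows. This is an untested, pseudocode-style sketch against the deal.II API (boundary ids, the direction argument, the finite element, and all variable names are illustrative assumptions; exact signatures vary between deal.II versions -- compare step-45):

```cpp
// Sketch only: add periodicity to the *triangulation* before distributing DoFs.
std::vector<dealii::GridTools::PeriodicFacePair<
    typename dealii::parallel::distributed::Triangulation<dim>::cell_iterator>>
  periodicFacePairVector;
dealii::GridTools::collect_periodic_faces(triangulation,
                                          /*b_id1=*/0, /*b_id2=*/1,
                                          /*direction=*/0,
                                          periodicFacePairVector);
triangulation.add_periodicity(periodicFacePairVector);

// Only now distribute DoFs.
doFHandler.distribute_dofs(fe);

// Build periodicity constraints on the DoFHandler and merge them into the
// existing constraints object.
std::vector<dealii::GridTools::PeriodicFacePair<
    typename dealii::DoFHandler<dim>::cell_iterator>>
  periodicDoFPairs;
dealii::GridTools::collect_periodic_faces(doFHandler, 0, 1, 0, periodicDoFPairs);

dealii::AffineConstraints<double> periodicityConstraints;
dealii::DoFTools::make_periodicity_constraints(periodicDoFPairs,
                                               periodicityConstraints);
constraints.merge(periodicityConstraints);
```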

Best,
Maurice

d.arnd...@gmail.com wrote on Wednesday, October 21, 2020 at 18:04:57 UTC+2:

> Maurice,
>
> Is there any reason that you don't use 
> DoFTools::make_periodicity_constraints() as shown in 
> https://www.dealii.org/current/doxygen/deal.II/step_45.html?
>
> Best,
> Daniel
>
> On Wed., Oct. 21, 2020 at 11:16, 'maurice rohracker' via deal.II
> User Group wrote:
>
>> Dear all,
>>
>>
>> For a project, I try to implement a parallel distributed homogenization 
>> problem with periodic BC. A serial version is already implemented.
>>
>> The feature of the periodic BC in the serial case is that it is not 
>> purely periodic, but periodic with some offset.
>>
>>
>> Therefore the periodic boundary pairs are collected
>> (`dealii::GridTools::collect_periodic_faces(doFHandler, ...)`) and after
>> that stored together with the DOF indices to handle the periodic BC by
>> oneself afterwards.
>>
>>
>> For the parallel implementation, I go through the following steps:
>>
>>1. gather periodic face pairs for all directions using
>>   `dealii::GridTools::collect_periodic_faces(triangulation, ...,
>>   periodic_vector)`
>>2. add periodicity by `triangulation.add_periodicity(periodic_vector)`
>>3. collect periodic boundary pairs using
>>   `dealii::GridTools::collect_periodic_faces(doFHandler, ...,
>>   periodic_vector2)`
>>4. store the pairs with the DOFs by looping over `periodic_vector2`
>>
>>
>>- to access the DOFs I tried the following:
>>
>> for (const auto &facePair : periodic_vector2)
>> {
>>   (...)
>>
>>   // loop over the DOFs of the boundary periodic pair cells
>>   if (facePair.cell[0]->is_locally_owned() && facePair.cell[1]->is_locally_owned())
>>   {
>>     facePair.cell[0]->get_dof_indices(localDOFIndicesPlus);
>>     facePair.cell[1]->get_dof_indices(localDOFIndicesMinus);
>>   }
>>   else if (facePair.cell[0]->is_locally_owned())
>>   {
>>     facePair.cell[0]->get_dof_indices(localDOFIndicesPlus);
>>     (facePair.cell[0]->periodic_neighbor(facePair.face_idx[0]))->get_dof_indices(localDOFIndicesMinus);
>>   }
>>   else if (facePair.cell[1]->is_locally_owned())
>>   {
>>     (facePair.cell[1]->periodic_neighbor(facePair.face_idx[1]))->get_dof_indices(localDOFIndicesPlus);
>>     facePair.cell[1]->get_dof_indices(localDOFIndicesMinus);
>>   }
>>
>>   (...) // store plus and minus point of the boundary together with the DOF
>>         // if the periodicity of the plus and minus points is fulfilled
>> }
>>
>>
>> Unfortunately, the resulting localDOFIndicesPlus/Minus of type
>> `std::vector<dealii::types::global_dof_index> localDOFIndicesPlus(nDOFsPerCell);`
>> only contain garbage values (very high numbers).
>>
>>
>> My first question would be: is it possible to access the DOF indices of
>> ghost cells, as I did?
>>
>> Is our procedure enough to ensure that the periodic boundary cells of the
>> triangulation owned by one processor are ghost cells in the part of the
>> triangulation owned by another processor?
>>
>>
>> Thanks in advance for your help. If anything in my explanation is
>> unclear, feel free to ask.
>>
>>
>> Best, Maurice
>>
>



Re: [deal.II] dealii installation issue on NERSC Cori

2020-10-23 Thread Aaditya Lakshmanan
Hi Wolfgang,
   Thanks for your suggestion. I just checked that the Cori login nodes are 
Haswell processors, while the compute nodes on which I eventually wish to 
run simulations are KNL processors (certain modules were also loaded with 
that intention). I will use idev to access the compute nodes interactively 
and try the build, installation and testing phases again.

Best,
Aaditya

On Thursday, October 22, 2020 at 11:29:38 PM UTC-4 Wolfgang Bangerth wrote:

>
> Aaditya,
> is NERSC Cori a machine where the front-end node runs a different
> processor/system than the compute nodes? If so, you're building a library
> for the compute nodes, and the tests are also built for the compute nodes.
> But you're trying to run the test executables on the front-end node --
> which might help explain the error you see.
>
> Best
> W.
>
>
> On 10/22/20 7:52 PM, Aaditya Lakshmanan wrote:
> > Hi Everyone,
> >    I have been trying to install deal.ii on the NERSC Cori System and
> > after a seemingly successful cmake and make install, make test yields
> > failure for all the tests set up. I briefly detail the procedure I
> > followed and the outputs and errors obtained.
> > 
> > As a pre-requisite I installed p4est using the setup script (with some
> > modifications) and petsc manually with the configure.py script and
> > appropriate settings. The following modules were loaded:
> > 
> > Currently Loaded Modulefiles:
> >   1) modules/3.2.11.4   5) craype-network-aries   9) PrgEnv-intel/6.0.5    13) valgrind/3.15.0
> >   2) altd/2.0           6) craype/2.6.2          10) craype-mic-knl        14) zlib/1.2.11
> >   3) darshan/3.1.7      7) pmi/5.0.14            11) cray-mpich/7.7.14     15) cmake/3.14.4
> >   4) intel/19.0.3.199   8) atp/2.1.3             12) craype-hugepages2M    16) cray-hdf5-parallel/1.10.5.2
> > 
> > 
> > After setting the environment variables P4EST_DIR, PETSC_DIR and
> > PETSC_ARCH, I compiled dealii-9.1.1 as follows:
> > 
> > cmake -DCMAKE_SYSTEM_NAME=CrayLinuxEnvironment -DCMAKE_C_COMPILER=cc \
> >   -DCMAKE_CXX_COMPILER=CC -DCMAKE_Fortran_COMPILER=ftn \
> >   -DDEAL_II_WITH_MPI=ON -DDEAL_II_WITH_PETSC=ON -DDEAL_II_WITH_P4EST=ON \
> >   -DDEAL_II_WITH_LAPACK=ON \
> >   -DCMAKE_INSTALL_PREFIX=$CPFE_PACKAGES/dealii_install ../dealii
> > 
> > which ran successfully. I have attached the file *detailed.log*. Then
> > installing via
> > 
> > make -j8 install
> > 
> > ran without issues. I have attached the output from the above command in
> > the file *make_output.txt*. Then running the tests as
> > 
> > make -j4 test
> > 
> > reports failure for all the tests. I have attached the entire output
> > from the above command in the file test_output.txt and an example error
> > below:
> > 
> > 2/11 Test  #2: step.release .***Failed0.69 sec
> > gmake[6]: *** read jobs pipe: Bad file descriptor.  Stop.
> > gmake[6]: *** Waiting for unfinished jobs
> > gmake[5]: *** [CMakeFiles/Makefile2:10861: 
> > tests/quick_tests/CMakeFiles/step.release.run.dir/rule] Error 2
> > gmake[4]: *** [Makefile:3713: step.release.run] Error 2
> > Test step.release: BUILD
> > ===   OUTPUT BEGIN   ===
> > step.release: BUILD failed. Output:
> > [  1%] Built target obj_boost_iostreams_release
> > 
> > 
> > step.release: **BUILD failed***
> > 
> > ===   OUTPUT END   ===
> > Expected stage PASSED - aborting
> > CMake Error at
> > /global/project/projectdirs/m2360/packagesCPFE/dealii/cmake/scripts/run_test.cmake:140 (MESSAGE):
> >   *** abort
> > 
> > I am unable to understand what the issue might be. Any insight on this
> > will be appreciated. Thank you.
> > 
> > Best,
> > Aaditya
> > 