Hello Timo!
Thanks for the support.
The problem with compiling DuMuX on the computational server seems to
be that it is running an old version of Ubuntu (and an old version of
gcc).
When I try the following command on the computational server:
$ ./dune-common/bin/dunecontrol --opts=dumux/debug.opts all
[...]
CMake Error at cmake/modules/CheckCXXFeatures.cmake:117 (message):
dune-common requires compiler support for C++14, but your compiler only supports C++11.
It aborts because the compiler version is too old. I had the server
admin install clang-3.9 and libc++-dev on the server and then tried
again using clang instead of gcc, passing these flags to dunecontrol:
CMAKE_FLAGS="\
-DCMAKE_CXX_COMPILER=clang++-3.9 \
-DCMAKE_C_COMPILER=clang-3.9 \
-DCMAKE_CXX_FLAGS=\"$GXX_WARNING_OPTS $GXX_OPTS -stdlib=libc++\" \
"
However, the C++14 test in dunecontrol still fails, even though the
clang compiler itself supports C++14.
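As a sanity check outside of dunecontrol, one can test the compiler directly against a C++14-only construct. The file name below and the use of `c++` as a stand-in for `clang++-3.9` are my own choices:

```shell
# Write a tiny program that needs C++14 (generic lambdas) and try to
# build it. Substitute clang++-3.9 for c++ on the server.
cat > /tmp/cxx14_check.cpp <<'EOF'
int main() {
    auto identity = [](auto x) { return x; };  // generic lambda: C++14 only
    return identity(0);
}
EOF
if c++ -std=c++14 /tmp/cxx14_check.cpp -o /tmp/cxx14_check 2>/dev/null; then
    echo "C++14: OK"
else
    echo "C++14: not supported"
fi
```

If this succeeds but the CMake check still fails, the problem is likely in how dunecontrol picks up the compiler rather than in the compiler itself.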
So instead I have now tried to compile the binary on my laptop and then
copy the binary and some shared libraries onto the server. This worked
well for the structured grid (YaspGrid), but for UGGrid I got the
following error message when I tried to run the executable:
It looks like opal_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during opal_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):
opal_shmem_base_select failed
--> Returned value -1 instead of OPAL_SUCCESS
[...]
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
*** and potentially your MPI job)
[cipr-cmg-sim:67060] Local abort before MPI_INIT completed completed
successfully, but am not able to aggregate error messages, and not able
to guarantee that all other processes were killed!
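For what it is worth, one can list which dynamic libraries the copied executable expects on the server with ldd; `dumux_exec` below is just a placeholder for the actual executable name:

```shell
# Any libmpi/libopen-pal lines in the output mean the binary was linked
# against Open MPI on the laptop, which would explain the opal_init
# failure on the server. "dumux_exec" is a placeholder name.
ldd ./dumux_exec 2>&1 | grep -i -E 'mpi|open-pal|open-rte' \
    || echo "no Open MPI libraries found (or ldd failed)"
```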
So this is the reason why I am now trying to compile UGGrid without MPI
support.
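What I plan to try is an opts file that suppresses CMake's MPI detection entirely. `-DCMAKE_DISABLE_FIND_PACKAGE_MPI=TRUE` is the generic CMake switch for skipping a `find_package()` call; whether the dune modules honour it cleanly is an assumption on my part:

```shell
# nompi.opts -- hypothetical opts file for an MPI-free build
CMAKE_FLAGS="\
-DCMAKE_CXX_COMPILER=clang++-3.9 \
-DCMAKE_C_COMPILER=clang-3.9 \
-DCMAKE_DISABLE_FIND_PACKAGE_MPI=TRUE \
"
# then: ./dune-common/bin/dunecontrol --opts=nompi.opts all
```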
Best regards,
Håkon Hægland
_______________________________________________
Dumux mailing list
[email protected]
https://listserv.uni-stuttgart.de/mailman/listinfo/dumux