After removing the deb package python-six and building a newer version of
six, FEniCS compiles again.
I still get run-time errors when running a few short examples; the output is
attached below. Any idea what is going on?
Also, is there a way to build or download a full test suite?
Thanks, Birgitte
--
Birgitte Maria E. Brydsö
HPC2N, MIT-Huset
Umeå University
SE-901 87 Umeå
Telephone: +46(0)90-786 64 55
Email: [email protected]
BVP_01:
Solve -u'' + u = x, 0 < x < 1
u(0) = 0, u(1) = 0
Exact solution is u(x) = x - sinh(x)/sinh(1)
Building mesh (dist 0a)
Number of global vertices: 9
Number of global cells: 8
Building mesh (dist 1a)
Calling FFC just-in-time (JIT) compiler, this may take some time.
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process. Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption. The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.
The process that invoked fork was:
Local host: t-cn1123.hpc2n.umu.se (PID 44779)
MPI_COMM_WORLD rank: 0
If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
Calling DOLFIN just-in-time (JIT) compiler, this may take some time.
Calling DOLFIN just-in-time (JIT) compiler, this may take some time.
Calling FFC just-in-time (JIT) compiler, this may take some time.
Calling FFC just-in-time (JIT) compiler, this may take some time.
Solving linear variational problem.
(the BVP_01 banner and "Solving linear variational problem." are printed once by each of the four MPI ranks)
Traceback (most recent call last):
  File "./bvp_01.py", line 119, in <module>
    bvp_01 ( 8 )
  File "./bvp_01.py", line 93, in bvp_01
    mesh_file << mesh
RuntimeError:
*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
***     [email protected]
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error:   Unable to write mesh to XML file in parallel.
*** Reason:  Parallel XML mesh output is not supported. Use HDF5 format instead.
*** Where:   This error was encountered inside XMLFile.cpp.
*** Process: unknown
***
*** DOLFIN version: 1.5.0
*** Git changeset:
*** -------------------------------------------------------------------------
(the same traceback and DOLFIN error appear once per MPI rank)
srun: error: t-cn1123: tasks 0-3: Exited with exit code 1
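If I read the first error right, bvp_01.py fails because it writes the mesh with DOLFIN's XML output ("mesh_file << mesh"), which is not supported on more than one MPI rank. I suspect the script would get further with the HDF5 output that the error message suggests, along these lines (the file and dataset names are just placeholders, and this assumes DOLFIN was built with HDF5 support):

    from dolfin import *

    mesh = UnitIntervalMesh(8)

    # File("mesh.xml") << mesh            # fails on more than one MPI rank
    hdf = HDF5File(mesh.mpi_comm(), "mesh.h5", "w")
    hdf.write(mesh, "/mesh")              # parallel-safe mesh output
    hdf.close()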
BVP_02:
Solve -u'' + u = x, 0 < x < 1
u(0) = 0, u(1) = 0
Exact solution is u(x) = x - sinh(x)/sinh(1)
Building mesh (dist 0a)
Number of global vertices: 9
Number of global cells: 8
Building mesh (dist 1a)
Calling DOLFIN just-in-time (JIT) compiler, this may take some time.
(Open MPI prints the same fork() warning as above, this time for PID 45494)
Solving linear variational problem.
(the BVP_02 banner and "Solving linear variational problem." are printed once by each of the four MPI ranks)
X        U(X)
0.000000 -0.000000
0.062500  0.009190
0.125000  0.018379
0.187500  0.026734
0.250000  0.035089
0.312500  0.041760
0.375000  0.048431
0.437500  0.052544
0.500000  0.056657
0.562500  0.057298
0.625000  0.057938
0.687500  0.054137
0.750000  0.050336
0.812500  0.041055
0.875000  0.031773
0.937500  0.015886
(each of the four ranks prints the rows for its own part of the mesh before failing)
Traceback (most recent call last):
  File "./bvp_02.py", line 193, in <module>
    bvp_02 ( 8 )
  File "./bvp_02.py", line 113, in bvp_02
    xr = x[e+1]
IndexError: index 3 is out of bounds for axis 0 with size 3
(the same traceback appears once per MPI rank)
srun: error: t-cn1123: tasks 0-3: Exited with exit code 1
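The IndexError from bvp_02.py looks like a related parallel issue: with four ranks each rank only owns part of the mesh, so mesh.coordinates() returns just the 3 local vertices rather than the 9 global ones, and a loop that indexes elements by their global number runs off the end of the local array. If that is the cause, iterating over the locally owned cells should avoid it; a rough sketch of what I mean (variable names are illustrative, not taken from the actual script):

    from dolfin import *

    mesh = UnitIntervalMesh(8)
    x = mesh.coordinates()        # holds only this rank's vertices under MPI

    for cell in cells(mesh):      # loop over local cells, not range(global_n)
        v = cell.entities(0)      # local vertex indices of this cell
        xl, xr = sorted((x[v[0]][0], x[v[1]][0]))   # element endpoints
        # ... per-element work with xl and xr ...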
bvp_03
BVP_03:
Solve -u'' = 2x/(1+x^2)^2, -8 < x < +8
u(0) = u0, u(1) = u1
Exact solution is u(x) = arctangent(x)
Calling DOLFIN just-in-time (JIT) compiler, this may take some time.
(Open MPI prints the same fork() warning as above, this time for PID 45835)
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
_______________________________________________
fenics-support mailing list
[email protected]
http://fenicsproject.org/mailman/listinfo/fenics-support