Your message dated Sun, 27 Dec 2020 21:48:47 +0100
with message-id <20201227204847.ga24...@xanadu.blop.info>
and subject line bug fixed in openmpi 4.1.0-2
has caused the Debian Bug report #978260,
regarding xmds2: FTBFS: AssertionError: False is not true : Failed to execute compiled simulation correctly. Got returnCode 1;
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case it is now your responsibility to reopen the
Bug report if necessary, and/or fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact ow...@bugs.debian.org
immediately.)


-- 
978260: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=978260
Debian Bug Tracking System
Contact ow...@bugs.debian.org with problems
--- Begin Message ---
Source: xmds2
Version: 3.0.0+dfsg-4
Severity: serious
Justification: FTBFS on amd64
Tags: bullseye sid ftbfs
Usertags: ftbfs-20201226 ftbfs-bullseye

Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.

Relevant part (hopefully):
> make[1]: Entering directory '/<<PKGBUILDDIR>>'
> debian/tests/run-tests -b
> xmds2 version 3.0.0 "Release the Kraken" (Debian package 3.0.0+dfsg-4)
> Copyright 2000-2019 Graham Dennis, Joseph Hope, Mattias Johnsson
>                     and the xmds team
> 
> Configuring for single-process scripts...
> Checking for 'g++' (C++ compiler)        : /usr/bin/g++ 
> Checking whether the compiler works      : yes 
> Checking that we have a C++ compiler     : yes 
> Checking whether we are cross-compiling  : no 
> Checking whether we can link to only static libraries : yes 
> Trying to make compiler optimise for this machine     : yes 
> Trying to make compiler tune for this machine         : yes 
> Checking for compiler flags -O3                       : yes 
> Checking for compiler flags -ffast-math               : yes 
> Checking for compiler flags -funroll-all-loops        : yes 
> Checking for compiler flags -fomit-frame-pointer      : yes 
> Checking for compiler flags -falign-loops             : yes 
> Checking for compiler flags -fstrict-aliasing         : yes 
> Checking for compiler flags -momit-leaf-frame-pointer : yes 
> Checking for cautious math flags                      : yes 
> Checking for Autovectorisation                        : yes 
> Checking for OpenMP                                   : yes 
> Checking for pthreads                                 : yes 
> Checking for compiler debug flags                     : yes 
> Checking for srandomdev                               : yes 
> Checking for /dev/urandom                             : yes 
> Checking for program 'h5cc'                           : /usr/bin/h5cc 
> Checking for HDF5 (static library)                    : yes 
> Checking for HDF5 High-level library (static library) : yes 
> Checking for header hdf5.h                            : yes 
> Checking for libxmds (static library)                 : no (will try dynamic library instead)
> Checking for libxmds (dynamic library)                : no (it's optional anyway)
> Checking for Intel's Vector Math Library (static library) : no (will try dynamic library instead)
> Checking for Intel's Vector Math Library (dynamic library) : no (it's optional anyway)
> Checking safer dSFMT compile flags                         : yes
> Checking for Intel's Math Kernel Library (static library)  : no (will try dynamic library instead)
> Checking for Intel's Math Kernel Library (dynamic library) : no (it's optional anyway)
> Checking for ATLAS's CBLAS. (static library)               : no (will try dynamic library instead)
> Checking for ATLAS's CBLAS. (dynamic library)              : yes
> Checking for GSL (static library)                          : yes
> Checking for FFTW3 (static library)                        : yes
> Checking for single-precision FFTW3 (static library)       : yes
> Checking for threading support in FFTW3 (static library)   : yes
> Checking for OpenMP support in FFTW3 (static library)      : yes
> Checking for threading support in single-precision FFTW3 (static library) : yes
> Checking for OpenMP support in single-precision FFTW3 (static library)    : yes
> 
> Configuring for MPI scripts...
> Checking for program 'mpic++'                                             : /usr/bin/mpic++
> Checking for 'g++' (C++ compiler)                                         : /usr/bin/mpic++
> Checking whether the compiler works                                       : yes
> Checking that we have a C++ compiler                                      : yes
> Checking whether we are cross-compiling                                   : no
> Checking whether we can link to only static libraries                     : yes
> Trying to make compiler optimise for this machine                         : yes
> Trying to make compiler tune for this machine                             : yes
> Checking for compiler flags -O3                                           : yes
> Checking for compiler flags -ffast-math                                   : yes
> Checking for compiler flags -funroll-all-loops                            : yes
> Checking for compiler flags -fomit-frame-pointer                          : yes
> Checking for compiler flags -falign-loops                                 : yes
> Checking for compiler flags -fstrict-aliasing                             : yes
> Checking for compiler flags -momit-leaf-frame-pointer                     : yes
> Checking for cautious math flags                                          : yes
> Checking for Autovectorisation                                            : yes
> Checking for OpenMP                                                       : yes
> Checking for pthreads                                                     : yes
> Checking for compiler debug flags                                         : yes
> Checking for srandomdev                                                   : yes
> Checking for /dev/urandom                                                 : yes
> Checking for program 'h5cc'                                               : /usr/bin/h5cc
> Checking for HDF5 (static library)                                        : yes
> Checking for HDF5 High-level library (static library)                     : yes
> Checking for header hdf5.h                                                : yes
> Checking for libxmds (static library)                                     : no (will try dynamic library instead)
> Checking for libxmds (dynamic library)                                    : no (it's optional anyway)
> Checking for Intel's Vector Math Library (static library)                 : no (will try dynamic library instead)
> Checking for Intel's Vector Math Library (dynamic library)                : no (it's optional anyway)
> Checking safer dSFMT compile flags                                        : yes
> Checking for Intel's Math Kernel Library (static library)                 : no (will try dynamic library instead)
> Checking for Intel's Math Kernel Library (dynamic library)                : no (it's optional anyway)
> Checking for ATLAS's CBLAS. (static library)                              : no (will try dynamic library instead)
> Checking for ATLAS's CBLAS. (dynamic library)                             : yes
> Checking for GSL (static library)                                         : yes
> Checking for FFTW3 (static library)                                       : yes
> Checking for single-precision FFTW3 (static library)                      : yes
> Checking for threading support in FFTW3 (static library)                  : yes
> Checking for OpenMP support in FFTW3 (static library)                     : yes
> Checking for threading support in single-precision FFTW3 (static library) : yes
> Checking for OpenMP support in single-precision FFTW3 (static library)    : yes
> Checking for FFTW3 with MPI (static library)                              : yes
> Checking for single-precision FFTW3 with MPI (static library)             : yes
> ('Config log saved to ', '/<<PKGBUILDDIR>>/debian/xmds-user-data/waf_configure/config.log')
> test_bug_adaptive_timestep_hang (__main__.main.<locals>.ScriptTestCase)
> integrators/bug_adaptive_timestep_hang.xmds ... ok
> test_vibstring_ark45 (__main__.main.<locals>.ScriptTestCase)
> integrators/vibstring_ark45.xmds ... ok
> test_vibstring_ark89 (__main__.main.<locals>.ScriptTestCase)
> integrators/vibstring_ark89.xmds ... ok
> test_vibstring_mm (__main__.main.<locals>.ScriptTestCase)
> integrators/vibstring_mm.xmds ... ok
> test_vibstring_re (__main__.main.<locals>.ScriptTestCase)
> integrators/vibstring_re.xmds ... ok
> test_vibstring_rk4 (__main__.main.<locals>.ScriptTestCase)
> integrators/vibstring_rk4.xmds ... ok
> test_vibstring_rk45 (__main__.main.<locals>.ScriptTestCase)
> integrators/vibstring_rk45.xmds ... ok
> test_vibstring_rk89 (__main__.main.<locals>.ScriptTestCase)
> integrators/vibstring_rk89.xmds ... ok
> test_vibstring_rk9 (__main__.main.<locals>.ScriptTestCase)
> integrators/vibstring_rk9.xmds ... ok
> test_vibstring_si (__main__.main.<locals>.ScriptTestCase)
> integrators/vibstring_si.xmds ... ok
> test_RbGSdipoles (__main__.main.<locals>.ScriptTestCase)
> fast/RbGSdipoles.xmds ... FAIL
> test_bessel_cosine_evolution (__main__.main.<locals>.ScriptTestCase)
> fast/bessel_cosine_evolution.xmds ... ok
> test_bessel_cosine_groundstate (__main__.main.<locals>.ScriptTestCase)
> fast/bessel_cosine_groundstate.xmds ... ok
> test_cpc_example4 (__main__.main.<locals>.ScriptTestCase)
> fast/cpc_example4.xmds ... FAIL
> test_eigenvalues (__main__.main.<locals>.ScriptTestCase)
> fast/eigenvalues.xmds ... ok
> test_groundstate (__main__.main.<locals>.ScriptTestCase)
> fast/groundstate.xmds ... ok
> test_lorenz (__main__.main.<locals>.ScriptTestCase)
> fast/lorenz.xmds ... ok
> test_tla (__main__.main.<locals>.ScriptTestCase)
> fast/tla.xmds ... ok
> test_transverse_integration_in_vector_initialisation (__main__.main.<locals>.ScriptTestCase)
> fast/transverse_integration_in_vector_initialisation.xmds ... ok
> test_vibstring_circle_spectral (__main__.main.<locals>.ScriptTestCase)
> fast/vibstring_circle_spectral.xmds ... ok
> test_constant_complex_ip (__main__.main.<locals>.ScriptTestCase)
> operators/constant_complex_ip.xmds ... ok
> test_constant_complex_ip_2d (__main__.main.<locals>.ScriptTestCase)
> operators/constant_complex_ip_2d.xmds ... ok
> test_constant_complex_separated_ip_2d (__main__.main.<locals>.ScriptTestCase)
> operators/constant_complex_separated_ip_2d.xmds ... ok
> test_constant_double_ip (__main__.main.<locals>.ScriptTestCase)
> operators/constant_double_ip.xmds ... ok
> test_constant_ex (__main__.main.<locals>.ScriptTestCase)
> operators/constant_ex.xmds ... ok
> test_constant_ex_arbitrary_code (__main__.main.<locals>.ScriptTestCase)
> operators/constant_ex_arbitrary_code.xmds ... ok
> test_constant_ex_arbitrary_order (__main__.main.<locals>.ScriptTestCase)
> operators/constant_ex_arbitrary_order.xmds ... ok
> test_constant_ex_arbitrary_order2 (__main__.main.<locals>.ScriptTestCase)
> operators/constant_ex_arbitrary_order2.xmds ... ok
> test_constant_real_ip_2d (__main__.main.<locals>.ScriptTestCase)
> operators/constant_real_ip_2d.xmds ... ok
> test_cross_propagation (__main__.main.<locals>.ScriptTestCase)
> operators/cross_propagation.xmds ... ok
> test_cross_propagation2 (__main__.main.<locals>.ScriptTestCase)
> operators/cross_propagation2.xmds ... ok
> test_cross_propagation_right (__main__.main.<locals>.ScriptTestCase)
> operators/cross_propagation_right.xmds ... ok
> test_cross_propagation_sic (__main__.main.<locals>.ScriptTestCase)
> operators/cross_propagation_sic.xmds ... ok
> test_cross_propagation_sic_right (__main__.main.<locals>.ScriptTestCase)
> operators/cross_propagation_sic_right.xmds ... ok
> test_highdimcrossprop (__main__.main.<locals>.ScriptTestCase)
> operators/highdimcrossprop.xmds ... ok
> test_nonconstant_complex_ip (__main__.main.<locals>.ScriptTestCase)
> operators/nonconstant_complex_ip.xmds ... ok
> test_nonconstant_complex_ip_2d (__main__.main.<locals>.ScriptTestCase)
> operators/nonconstant_complex_ip_2d.xmds ... ok
> test_nonconstant_complex_separated_ip_2d (__main__.main.<locals>.ScriptTestCase)
> operators/nonconstant_complex_separated_ip_2d.xmds ... ok
> test_nonconstant_double_ip (__main__.main.<locals>.ScriptTestCase)
> operators/nonconstant_double_ip.xmds ... ok
> test_nonconstant_ex (__main__.main.<locals>.ScriptTestCase)
> operators/nonconstant_ex.xmds ... ok
> test_nonconstant_real_ip_2d (__main__.main.<locals>.ScriptTestCase)
> operators/nonconstant_real_ip_2d.xmds ... ok
> test_bessel_neumann_wave_equation (__main__.main.<locals>.ScriptTestCase)
> transforms/bessel_neumann_wave_equation.xmds ... ok
> test_bessel_transform (__main__.main.<locals>.ScriptTestCase)
> transforms/bessel_transform.xmds ... ok
> test_bessel_transform_rectangular (__main__.main.<locals>.ScriptTestCase)
> transforms/bessel_transform_rectangular.xmds ... ok
> test_diffusion_bessel (__main__.main.<locals>.ScriptTestCase)
> transforms/diffusion_bessel.xmds ... ok
> test_diffusion_dst (__main__.main.<locals>.ScriptTestCase)
> transforms/diffusion_dst.xmds ... ok
> test_disc (__main__.main.<locals>.ScriptTestCase)
> transforms/disc.xmds ... ok
> test_hermitegauss_fourier (__main__.main.<locals>.ScriptTestCase)
> transforms/hermitegauss_fourier.xmds ... ok
> test_hermitegauss_fourier_loading (__main__.main.<locals>.ScriptTestCase)
> transforms/hermitegauss_fourier_loading.xmds ... ok
> test_hermitegauss_transform_2d (__main__.main.<locals>.ScriptTestCase)
> transforms/hermitegauss_transform_2d.xmds ... ok
> test_hermitegauss_transform_2d_chunked (__main__.main.<locals>.ScriptTestCase)
> transforms/hermitegauss_transform_2d_chunked.xmds ... ok
> test_spherical_ball (__main__.main.<locals>.ScriptTestCase)
> transforms/spherical_ball.xmds ... ok
> test_vibstring_dct (__main__.main.<locals>.ScriptTestCase)
> transforms/vibstring_dct.xmds ... ok
> test_diffusion_mpi (__main__.main.<locals>.ScriptTestCase)
> mpi/diffusion_mpi.xmds ... FAIL
> test_diffusion_mpi_chunked (__main__.main.<locals>.ScriptTestCase)
> mpi/diffusion_mpi_chunked.xmds ... FAIL
> test_eigenvalues (__main__.main.<locals>.ScriptTestCase)
> mpi/eigenvalues.xmds ... FAIL
> test_fibre_integer_dimensions_mpi (__main__.main.<locals>.ScriptTestCase)
> mpi/fibre_integer_dimensions_mpi.xmds ... FAIL
> test_hermitegauss_transform_2d_mpi (__main__.main.<locals>.ScriptTestCase)
> mpi/hermitegauss_transform_2d_mpi.xmds ... FAIL
> test_hermitegauss_transform_2d_mpi_small (__main__.main.<locals>.ScriptTestCase)
> mpi/hermitegauss_transform_2d_mpi_small.xmds ... FAIL
> test_kubo_adaptive_mpi_paths (__main__.main.<locals>.ScriptTestCase)
> mpi/kubo_adaptive_mpi_paths.xmds ... FAIL
> test_kubo_integer_dimensions_mpi (__main__.main.<locals>.ScriptTestCase)
> mpi/kubo_integer_dimensions_mpi.xmds ... FAIL
> test_kubo_mpi_paths (__main__.main.<locals>.ScriptTestCase)
> mpi/kubo_mpi_paths.xmds ... FAIL
> test_lorenz_mpi (__main__.main.<locals>.ScriptTestCase)
> mpi/lorenz_mpi.xmds ... FAIL
> test_mpi_dft (__main__.main.<locals>.ScriptTestCase)
> mpi/mpi_dft.xmds ... FAIL
> test_mpi_dft_hdf5 (__main__.main.<locals>.ScriptTestCase)
> mpi/mpi_dft_hdf5.xmds ... FAIL
> test_mpi_dft_small (__main__.main.<locals>.ScriptTestCase)
> mpi/mpi_dft_small.xmds ... FAIL
> test_mpi_forward_plan_bug (__main__.main.<locals>.ScriptTestCase)
> mpi/mpi_forward_plan_bug.xmds ... FAIL
> test_mpi_highdimcrossprop (__main__.main.<locals>.ScriptTestCase)
> mpi/mpi_highdimcrossprop.xmds ... FAIL
> test_partial_integration_computed_vector (__main__.main.<locals>.ScriptTestCase)
> mpi/partial_integration_computed_vector.xmds ... FAIL
> test_vibstring_dst_mpi (__main__.main.<locals>.ScriptTestCase)
> mpi/vibstring_dst_mpi.xmds ... FAIL
> test_vibstring_dst_mpi_chunked (__main__.main.<locals>.ScriptTestCase)
> mpi/vibstring_dst_mpi_chunked.xmds ... FAIL
> test_vibstring_mpi_aliases (__main__.main.<locals>.ScriptTestCase)
> mpi/vibstring_mpi_aliases.xmds ... FAIL
> test_integer_dimensions (__main__.main.<locals>.ScriptTestCase)
> geometry/integer_dimensions.xmds ... ok
> test_integer_dimensions_reordered (__main__.main.<locals>.ScriptTestCase)
> geometry/integer_dimensions_reordered.xmds ... ok
> test_nonlocal_access_multiple_components (__main__.main.<locals>.ScriptTestCase)
> geometry/nonlocal_access_multiple_components.xmds ... ok
> test_nonlocal_edge_uniform_access (__main__.main.<locals>.ScriptTestCase)
> geometry/nonlocal_edge_uniform_access.xmds ... ok
> test_nonlocal_index_access (__main__.main.<locals>.ScriptTestCase)
> geometry/nonlocal_index_access.xmds ... ok
> test_nonlocal_negative_uniform_access (__main__.main.<locals>.ScriptTestCase)
> geometry/nonlocal_negative_uniform_access.xmds ... ok
> test_nonlocal_split_uniform_access (__main__.main.<locals>.ScriptTestCase)
> geometry/nonlocal_split_uniform_access.xmds ... ok
> test_bessel_cosine_stochastic_groundstate (__main__.main.<locals>.ScriptTestCase)
> stochastic/bessel_cosine_stochastic_groundstate.xmds ... ok
> test_double_precision_noise_tests (__main__.main.<locals>.ScriptTestCase)
> stochastic/double_precision_noise_tests.xmds ... ok
> test_dsfmt_single_precision (__main__.main.<locals>.ScriptTestCase)
> stochastic/dsfmt_single_precision.xmds ... ok
> test_fibre (__main__.main.<locals>.ScriptTestCase)
> stochastic/fibre.xmds ... ok
> test_fibre_with_correlation_functions (__main__.main.<locals>.ScriptTestCase)
> stochastic/fibre_with_correlation_functions.xmds ... ok
> test_kubo (__main__.main.<locals>.ScriptTestCase)
> stochastic/kubo.xmds ... ok
> test_kubo_fixedstep (__main__.main.<locals>.ScriptTestCase)
> stochastic/kubo_fixedstep.xmds ... ok
> test_photodetector (__main__.main.<locals>.ScriptTestCase)
> stochastic/photodetector.xmds ... ok
> test_photodetector_bessel (__main__.main.<locals>.ScriptTestCase)
> stochastic/photodetector_bessel.xmds ... ok
> test_photodetector_linear (__main__.main.<locals>.ScriptTestCase)
> stochastic/photodetector_linear.xmds ... ok
> test_wigner_cool_HO (__main__.main.<locals>.ScriptTestCase)
> stochastic/wigner_cool_HO.xmds ... ok
> test_bessel_cosine_groundstate (__main__.main.<locals>.ScriptTestCase)
> openmp/bessel_cosine_groundstate.xmds ... ok
> test_diffusion_openmp (__main__.main.<locals>.ScriptTestCase)
> openmp/diffusion_openmp.xmds ... ok
> test_diffusion_openmp_chunked (__main__.main.<locals>.ScriptTestCase)
> openmp/diffusion_openmp_chunked.xmds ... ok
> test_eigenvalues (__main__.main.<locals>.ScriptTestCase)
> openmp/eigenvalues.xmds ... ok
> test_hermitegauss_transform_2d_openmp (__main__.main.<locals>.ScriptTestCase)
> openmp/hermitegauss_transform_2d_openmp.xmds ... ok
> test_hermitegauss_transform_2d_openmp_small (__main__.main.<locals>.ScriptTestCase)
> openmp/hermitegauss_transform_2d_openmp_small.xmds ... ok
> test_kubo_integer_dimensions_openmp (__main__.main.<locals>.ScriptTestCase)
> openmp/kubo_integer_dimensions_openmp.xmds ... ok
> test_lorenz_openmp (__main__.main.<locals>.ScriptTestCase)
> openmp/lorenz_openmp.xmds ... ok
> test_openmp_dft (__main__.main.<locals>.ScriptTestCase)
> openmp/openmp_dft.xmds ... ok
> test_openmp_dft_hdf5 (__main__.main.<locals>.ScriptTestCase)
> openmp/openmp_dft_hdf5.xmds ... ok
> test_openmp_dft_small (__main__.main.<locals>.ScriptTestCase)
> openmp/openmp_dft_small.xmds ... ok
> test_openmp_highdimcrossprop (__main__.main.<locals>.ScriptTestCase)
> openmp/openmp_highdimcrossprop.xmds ... ok
> test_partial_integration_computed_vector (__main__.main.<locals>.ScriptTestCase)
> openmp/partial_integration_computed_vector.xmds ... ok
> test_vibstring_dst_openmp (__main__.main.<locals>.ScriptTestCase)
> openmp/vibstring_dst_openmp.xmds ... ok
> test_vibstring_dst_openmp_chunked (__main__.main.<locals>.ScriptTestCase)
> openmp/vibstring_dst_openmp_chunked.xmds ... ok
> test_breakpoints_hdf5 (__main__.main.<locals>.ScriptTestCase)
> runtime_lattice/breakpoints_hdf5.xmds ... ok
> test_integer_dimensions (__main__.main.<locals>.ScriptTestCase)
> runtime_lattice/integer_dimensions.xmds ... ok
> test_integer_dimensions_with_fixed_lattice (__main__.main.<locals>.ScriptTestCase)
> runtime_lattice/integer_dimensions_with_fixed_lattice.xmds ... ok
> test_integer_dimensions_with_runtime_lattice (__main__.main.<locals>.ScriptTestCase)
> runtime_lattice/integer_dimensions_with_runtime_lattice.xmds ... ok
> test_runtime_lattice_diffusion_dst (__main__.main.<locals>.ScriptTestCase)
> runtime_lattice/runtime_lattice_diffusion_dst.xmds ... ok
> test_runtime_lattice_initialisation_order (__main__.main.<locals>.ScriptTestCase)
> runtime_lattice/runtime_lattice_initialisation_order.xmds ... ok
> test_runtime_lattice_mpi_dft_small (__main__.main.<locals>.ScriptTestCase)
> runtime_lattice/runtime_lattice_mpi_dft_small.xmds ... FAIL
> test_runtime_lattice_nonlocal_split_uniform_access (__main__.main.<locals>.ScriptTestCase)
> runtime_lattice/runtime_lattice_nonlocal_split_uniform_access.xmds ... ok
> test_runtime_lattice_vibstring_ark89 (__main__.main.<locals>.ScriptTestCase)
> runtime_lattice/runtime_lattice_vibstring_ark89.xmds ... ok
> test_runtime_lattice_xsilloading_hdf5_loose2 (__main__.main.<locals>.ScriptTestCase)
> runtime_lattice/runtime_lattice_xsilloading_hdf5_loose2.xmds ... ok
> test_initialisation_order (__main__.main.<locals>.ScriptTestCase)
> vectors/initialisation_order.xmds ... ok
> test_initialisation_order_chunked (__main__.main.<locals>.ScriptTestCase)
> vectors/initialisation_order_chunked.xmds ... ok
> test_partial_integration_computed_vector (__main__.main.<locals>.ScriptTestCase)
> vectors/partial_integration_computed_vector.xmds ... ok
> test_breakpoints (__main__.main.<locals>.ScriptTestCase)
> io/breakpoints.xmds ... ok
> test_breakpoints_hdf5 (__main__.main.<locals>.ScriptTestCase)
> io/breakpoints_hdf5.xmds ... ok
> test_mpi_xsilloading_hdf5 (__main__.main.<locals>.ScriptTestCase)
> io/mpi_xsilloading_hdf5.xmds ... FAIL
> test_mpi_xsilloading_hdf5_loose (__main__.main.<locals>.ScriptTestCase)
> io/mpi_xsilloading_hdf5_loose.xmds ... FAIL
> test_mpi_xsilloading_hdf5_loose2 (__main__.main.<locals>.ScriptTestCase)
> io/mpi_xsilloading_hdf5_loose2.xmds ... FAIL
> test_nlse_sampling (__main__.main.<locals>.ScriptTestCase)
> io/nlse_sampling.xmds ... ok
> test_xsilloading_hdf5 (__main__.main.<locals>.ScriptTestCase)
> io/xsilloading_hdf5.xmds ... ok
> test_xsilloading_hdf5_loose (__main__.main.<locals>.ScriptTestCase)
> io/xsilloading_hdf5_loose.xmds ... ok
> test_xsilloading_hdf5_loose2 (__main__.main.<locals>.ScriptTestCase)
> io/xsilloading_hdf5_loose2.xmds ... ok
> test_arguments (__main__.main.<locals>.ScriptTestCase)
> features/arguments.xmds ... ok
> test_arguments_append_args_to_output_filename (__main__.main.<locals>.ScriptTestCase)
> features/arguments_append_args_to_output_filename.xmds ... ok
> test_arguments_with_similar_names (__main__.main.<locals>.ScriptTestCase)
> features/arguments_with_similar_names.xmds ... ok
> test_error_check_multipath (__main__.main.<locals>.ScriptTestCase)
> features/error_check_multipath.xmds ... ok
> test_halt_non_finite (__main__.main.<locals>.ScriptTestCase)
> features/halt_non_finite.xmds ... ok
> test_hermitegauss_transform_2d_chunked_breakpoints (__main__.main.<locals>.ScriptTestCase)
> features/hermitegauss_transform_2d_chunked_breakpoints.xmds ... ok
> test_hermitegauss_transform_2d_chunked_breakpoints_hdf5 (__main__.main.<locals>.ScriptTestCase)
> features/hermitegauss_transform_2d_chunked_breakpoints_hdf5.xmds ... ok
> test_realistic_Rb_and_fields (__main__.main.<locals>.ScriptTestCase)
> features/realistic_Rb_and_fields.xmds ... ok
> test_runtime_paths (__main__.main.<locals>.ScriptTestCase)
> features/runtime_paths.xmds ... ok
> test_space in filename (__main__.main.<locals>.ScriptTestCase)
> features/space in filename.xmds ... ok
> test_assignmentToIncorrectVariable (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_basic (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_combined (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_complicatedSafeOperation (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_hiddenUnsafeOperation (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_missingAssignment (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_realExample (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_safeBinaryOperation (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_safeSubtraction (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_unsafeBinaryOperation (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_unsafeSubtraction (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_unsafeUnaryOperation (xpdeint.CodeParser.IPOperatorSanityCheckTests) ... ok
> test_doubleDivisionByInteger (xpdeint.CodeParser.IntegerDivisionTests) ... ok
> test_floatDivision (xpdeint.CodeParser.IntegerDivisionTests) ... ok
> test_ignoreComments (xpdeint.CodeParser.IntegerDivisionTests) ... ok
> test_ignoreStrings (xpdeint.CodeParser.IntegerDivisionTests) ... ok
> test_integerDivision (xpdeint.CodeParser.IntegerDivisionTests) ... ok
> test_integerDivisionByDouble (xpdeint.CodeParser.IntegerDivisionTests) ... ok
> test_symbolDivision (xpdeint.CodeParser.IntegerDivisionTests) ... ok
> test_accessDifferentVariables (xpdeint.CodeParser.NonlocalDimensionAccessForComponentsTests) ... ok
> test_accessMultipleTimes (xpdeint.CodeParser.NonlocalDimensionAccessForComponentsTests) ... ok
> test_basic (xpdeint.CodeParser.NonlocalDimensionAccessForComponentsTests) ... ok
> test_combined (xpdeint.CodeParser.NonlocalDimensionAccessForComponentsTests) ... ok
> test_multipleAccess (xpdeint.CodeParser.NonlocalDimensionAccessForComponentsTests) ... ok
> test_notGreedy (xpdeint.CodeParser.NonlocalDimensionAccessForComponentsTests) ... ok
> test_withPrintf (xpdeint.CodeParser.NonlocalDimensionAccessForComponentsTests) ... ok
> test_combined (xpdeint.CodeParser.TargetComponentsForOperatorsInStringTests) ... ok
> test_ignoreChildComment (xpdeint.CodeParser.TargetComponentsForOperatorsInStringTests) ... ok
> test_ignoreSiblingComment (xpdeint.CodeParser.TargetComponentsForOperatorsInStringTests) ... ok
> test_ignoreSiblingQuotedString (xpdeint.CodeParser.TargetComponentsForOperatorsInStringTests) ... ok
> test_invalidSyntax (xpdeint.CodeParser.TargetComponentsForOperatorsInStringTests) ... ok
> test_nestedOperators (xpdeint.CodeParser.TargetComponentsForOperatorsInStringTests) ... ok
> test_notGreedy (xpdeint.CodeParser.TargetComponentsForOperatorsInStringTests) ... ok
> test_unbalancedString (xpdeint.CodeParser.TargetComponentsForOperatorsInStringTests) ... ok
> test_withPrintf (xpdeint.CodeParser.TargetComponentsForOperatorsInStringTests) ... ok
> 
> ======================================================================
> FAIL: test_RbGSdipoles (__main__.main.<locals>.ScriptTestCase)
> fast/RbGSdipoles.xmds
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/./run_tests.py", line 328, in newfunc
>     return func(*(args + fargs), **newkeywords)
>   File "/<<PKGBUILDDIR>>/./run_tests.py", line 172, in scriptTestingFunction
>     self.assertTrue(returnCode == 0, "Failed to execute compiled simulation 
> correctly. Got returnCode %(returnCode)i;\nstdout = %(stdout)s;\nstderr = 
> %(stderr)s\n" % locals())
> AssertionError: False is not true : Failed to execute compiled simulation 
> correctly. Got returnCode 1;
> stdout = b'';
> stderr = b"[ip-172-31-1-222:11215] [[18447,0],0] ORTE_ERROR_LOG: Not found in 
> file ../../../../../../orte/mca/ess/hnp/ess_hnp_module.c at line 
> 320\n--------------------------------------------------------------------------\nIt
>  looks like orte_init failed for some reason; your parallel process 
> is\nlikely to abort.  There are many reasons that a parallel process 
> can\nfail during orte_init; some of which are due to configuration 
> or\nenvironment problems.  This failure appears to be an internal 
> failure;\nhere's some additional information (which may only be relevant to 
> an\nOpen MPI developer):\n\n  opal_pmix_base_select failed\n  --> Returned 
> value Not found (-13) instead of 
> ORTE_SUCCESS\n--------------------------------------------------------------------------\n"
> 
> 
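[Editor's note: the assertion quoted in the traceback above is the harness's own check from run_tests.py; each failure here is the compiled simulation exiting non-zero because orte_init aborts at startup, not a wrong numerical result. As a minimal sketch of that return-code check (not xmds2's actual harness; the helper name and binary path are illustrative):]

```python
import subprocess

def run_simulation(binary, args=()):
    # Run a compiled simulation binary and capture its exit status and
    # output, mirroring the (returnCode, stdout, stderr) triple that the
    # quoted assertion in run_tests.py interpolates into its message.
    # 'binary' is a hypothetical path; the real harness first compiles
    # each .xmds script with xmds2 before executing it.
    proc = subprocess.run([binary, *args], capture_output=True)
    return proc.returncode, proc.stdout, proc.stderr

# The harness then asserts returnCode == 0; in the failures above the
# MPI-launched binaries return 1 with empty stdout and the ORTE error
# on stderr, which is exactly what the assertion message reports.
```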
> ======================================================================
> FAIL: test_cpc_example4 (__main__.main.<locals>.ScriptTestCase)
> fast/cpc_example4.xmds
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/./run_tests.py", line 328, in newfunc
>     return func(*(args + fargs), **newkeywords)
>   File "/<<PKGBUILDDIR>>/./run_tests.py", line 172, in scriptTestingFunction
>     self.assertTrue(returnCode == 0, "Failed to execute compiled simulation 
> correctly. Got returnCode %(returnCode)i;\nstdout = %(stdout)s;\nstderr = 
> %(stderr)s\n" % locals())
> AssertionError: False is not true : Failed to execute compiled simulation 
> correctly. Got returnCode 1;
> stdout = b'';
> stderr = b"[ip-172-31-1-222:11268] [[20420,0],0] ORTE_ERROR_LOG: Not found in 
> file ../../../../../../orte/mca/ess/hnp/ess_hnp_module.c at line 
> 320\n--------------------------------------------------------------------------\nIt
>  looks like orte_init failed for some reason; your parallel process 
> is\nlikely to abort.  There are many reasons that a parallel process 
> can\nfail during orte_init; some of which are due to configuration 
> or\nenvironment problems.  This failure appears to be an internal 
> failure;\nhere's some additional information (which may only be relevant to 
> an\nOpen MPI developer):\n\n  opal_pmix_base_select failed\n  --> Returned 
> value Not found (-13) instead of 
> ORTE_SUCCESS\n--------------------------------------------------------------------------\n"
> 
> 
> ======================================================================
> FAIL: test_diffusion_mpi (__main__.main.<locals>.ScriptTestCase)
> mpi/diffusion_mpi.xmds
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/./run_tests.py", line 328, in newfunc
>     return func(*(args + fargs), **newkeywords)
>   File "/<<PKGBUILDDIR>>/./run_tests.py", line 172, in scriptTestingFunction
>     self.assertTrue(returnCode == 0, "Failed to execute compiled simulation 
> correctly. Got returnCode %(returnCode)i;\nstdout = %(stdout)s;\nstderr = 
> %(stderr)s\n" % locals())
> AssertionError: False is not true : Failed to execute compiled simulation 
> correctly. Got returnCode 1;
> stdout = b'';
> stderr:
> [ip-172-31-1-222:11950] [[19822,0],0] ORTE_ERROR_LOG: Not found in file ../../../../../../orte/mca/ess/hnp/ess_hnp_module.c at line 320
> --------------------------------------------------------------------------
> It looks like orte_init failed for some reason; your parallel process is
> likely to abort.  There are many reasons that a parallel process can
> fail during orte_init; some of which are due to configuration or
> environment problems.  This failure appears to be an internal failure;
> here's some additional information (which may only be relevant to an
> Open MPI developer):
> 
>   opal_pmix_base_select failed
>   --> Returned value Not found (-13) instead of ORTE_SUCCESS
> --------------------------------------------------------------------------
> 
> 
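Every failure above follows the same pattern: run_tests.py launches the compiled simulation as a subprocess and asserts a zero return code, so any MPI runtime breakage surfaces as this AssertionError. A minimal sketch of that pattern, with hypothetical names (this is not the actual run_tests.py code, only an illustration of the quoted assertion):

```python
import subprocess

def run_script_test(command):
    # Launch the compiled simulation and capture its streams, as the
    # quoted scriptTestingFunction does before checking the return code.
    proc = subprocess.Popen(command,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    stdout, stderr = proc.communicate()
    returnCode = proc.returncode
    # Same message format as the failures quoted in the log.
    assert returnCode == 0, (
        "Failed to execute compiled simulation correctly. "
        "Got returnCode %(returnCode)i;\nstdout = %(stdout)s;\nstderr = %(stderr)s\n"
        % locals()
    )
    return stdout, stderr
```

When the simulation binary exits non-zero (here, because orte_init aborted the process before MPI even started), the assertion fires with exactly the "Got returnCode 1" message seen above.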
> ======================================================================
> FAIL: test_diffusion_mpi_chunked (__main__.main.<locals>.ScriptTestCase)
> mpi/diffusion_mpi_chunked.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_eigenvalues (__main__.main.<locals>.ScriptTestCase)
> mpi/eigenvalues.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_fibre_integer_dimensions_mpi 
> (__main__.main.<locals>.ScriptTestCase)
> mpi/fibre_integer_dimensions_mpi.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_hermitegauss_transform_2d_mpi 
> (__main__.main.<locals>.ScriptTestCase)
> mpi/hermitegauss_transform_2d_mpi.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_hermitegauss_transform_2d_mpi_small 
> (__main__.main.<locals>.ScriptTestCase)
> mpi/hermitegauss_transform_2d_mpi_small.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_kubo_adaptive_mpi_paths (__main__.main.<locals>.ScriptTestCase)
> mpi/kubo_adaptive_mpi_paths.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_kubo_integer_dimensions_mpi (__main__.main.<locals>.ScriptTestCase)
> mpi/kubo_integer_dimensions_mpi.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_kubo_mpi_paths (__main__.main.<locals>.ScriptTestCase)
> mpi/kubo_mpi_paths.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_lorenz_mpi (__main__.main.<locals>.ScriptTestCase)
> mpi/lorenz_mpi.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_mpi_dft (__main__.main.<locals>.ScriptTestCase)
> mpi/mpi_dft.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_mpi_dft_hdf5 (__main__.main.<locals>.ScriptTestCase)
> mpi/mpi_dft_hdf5.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_mpi_dft_small (__main__.main.<locals>.ScriptTestCase)
> mpi/mpi_dft_small.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_mpi_forward_plan_bug (__main__.main.<locals>.ScriptTestCase)
> mpi/mpi_forward_plan_bug.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_mpi_highdimcrossprop (__main__.main.<locals>.ScriptTestCase)
> mpi/mpi_highdimcrossprop.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_partial_integration_computed_vector 
> (__main__.main.<locals>.ScriptTestCase)
> mpi/partial_integration_computed_vector.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_vibstring_dst_mpi (__main__.main.<locals>.ScriptTestCase)
> mpi/vibstring_dst_mpi.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_vibstring_dst_mpi_chunked (__main__.main.<locals>.ScriptTestCase)
> mpi/vibstring_dst_mpi_chunked.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_vibstring_mpi_aliases (__main__.main.<locals>.ScriptTestCase)
> mpi/vibstring_mpi_aliases.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_runtime_lattice_mpi_dft_small 
> (__main__.main.<locals>.ScriptTestCase)
> runtime_lattice/runtime_lattice_mpi_dft_small.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_mpi_xsilloading_hdf5 (__main__.main.<locals>.ScriptTestCase)
> io/mpi_xsilloading_hdf5.xmds
> ----------------------------------------------------------------------
> [traceback and stderr identical to test_diffusion_mpi above; only the PID
> and ORTE job id differ: returnCode 1, orte_init failed,
> opal_pmix_base_select returned "Not found (-13)"]
> 
> 
> ======================================================================
> FAIL: test_mpi_xsilloading_hdf5_loose (__main__.main.<locals>.ScriptTestCase)
> io/mpi_xsilloading_hdf5_loose.xmds
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/./run_tests.py", line 328, in newfunc
>     return func(*(args + fargs), **newkeywords)
>   File "/<<PKGBUILDDIR>>/./run_tests.py", line 172, in scriptTestingFunction
>     self.assertTrue(returnCode == 0, "Failed to execute compiled simulation 
> correctly. Got returnCode %(returnCode)i;\nstdout = %(stdout)s;\nstderr = 
> %(stderr)s\n" % locals())
> AssertionError: False is not true : Failed to execute compiled simulation 
> correctly. Got returnCode 1;
> stdout = b'';
> stderr = b"[ip-172-31-1-222:13195] [[20555,0],0] ORTE_ERROR_LOG: Not found in 
> file ../../../../../../orte/mca/ess/hnp/ess_hnp_module.c at line 
> 320\n--------------------------------------------------------------------------\nIt
>  looks like orte_init failed for some reason; your parallel process 
> is\nlikely to abort.  There are many reasons that a parallel process 
> can\nfail during orte_init; some of which are due to configuration 
> or\nenvironment problems.  This failure appears to be an internal 
> failure;\nhere's some additional information (which may only be relevant to 
> an\nOpen MPI developer):\n\n  opal_pmix_base_select failed\n  --> Returned 
> value Not found (-13) instead of 
> ORTE_SUCCESS\n--------------------------------------------------------------------------\n"
> 
> 
> ======================================================================
> FAIL: test_mpi_xsilloading_hdf5_loose2 (__main__.main.<locals>.ScriptTestCase)
> io/mpi_xsilloading_hdf5_loose2.xmds
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/<<PKGBUILDDIR>>/./run_tests.py", line 328, in newfunc
>     return func(*(args + fargs), **newkeywords)
>   File "/<<PKGBUILDDIR>>/./run_tests.py", line 172, in scriptTestingFunction
>     self.assertTrue(returnCode == 0, "Failed to execute compiled simulation 
> correctly. Got returnCode %(returnCode)i;\nstdout = %(stdout)s;\nstderr = 
> %(stderr)s\n" % locals())
> AssertionError: False is not true : Failed to execute compiled simulation 
> correctly. Got returnCode 1;
> stdout = b'';
> stderr = b"[ip-172-31-1-222:13214] [[20574,0],0] ORTE_ERROR_LOG: Not found in 
> file ../../../../../../orte/mca/ess/hnp/ess_hnp_module.c at line 
> 320\n--------------------------------------------------------------------------\nIt
>  looks like orte_init failed for some reason; your parallel process 
> is\nlikely to abort.  There are many reasons that a parallel process 
> can\nfail during orte_init; some of which are due to configuration 
> or\nenvironment problems.  This failure appears to be an internal 
> failure;\nhere's some additional information (which may only be relevant to 
> an\nOpen MPI developer):\n\n  opal_pmix_base_select failed\n  --> Returned 
> value Not found (-13) instead of 
> ORTE_SUCCESS\n--------------------------------------------------------------------------\n"
> 
> 
> ----------------------------------------------------------------------
> Ran 172 tests in 406.785s
> 
> FAILED (failures=25)
> Saving test results in /<<PKGBUILDDIR>>/testsuite_results
> make[1]: *** [debian/rules:15: override_dh_auto_test] Error 1

The full build log is available from:
   http://qa-logs.debian.net/2020/12/26/xmds2_3.0.0+dfsg-4_unstable.log

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as affecting
this package. See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify whether something relevant changed in the meantime.

About the archive rebuild: The rebuild was done on EC2 VM instances from
Amazon Web Services, using a clean, minimal and up-to-date chroot. Every
failed build was retried once to eliminate random failures.

--- End Message ---
--- Begin Message ---
Hi,

This failure was caused by a bug in openmpi, fixed in openmpi 4.1.0-2,
so I'm closing this bug.

Lucas

--- End Message ---