Your message dated Sat, 04 May 2024 18:03:56 +0200
with message-id <9752a5007e505b40ac4ff58c858c3...@debian.org>
and subject line Re: Bug#1069472 mpi4py-fft: FTBFS on armhf: tests fail
has caused the Debian Bug report #1069472,
regarding mpi4py-fft: FTBFS on armhf: tests fail
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case, it is now your responsibility to reopen the
Bug report if necessary and/or to fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact ow...@bugs.debian.org
immediately.)


-- 
1069472: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1069472
Debian Bug Tracking System
Contact ow...@bugs.debian.org with problems
--- Begin Message ---
Source: mpi4py-fft
Version: 2.0.5-2
Severity: serious
Justification: FTBFS
Tags: trixie sid ftbfs
User: lu...@debian.org
Usertags: ftbfs-20240420 ftbfs-trixie ftbfs-t64-armhf

Hi,

During a rebuild of all packages in sid, your package failed to build
on armhf.


Relevant part (hopefully):
> make[2]: Entering directory '/<<PKGBUILDDIR>>/build/texinfo'
> makeinfo --no-split -o 'mpi4py-fft.info' 'mpi4py-fft.texi'
> mpi4py-fft.texi:1706: warning: could not find @image file 
> `mpi4py-fft-figures//<<PKGBUILDDIR>>/build/texinfo/.doctrees/images/feaa82f44023d4f401d0d133eb689f35762c6507/mpi4py-fft.txt'
>  nor alternate text
> make[2]: Leaving directory '/<<PKGBUILDDIR>>/build/texinfo'
> sed 
> "s|src=\"\(.*\).png\"|src=\"/usr/share/doc/python3-mpi4py-fft/html/_images/\1.png\"|g"
>  -i build/texinfo/mpi4py-fft.info
> sed "s|src=\"\(.*\).svg\"|src=\"\"|g" -i build/texinfo/mpi4py-fft.info
> sed "s|alt=\"Documentation Status\" 
> src=\"https://readthedocs.org/projects/mpi4py-fft/badge/?version=latest\";|alt=\"Latest
>  Documentation\" src=\"\"|" -i build/html/*.html
> sed "s|src=\"https://circleci.com.*svg\";|src=\"\"|" -i build/html/*.html
> make[1]: Leaving directory '/<<PKGBUILDDIR>>'
>    dh_auto_test -O--buildsystem=pybuild
> I: pybuild base:311: cd 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_mpi4py-fft/build; python3.12 -m 
> unittest discover -v 
> --------------------------------------------------------------------------
> Sorry!  You were supposed to get help about:
>     pmix_init:startup:internal-failure
> But I couldn't open the help file:
>     /usr/share/pmix/help-pmix-runtime.txt: No such file or directory.  Sorry!
> --------------------------------------------------------------------------
> [ip-10-84-234-21:102862] PMIX ERROR: NOT-FOUND in file 
> ../../../../../../../../opal/mca/pmix/pmix3x/pmix/src/server/pmix_server.c at 
> line 237
> [ip-10-84-234-21:102861] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start 
> a daemon on the local node in file 
> ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 716
> [ip-10-84-234-21:102861] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start 
> a daemon on the local node in file 
> ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 172
> --------------------------------------------------------------------------
> It looks like orte_init failed for some reason; your parallel process is
> likely to abort.  There are many reasons that a parallel process can
> fail during orte_init; some of which are due to configuration or
> environment problems.  This failure appears to be an internal failure;
> here's some additional information (which may only be relevant to an
> Open MPI developer):
> 
>   orte_ess_init failed
>   --> Returned value Unable to start a daemon on the local node (-127) 
> instead of ORTE_SUCCESS
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> It looks like MPI_INIT failed for some reason; your parallel process is
> likely to abort.  There are many reasons that a parallel process can
> fail during MPI_INIT; some of which are due to configuration or environment
> problems.  This failure appears to be an internal failure; here's some
> additional information (which may only be relevant to an Open MPI
> developer):
> 
>   ompi_mpi_init: ompi_rte_init failed
>   --> Returned "Unable to start a daemon on the local node" (-127) instead of 
> "Success" (0)
> --------------------------------------------------------------------------
> *** An error occurred in MPI_Init_thread
> *** on a NULL communicator
> *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
> ***    and potentially your MPI job)
> [ip-10-84-234-21:102861] Local abort before MPI_INIT completed completed 
> successfully, but am not able to aggregate error messages, and not able to 
> guarantee that all other processes were killed!
> E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_mpi4py-fft/build; python3.12 -m 
> unittest discover -v 
> I: pybuild base:311: cd 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_mpi4py-fft/build; python3.11 -m 
> unittest discover -v 
> --------------------------------------------------------------------------
> Sorry!  You were supposed to get help about:
>     pmix_init:startup:internal-failure
> But I couldn't open the help file:
>     /usr/share/pmix/help-pmix-runtime.txt: No such file or directory.  Sorry!
> --------------------------------------------------------------------------
> [ip-10-84-234-21:102865] PMIX ERROR: NOT-FOUND in file 
> ../../../../../../../../opal/mca/pmix/pmix3x/pmix/src/server/pmix_server.c at 
> line 237
> [ip-10-84-234-21:102864] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start 
> a daemon on the local node in file 
> ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 716
> [ip-10-84-234-21:102864] [[INVALID],INVALID] ORTE_ERROR_LOG: Unable to start 
> a daemon on the local node in file 
> ../../../../../../orte/mca/ess/singleton/ess_singleton_module.c at line 172
> --------------------------------------------------------------------------
> It looks like orte_init failed for some reason; your parallel process is
> likely to abort.  There are many reasons that a parallel process can
> fail during orte_init; some of which are due to configuration or
> environment problems.  This failure appears to be an internal failure;
> here's some additional information (which may only be relevant to an
> Open MPI developer):
> 
>   orte_ess_init failed
>   --> Returned value Unable to start a daemon on the local node (-127) 
> instead of ORTE_SUCCESS
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> It looks like MPI_INIT failed for some reason; your parallel process is
> likely to abort.  There are many reasons that a parallel process can
> fail during MPI_INIT; some of which are due to configuration or environment
> problems.  This failure appears to be an internal failure; here's some
> additional information (which may only be relevant to an Open MPI
> developer):
> 
>   ompi_mpi_init: ompi_rte_init failed
>   --> Returned "Unable to start a daemon on the local node" (-127) instead of 
> "Success" (0)
> --------------------------------------------------------------------------
> *** An error occurred in MPI_Init_thread
> *** on a NULL communicator
> *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
> ***    and potentially your MPI job)
> [ip-10-84-234-21:102864] Local abort before MPI_INIT completed completed 
> successfully, but am not able to aggregate error messages, and not able to 
> guarantee that all other processes were killed!
> E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd 
> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_mpi4py-fft/build; python3.11 -m 
> unittest discover -v 
> dh_auto_test: error: pybuild --test -i python{version} -p "3.12 3.11" 
> returned exit code 13
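
For context: the failure above happens inside MPI_Init itself, before any
mpi4py-fft test gets to run. A minimal sketch that exercises the same code
path (assuming only that python3-mpi4py is installed; the script name is
illustrative):

    # mpi_init_check.py (hypothetical name) -- smallest possible MPI check.
    # Importing mpi4py.MPI calls MPI_Init at import time by default, so on
    # a system with the broken Open MPI singleton startup this import
    # aborts with the same ORTE/PMIX errors shown above.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    print("rank", comm.Get_rank(), "of", comm.Get_size())

Running it as a plain singleton process (python3 mpi_init_check.py) mimics
what unittest discovery does when it imports the test modules.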


The full build log is available from:
http://qa-logs.debian.net/2024/04/20/mpi4py-fft_2.0.5-2_unstable-armhf.log

All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20240420;users=lu...@debian.org
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20240420&fusertaguser=lu...@debian.org&allbugs=1&cseverity=1&ctags=1&caffected=1#results

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as 'affects'-ing
this package. See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it against
mine so that we can determine whether something relevant changed in the
meantime.

--- End Message ---
--- Begin Message ---
This is an error in openmpi, not in mpi4py-fft. This specific error is now fixed in openmpi.

Closing, since this is not a bug in mpi4py-fft.
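
For anyone wanting to confirm this locally, a minimal sketch (assuming the
fixed openmpi and python3-mpi4py packages from unstable are installed; the
script name is illustrative and the file path is the one reported in the
log above):

    # verify_fix.py (hypothetical name) -- singleton MPI init check.
    import os

    # The help file the broken PMIx runtime complained about in the log:
    print("help file present:",
          os.path.exists("/usr/share/pmix/help-pmix-runtime.txt"))

    # With the fixed openmpi, a singleton MPI_Init should now succeed:
    from mpi4py import MPI
    print("MPI initialized, world size =", MPI.COMM_WORLD.Get_size())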

--- End Message ---
