Package: src:python-av
Version: 16.1.0+ds-1
Severity: serious
Tags: ftbfs forky sid
Dear maintainer:

During a rebuild of all packages in unstable, this package failed to build.

Below you will find the last part of the build log (probably the most
relevant part, but not necessarily). If required, the full build log is
available here:

https://people.debian.org/~sanvila/build-logs/202601/

About the archive rebuild: The build was made on virtual machines from AWS,
using sbuild and a reduced chroot with only build-essential packages.

If you cannot reproduce the bug please contact me privately, as I am willing
to provide ssh access to a virtual machine where the bug is fully
reproducible.

If this is really a bug in one of the build-depends, please use reassign and
add an affects on src:python-av, so that this is still visible in the BTS
web page for this package.

Thanks.

--------------------------------------------------------------------------------
[...]
 debian/rules clean
dh clean
   dh_testdir
   dh_auto_clean
   dh_autoreconf_clean
   dh_clean
 debian/rules binary
dh binary
   dh_testdir
   dh_update_autotools_config
   dh_autoreconf
   dh_auto_configure
   dh_auto_build
I: pybuild plugin_pyproject:139: Building wheel for python3.14 with "build" module
I: pybuild base:384: python3.14 -m build --skip-dependency-check --no-isolation --wheel --outdir /<<PKGBUILDDIR>>/.pybuild/cpython3_3.14
[... snipped ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
av/datasets.py:108: in fate
    return cached_download(
av/datasets.py:75: in cached_download
    response = urlopen(url)
               ^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x7f6cfbb5ecf0>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f6cfaf48510>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f6cfaf48770>
headers = {'Connection': 'close', 'Host': 'fate.ffmpeg.org', 'User-Agent': 'Python-urllib/3.13'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ TestSubtitle.test_subtitle_header_read ____________________

self = <urllib.request.HTTPHandler object at 0x7f6cfbb5ecf0>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f6cfaf4a3f0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f6cfaf4a520>
headers = {'Connection': 'close', 'Host': 'fate.ffmpeg.org', 'User-Agent': 'Python-urllib/3.13'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1358: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1404: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1353: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1113: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1057: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1023: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f6d0e1293c0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <tests.test_subtitles.TestSubtitle object at 0x7f6cfbb9ca70>

    def test_subtitle_header_read(self) -> None:
        """Test reading subtitle_header from a decoded subtitle stream."""
>       path = fate_suite("sub/MovText_capability_tester.mp4")
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

tests/test_subtitles.py:82:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
av/datasets.py:108: in fate
    return cached_download(
av/datasets.py:75: in cached_download
    response = urlopen(url)
               ^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x7f6cfbb5ecf0>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f6cfaf4a3f0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f6cfaf4a520>
headers = {'Connection': 'close', 'Host': 'fate.ffmpeg.org', 'User-Agent': 'Python-urllib/3.13'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
=========================== short test summary info ============================
FAILED tests/test_decode.py::TestDecode::test_decoded_video_enc_params - urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
FAILED tests/test_decode.py::TestDecode::test_decoded_video_enc_params_no_flag - urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
FAILED tests/test_packet.py::TestPacketSideData::test_buffer_protocol - urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
FAILED tests/test_packet.py::TestPacketSideData::test_skip_samples_remux - urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
FAILED tests/test_subtitles.py::TestSubtitle::test_subtitle_header_read - urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
=========== 5 failed, 210 passed, 7 skipped, 92 deselected in 4.19s ============
E: pybuild pybuild:483: test: plugin pyproject failed with: [too-long-redacted] ign and not test_encoding_h264'
dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p "3.14 3.13" returned exit code 13
make: *** [debian/rules:96: binary] Error 25
dpkg-buildpackage: error: debian/rules binary subprocess failed with exit status 2
--------------------------------------------------------------------------------
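For reference: the log's locals show host = '127.0.0.1:9' alongside a Host: fate.ffmpeg.org header, i.e. the chroot routes HTTP through a proxy on the local discard port so every outbound request is refused. A minimal sketch of that failure mode outside the build environment (assuming nothing is listening on 127.0.0.1:9; the URL path is illustrative):

```python
# Sketch: reproduce the URLError from the log by routing urllib through
# an unreachable proxy, as the network-less build chroot does.
from urllib.error import URLError
from urllib.request import ProxyHandler, build_opener

# All HTTP traffic goes to 127.0.0.1:9; the origin host only appears
# in the Host header, so no DNS lookup is needed.
opener = build_opener(ProxyHandler({"http": "http://127.0.0.1:9"}))
try:
    opener.open("http://fate.ffmpeg.org/fate-suite/", timeout=5)
except URLError as exc:
    # Mirrors the build failure: [Errno 111] Connection refused
    print("refused:", exc.reason)
```

This is why the failing tests are exactly the ones that call av.datasets.fate / fate_suite: they download FATE samples on first use instead of reading a local cache.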

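A common fix for this class of FTBFS is to deselect or skip the sample-downloading tests when the network is unavailable, rather than letting them fail. A hypothetical guard (the helper name and the conftest.py wiring are illustrative, not python-av's actual test setup) might look like:

```python
# Hypothetical reachability check for gating network-dependent tests.
import socket


def network_available(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True iff a TCP connection to host:port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# In a conftest.py one could then mark FATE-download tests, e.g.:
#   requires_network = pytest.mark.skipif(
#       not network_available("fate.ffmpeg.org", 80),
#       reason="FATE suite not reachable")
```

Alternatively, debian/rules could extend the existing pytest -k deselection (the log already shows "... and not test_encoding_h264") to exclude the five tests listed in the summary.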
